DSP Practical - Experiment 5

Aim

To implement a decimator and interpolator for a given signal using MATLAB, with decimation and interpolation factors of 2.

Problem Statement

Given the signal:

          x(n) = 10 * cos(2 * pi * fm * n * Ts)
        

where fm = 200 Hz is the signal frequency, fs = 5000 Hz is the sampling frequency, and Ts = 1/fs is the sampling period. Perform the decimation and interpolation process with factors M = 2 and L = 2.

Theory

Decimation and interpolation are the two basic sampling-rate conversion operations in multirate signal processing. Decimation by a factor M reduces the sampling rate: the signal is first low-pass filtered (to prevent aliasing) and then downsampled by keeping every Mth sample. Interpolation by a factor L increases the sampling rate: L - 1 zeros are inserted between consecutive samples, and the result is low-pass filtered (to remove spectral images), which smooths the inserted values.

In this experiment, both operations (with M = L = 2) are applied to the given signal, and the effects are observed.
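The raw downsampling and upsampling steps (without the associated filtering) can be seen on a short test sequence; downsample and upsample are Signal Processing Toolbox functions:

```matlab
v = [1 2 3 4 5 6];

% Keep every 2nd sample starting from the first:
downsample(v, 2)   % returns [1 3 5]

% Insert one zero after each sample:
upsample(v, 2)     % returns [1 0 2 0 3 0 4 0 5 0 6 0]
```

In a full decimator or interpolator, these steps are combined with a low-pass filter, as described above.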

MATLAB Code
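A minimal sketch of the program is given below. The variable names (N, xd, xi) are our own choices, and decimate and interp (which combine the resampling step with the required low-pass filtering) come from the Signal Processing Toolbox:

```matlab
% Signal parameters from the problem statement
fm = 200;                     % signal frequency (Hz)
fs = 5000;                    % sampling frequency (Hz)
Ts = 1/fs;                    % sampling period (s)
N  = 50;                      % number of samples to display (assumed)
n  = 0:N-1;                   % sample index
M  = 2;                       % decimation factor
L  = 2;                       % interpolation factor

x  = 10 * cos(2*pi*fm*n*Ts);  % original signal x(n)

xd = decimate(x, M);          % low-pass filter, then keep every Mth sample
xi = interp(x, L);            % insert samples, then low-pass filter

subplot(3,1,1); stem(n, x);
title('Original Signal'); xlabel('n'); ylabel('x(n)');

subplot(3,1,2); stem(0:length(xd)-1, xd);
title('Decimated Signal (M = 2)'); xlabel('n'); ylabel('x_d(n)');

subplot(3,1,3); stem(0:length(xi)-1, xi);
title('Interpolated Signal (L = 2)'); xlabel('n'); ylabel('x_i(n)');
```

With N = 50 samples at fs = 5000 Hz, the window covers two full periods of the 200 Hz cosine, so the plots show the waveform clearly.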

Expected Output

The MATLAB code generates three plots:

  1. Original Signal: The first plot shows the original signal x(n), a cosine wave of frequency 200 Hz sampled at 5000 Hz.
  2. Decimated Signal: The second plot shows the signal after decimation by a factor of 2, which halves the number of samples (the sampling rate drops from 5000 Hz to 2500 Hz).
  3. Interpolated Signal: The third plot shows the signal after interpolation by a factor of 2, which doubles the number of samples by inserting new samples between the original ones (the sampling rate rises from 5000 Hz to 10000 Hz).

These plots visually demonstrate the effect of decimation and interpolation on a discrete-time signal. Decimation reduces the sample rate, while interpolation increases it by inserting additional samples between the original ones.