Aim
To implement a decimator and interpolator for a given signal using MATLAB, with decimation and interpolation factors of 2.
Problem Statement
Given the signal:
x(n) = 10 * cos(2 * pi * fm * n * Ts)
where fm = 200 Hz, fs = 5000 Hz, and Ts = 1/fs is the sampling period. Perform the decimation and interpolation process with factors M = 2 and L = 2.
Theory
Decimation and interpolation are two operations used in signal processing to reduce or increase the sampling rate of a signal.
- Decimation: This process reduces the sampling rate by an integer factor M. Only every Mth sample of the signal is kept, which divides the sampling rate by M; in practice the signal is first lowpass-filtered so that frequencies above the new Nyquist rate do not alias.
- Interpolation: This process increases the sampling rate by an integer factor L. L − 1 new samples (initially zeros) are inserted between consecutive original samples, and the result is lowpass-filtered so the inserted samples take on smoothly interpolated values.
In this experiment, we will apply both operations (with M = L = 2) on the given signal and observe the effects.
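The two operations described above can be sketched directly with array indexing. This is a conceptual illustration only (no anti-aliasing or interpolation filtering); the variable names `x`, `M`, and `L` are assumed from the problem statement:

```matlab
% Naive decimation: keep every Mth sample (no anti-aliasing filter applied)
y = x(1:M:end);          % y has ceil(length(x)/M) samples

% Naive upsampling: insert L-1 zeros between consecutive samples
u = zeros(1, L*length(x));
u(1:L:end) = x;          % a lowpass filter would then smooth u
```

The built-in `decimate` and `interp` functions used in the experiment perform the same sample-rate changes but include the required lowpass filtering.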
MATLAB Code
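A minimal script for the experiment might look like the following. It assumes the Signal Processing Toolbox functions `decimate` and `interp`; the number of samples N = 50 is an assumed display length, not specified in the problem statement:

```matlab
% Parameters from the problem statement
fm = 200;            % message frequency, Hz
fs = 5000;           % sampling frequency, Hz
Ts = 1/fs;           % sampling period
N  = 50;             % number of samples to display (assumed)
n  = 0:N-1;

% Original signal
x = 10 * cos(2*pi*fm*n*Ts);

% Decimation by M = 2 (decimate lowpass-filters before downsampling)
M  = 2;
xd = decimate(x, M);

% Interpolation by L = 2 (interp inserts and lowpass-filters new samples)
L  = 2;
xi = interp(x, L);

% Plot original, decimated, and interpolated signals
subplot(3,1,1); stem(n, x);
title('Original Signal'); xlabel('n'); ylabel('x(n)');
subplot(3,1,2); stem(0:length(xd)-1, xd);
title('Decimated Signal (M = 2)'); xlabel('n'); ylabel('x_d(n)');
subplot(3,1,3); stem(0:length(xi)-1, xi);
title('Interpolated Signal (L = 2)'); xlabel('n'); ylabel('x_i(n)');
```

With N = 50 input samples, the decimated signal has 25 samples and the interpolated signal has 100, matching the rate changes described in the Expected Output section.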
Expected Output
The MATLAB code generates three plots:
- Original Signal: The first plot shows the original signal x(n), which is a cosine wave with a frequency of 200 Hz.
- Decimated Signal: The second plot shows the signal after decimation by a factor of 2, which halves the number of samples.
- Interpolated Signal: The third plot shows the signal after interpolation by a factor of 2, which doubles the number of samples by inserting new ones between the originals.
These plots visually demonstrate the effect of decimation and interpolation on a discrete-time signal. Decimation reduces the sample rate, while interpolation increases it by inserting additional samples between the original ones.