In this project, the objective is to study synchronization issues (both carrier phase synchronization and timing recovery) in digital communication systems. We will consider a number of specific problems, including the maximum likelihood (ML) symbol synchronizer and decision-directed carrier phase estimation.
The project is designed as a discrete-time simulation of several continuous-time systems. To accomplish this, you will need to divide the time axis into small increments and advance time in steps of the selected step size.
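For concreteness, a minimal MATLAB sketch of this kind of time discretization is shown below (the number of samples per symbol and the variable names are illustrative choices, not part of the project specification; Part 1's T = 1 microsecond is used as an example):

% Illustrative discrete-time setup (the step size is a choice, not prescribed).
T  = 1e-6;              % symbol duration (Part 1 uses T = 1 microsecond)
Ns = 100;               % samples per symbol (illustrative choice)
dt = T / Ns;            % simulation time step
M  = 100;               % number of symbols
t  = (0:M*Ns-1) * dt;   % discrete time axis covering M symbol durations

All continuous-time operations (integrals, filtering) are then approximated by sums over this grid, scaled by dt.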
Part 1: Symbol Synchronization for Baseband Signals
In this part, we consider a baseband binary PAM system, where the transmitted signal is modeled as

𝑠(𝑡) = Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 𝑝(𝑡 − 𝑖𝑇)
where 𝑝(𝑡) is the pulse shape, 𝑎𝑖 is the information symbol (which is +1 or -1 with equal probabilities), T is the symbol (bit) duration, and M is the number of symbols. The pulse is given by p(t)=1 for 0 ≤ 𝑡 ≤ 𝑇 and 𝑝(𝑡) = 0 otherwise, where T=1 microsecond.
Since we have a baseband signal (no high-frequency carrier component), there is no carrier phase to estimate; only symbol timing recovery needs to be performed in this case. Suppose that the received signal is modeled by
𝑟(𝑡) = 𝑠(𝑡 − 𝜏) + 𝑛(𝑡)
where 𝜏 is the time delay (timing offset), and 𝑛(𝑡) is a zero-mean white Gaussian process with a spectral density level of N0/2. Assume that the time delay is confined to an interval of half a symbol duration; namely, −0.25𝑇 ≤ 𝜏 ≤ 0.25𝑇 (that is, a coarse estimation has already been performed).
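For reference (this expression is not written out explicitly here, but it follows from the standard ML formulation with known symbols under the model above): up to terms that do not depend on 𝜏, the log-likelihood of the delay reduces to

Λ(𝜏) = Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 ∫ 𝑟(𝑡) 𝑝(𝑡 − 𝑖𝑇 − 𝜏) 𝑑𝑡 = Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 𝑦(𝑖𝑇 + 𝜏),

where 𝑦(𝑡) denotes the matched-filter output defined in the steps below; the ML estimate 𝜏̂ is then the maximizer of Λ(𝜏) over [−0.25𝑇, 0.25𝑇].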
The aim is to perform symbol synchronization (timing recovery) based on the maximum likelihood (ML) method. To that aim, implement the following steps in MATLAB (a rough sketch of one possible implementation is given after the list):
• Generate 𝑠(𝑡) as described above for M=100. Plot the first 10 symbols in your report.
• Generate 𝑟(𝑡) as described above for a value of 𝜏 that is generated randomly in [-0.25T, 0.25T]. Also, set N0=1 for the noise component. Write down the value of 𝜏 in your report.
• Pass r(t) through a matched filter with impulse response 𝑝(−𝑡). Let y(t) denote the output of that matched filter. Plot y(t) for about 10 symbol durations in your report.
• Suppose that 𝑎𝑖’s are known (training bits). Find the value of the time delay (call it 𝜏̂) that maximizes the likelihood function.
• Repeat the previous experiment at least 1000 times for independently generated values of 𝜏 and independent noise realizations, and calculate the mean-squared error (MSE); that is, the average of the squares of the differences between actual and estimated time delays.
• Now repeat all the previous experiments for different noise levels by increasing N0 and plot the MSE versus N0. Choose some reasonable values for N0 so that the high error and low error regions can be observed in the figure.
• Choose one value for N0, obtain the MSEs for various values of M, and plot the MSE versus M. (Choose some reasonable values for M (about six values) so that the high error and low error regions can be observed in the figure.)
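For illustration only, here is a rough MATLAB sketch of one way a single trial of the steps above could be implemented (the samples-per-symbol value, the delay grid, and all variable names are illustrative choices; the fractional delay is approximated by an integer sample shift, and the likelihood Λ(𝜏) given earlier is evaluated by a grid search):

% Sketch of one trial of Part 1 (all numerical choices are illustrative).
T = 1e-6;  Ns = 100;  dt = T/Ns;  M = 100;  N0 = 1;

a = 2*(rand(1, M) > 0.5) - 1;        % equiprobable +/-1 information symbols
p = ones(1, Ns);                      % rectangular pulse on [0, T)
s = kron(a, p);                       % s(t) = sum_i a_i p(t - iT)

tau  = (0.5*rand - 0.25) * T;         % true delay, uniform in [-0.25T, 0.25T]
dTau = round(tau / dt);               % delay approximated in whole samples
pad  = round(0.25*T / dt);            % margin for the largest possible delay

% Received signal: delayed s(t) plus white Gaussian noise of PSD level N0/2
% (in discrete time, the per-sample noise variance is (N0/2)/dt).
r = zeros(1, M*Ns + 2*pad);
r(pad + dTau + (1:M*Ns)) = s;
r = r + sqrt(N0/(2*dt)) * randn(size(r));

% ML delay estimate with known (training) symbols: grid search over candidate
% delays, correlating r(t) with the shifted training waveform. This equals
% sampling the matched-filter (p(-t)) output at iT + tau and summing with
% weights a_i, i.e. the likelihood Lambda(tau) written earlier.
tauGrid = (-pad:pad) * dt;
L = zeros(size(tauGrid));
for k = 1:numel(tauGrid)
    shift = round(tauGrid(k) / dt);
    L(k) = sum(r(pad + shift + (1:M*Ns)) .* s) * dt;
end
[~, kMax] = max(L);
tauHat = tauGrid(kMax);
fprintf('true tau = %.3e s, estimated tau = %.3e s\n', tau, tauHat);

The MSE experiments can then be obtained by wrapping a trial like this in a loop over independent realizations of 𝜏 and the noise, and averaging (𝜏̂ − 𝜏)² over the trials.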
Provide an overall summary and discussion of all your results.
Part 2: Decision Directed Carrier Phase Estimation
In this part, we consider a bandpass system, where the transmitted signal is modeled as

𝑠(𝑡) = Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 𝑝(𝑡 − 𝑖𝑇) cos(2𝜋fc𝑡)
where fc=0.36 MHz, 𝑝(𝑡) is the pulse shape, 𝑎𝑖 is the information symbol (which is +1 or -1 with equal probabilities), T is the symbol (bit) duration, and M is the number of symbols. You can choose any suitable pulse shape in this part, as long as it is nonzero only in [0,T]. Assuming that symbol synchronization has been performed with sufficient accuracy, the received signal is expressed as

𝑟(𝑡) = Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 𝑝(𝑡 − 𝑖𝑇) cos(2𝜋fc𝑡 + 𝜙) + 𝑛(𝑡)
where 𝜙 is the unknown carrier phase and 𝑛(𝑡) is a zero-mean white Gaussian process with a spectral density level of N0/2.
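As a side note on conventions (this is an assumption, not something stated in the handout): if the complex envelope is defined through 𝑟(𝑡) = Re{𝑟𝑙(𝑡) e^{j2𝜋fc𝑡}}, then the signal component of 𝑟𝑙(𝑡) is Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 𝑝(𝑡 − 𝑖𝑇) e^{j𝜙}, so each per-symbol correlation 𝑦𝑖 defined below is approximately 𝑎𝑖 𝐸𝑝 e^{j𝜙} plus noise, where 𝐸𝑝 = ∫_0^𝑇 𝑝²(𝑡) 𝑑𝑡. With known symbols, the ML phase estimate then reduces to the angle of Σ_{𝑖=0}^{𝑀−1} 𝑎𝑖 𝑦𝑖.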
The aim is to perform decision-directed estimation of the carrier phase. To that aim, implement the following steps (a sketch of one possible implementation is given after the list):
• Choose some values for N0 and T (𝑇 ≥ 1ms), and let M=50.
• Generate random information symbols.
• Generate 𝑟(𝑡), and perform downconversion to obtain its complex envelope (baseband representation). Let 𝑟𝑙(𝑡) denote the complex envelope of 𝑟(𝑡). Perform correlations of 𝑟𝑙(𝑡) and 𝑝(𝑡) for each symbol; that is, calculate

𝑦𝑖 = ∫_{𝑖𝑇}^{(𝑖+1)𝑇} 𝑟𝑙(𝑡) 𝑝(𝑡 − 𝑖𝑇) 𝑑𝑡
for i=0,1,…,M-1.
• Assuming that the information symbols are known by the receiver (i.e., we are in the training phase), calculate the maximum likelihood (ML) estimate of the carrier phase.
• Repeat this experiment 500 times for random generations of the noise component and the information symbols. Calculate the MSE for estimating the carrier phase.
• Repeat the previous steps for various noise levels and plot the MSE versus N0. Choose some reasonable values for N0 so that the high error and low error regions can be observed in the figure.
• Fix a value of N0 and repeat the previous steps for various values of M. Plot the MSE versus M.
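For illustration only, a rough MATLAB sketch of a single trial of the steps above is given below (the rectangular pulse, the sample rate, the N0 value, and all variable names are illustrative choices; no explicit lowpass filter is used after the mixer, since with fc·T = 360 carrier cycles per symbol the per-symbol correlation already averages out the double-frequency term):

% Sketch of one trial of Part 2 (all numerical choices are illustrative).
fc = 0.36e6;  T = 1e-3;  Ns = 4000;  dt = T/Ns;  M = 50;  N0 = 1e-4;

a   = 2*(rand(1, M) > 0.5) - 1;            % equiprobable +/-1 symbols
p   = ones(1, Ns);                          % rectangular pulse on [0, T)
t   = (0:M*Ns-1) * dt;                      % time axis over M symbols
phi = 2*pi*rand - pi;                       % unknown carrier phase in [-pi, pi)

s = kron(a, p) .* cos(2*pi*fc*t + phi);     % signal part of the received waveform
r = s + sqrt(N0/(2*dt)) * randn(size(s));   % add white Gaussian noise

rl = 2 * r .* exp(-1j*2*pi*fc*t);           % downconvert to the complex envelope

% Per-symbol correlations y_i = integral over [iT, (i+1)T) of r_l(t) p(t-iT) dt
y = zeros(1, M);
for i = 0:M-1
    idx    = i*Ns + (1:Ns);
    y(i+1) = sum(rl(idx) .* p) * dt;
end

% ML (training / decision-directed) phase estimate: angle of sum_i a_i * y_i
phiHat = angle(sum(a .* y));
err    = angle(exp(1j*(phiHat - phi)));     % phase error wrapped to (-pi, pi]
fprintf('true phase = %.3f rad, estimate = %.3f rad\n', phi, phiHat);

The MSE curves can then be obtained by repeating such a trial (here 500 times) with independent noise and symbol realizations and averaging the squared phase errors, taking care to wrap each error to (−π, π] before squaring.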
Provide an overall summary and discussion of your results.
Some Notes on Technical Requirements
Note that you are not allowed to use built-in functions from MATLAB (or other resources) to complete the project (except for MATLAB's basic functions). You must write your own algorithms and conduct your simulations using that code.
Both the MATLAB files and the project reports will be processed by Turnitin. In addition, the .m files will be checked via the Moss software.
Finally, note that what is important is to clearly demonstrate that you have worked on the project for a sufficient amount of time and have done your own work. It is not as important to have all the pieces completed correctly or thoroughly.
Reporting Requirements
Your report should contain all the relevant information about the set-up used, the results obtained, and your comments on the results. Please also include your MATLAB code in your report, either as an appendix or in the related parts. The specific format is up to you, but please make sure to properly label each figure, include relevant captions, point to the right results in your explanations, etc. The report should also include a title page, a brief introduction and outline, as well as any references used. The references should be cited within the report wherever they are used.