CS2105 - Assignment 3 - Solved

Objectives
In this assignment, you will inspect three multimedia streaming applications that run on DASH, RTP, and WebRTC and discuss some important aspects that characterise these popular streaming standards. After completing this assignment, you should

•     have a good understanding of how DASH, RTP and WebRTC work in real-life applications

•     know how to use common network inspection tools for simple network debugging tasks

•     know how to use FFmpeg for simple media processing.

This assignment is worth 5 marks in total, split across three exercises. All the work in this assignment shall be completed individually.


Question & Answer
We will not debug programs for you. If you have any doubts about this assignment, please post your questions on the Piazza forum before consulting the teaching team.

However, we will help to clarify misconceptions or give necessary directions if required.

Setup
You should use your local machine for all tasks in this assignment. Note that you should NOT ssh into the sunfire server for this assignment. The tools you need (on your local machine) are -

1.     Google Chrome or any other modern browser

2.     Wireshark

3.     FFmpeg

For Exercise 3, you are also required to activate an account (24 hours needed for activation) for YouTube Live.



Figure 2: Add filters in Wireshark.

Task 4: Simulating the Network for Adaptive Streaming (0.5 mark)
An important aspect of DASH is its adaptive streaming feature, and now let’s see it in action. For this task, refresh the web page of our DASH player and “Load” the default MPD playlist instead (as it contains more quality levels for better visualization), which should look like -

https://dash.akamaized.net/akamai/bbb_30fps/bbb_30fps.mpd

Next, we simulate a drop in network speed by using the browser’s network throttling feature. Open the browser’s network inspection tool again and select a slower network speed, e.g., "Fast 3G" instead of "Online" in Google Chrome[4]. The player should adapt and download future video segments of lower quality levels.

To observe this behaviour, we can use the analysis graph provided by the DASH player (at the bottom of the web page). It dynamically updates with the “Video Bitrate” of the segment being downloaded (as well as the current “Video Buffer Level”). Take a screenshot of the graph showing the adaptive streaming capability, i.e., the bitrate should drop/increase over time (along with the buffer level). A sample screenshot is given below; note that it may not show the correct answer.

 

Figure 3: A sample of the graph for illustrating the adaptive streaming capability.


Exercise 2 - Basic Media Processing (1 Mark)
In this exercise, we will learn how to use FFmpeg for basic media processing. FFmpeg is a collection of libraries and tools to process multimedia content such as audio, video, subtitles and related metadata. You need to first download and install FFmpeg on your local machine. The following links may help: https://ffmpeg.org/download.html

To validate that the package is installed correctly, use the ffmpeg -version command, which prints the FFmpeg version.
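Before running `ffmpeg -version`, it can help to confirm the binary is actually reachable from your shell. A minimal sketch (assuming a POSIX-compatible shell):

```shell
# Minimal sketch: check whether ffmpeg is reachable on PATH.
# `command -v` resolves the binary and exits non-zero if it is not found.
if command -v ffmpeg >/dev/null 2>&1; then
  echo "ffmpeg is installed"
else
  echo "ffmpeg is NOT on PATH"
fi
```

If the second branch fires even though you installed FFmpeg, the install directory likely needs to be added to your PATH.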

In this exercise, we will use the MP4 file titled test.mp4 available at -

LumiNUS → Files → Assignment Questions →A3-video.zip

Please download test.mp4 to your local device and work through the following two tasks.

Task 1: Checking the metadata of MP4 file (0.5 mark)
Like photos, videos may carry metadata such as the location where they were shot. Container formats like AVI and MP4 also store meta information about codecs, video and audio streams, and more. A metadata viewer reveals information about video files that you may not be aware of.

To check the metadata of the given MP4 file, run ffmpeg command as follows:

ffmpeg -i test.mp4

View the output content and answer the following five questions -

1.     What is the resolution of the video?

2.     What is the frame rate of the video?

3.     Which video compression standard is used in this video?

4.     Does this file employ RGB for color encoding?

5.     What is the sampling rate of the audio?
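All five answers can be read off the `Stream #…` lines that `ffmpeg -i` prints: the video stream line carries the codec name, pixel format (e.g., a `yuv…` pixel format means the file is not RGB-encoded), resolution, and frame rate, while the audio stream line carries the sampling rate. As a sketch, the relevant tokens can be pulled out with standard text tools. The sample line below is hypothetical, not the real output for test.mp4; run `ffmpeg -i test.mp4` yourself to get the actual values:

```shell
# Hypothetical video-stream metadata line of the kind `ffmpeg -i` prints;
# the real values for test.mp4 will differ.
line='Stream #0:0(und): Video: h264 (High), yuv420p, 1280x720, 30 fps'

# Resolution: the WIDTHxHEIGHT token.
res=$(echo "$line" | sed -n 's/.*[ ,]\([0-9][0-9]*x[0-9][0-9]*\).*/\1/p')
# Frame rate: the number immediately before "fps".
fps=$(echo "$line" | sed -n 's/.*, \([0-9.]*\) fps.*/\1/p')

echo "resolution=$res fps=$fps"
```

Note that `ffmpeg -i` writes this information to stderr, so redirect with `2>&1` if you want to pipe it into grep or sed.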


Task 2: Processing MP4 file (0.5 mark)
FFmpeg enables us to process the video file with very simple commands. For example, we can type ffmpeg -i test.mp4 test.flv to convert the video file to FLV format. To set the aspect ratio (e.g., 16:9) of a video, enter the command ffmpeg -i test.mp4 -aspect 16:9 output.mp4, where output.mp4 is the output file. For more operations such as video cropping, please refer to the official document[5]. In this task, you are asked to use ffmpeg to implement the following operations within one command:

1.     Take test.mp4 as input.

2.     Change the resolution to 640×480.

3.     Extract the video from 00:00:17 to 00:00:27.

4.     Remove the audio stream.

5.     Convert the format of above result to .avi and save it as e2_t2_output.avi.
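One possible shape for such a command is sketched below (flag placement and encoder defaults may vary across FFmpeg versions, so verify the result against Figure 4). The command is built as a string here so the flags can be annotated:

```shell
# One possible single-command solution (a sketch, not the only answer):
#   -ss/-to   : keep only the segment from 00:00:17 to 00:00:27
#   -vf scale : change the resolution to 640x480
#   -an       : remove the audio stream
#   .avi      : the container format is chosen from the output extension
cmd='ffmpeg -i test.mp4 -ss 00:00:17 -to 00:00:27 -vf scale=640:480 -an e2_t2_output.avi'
echo "$cmd"
```

After running it, `ffmpeg -i e2_t2_output.avi` should report a 640x480 video stream, a duration of about 10 seconds, and no audio stream.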

To evaluate the result, you can either open the AVI file with your favorite video player or use FFmpeg to print its metadata for comparison. For example, the metadata of the new file could be:

 

Figure 4: The metadata of the processed file.





Exercise 3 - Real-Time Conversational Applications (1 Mark)
In the lecture, we also learnt about RTP (Real-time Transport Protocol) and WebRTC (Web Real-Time Communication), two other popular multimedia streaming standards. There are two tasks in this exercise to help us better understand these protocols.

Task 1: Streaming MP4 to YouTube Live (0.5 mark)
Live streaming is becoming a popular means of entertainment. In Exercise 2, we saw the power of FFmpeg in media processing. This task will illustrate how one can stream a video file, screencast, or web camera output to a popular live streaming platform (i.e., YouTube Live) using FFmpeg.

First of all, you need to register a YouTube account and apply for a stream key for YouTube Live[6]. Please take note that the Live account will be reviewed within 24 hours, so we advise you to complete this step ahead of time. After that, we can access the interface of the studio, as shown in the following figure. The following link illustrates how to stream an IP camera to YouTube Live:

https://www.youtube.com/watch?v=MM2oTTb5zXg&t=353s&ab_channel=RickMakes
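A typical FFmpeg push command follows the shape sketched below. The stream key `YOUR-STREAM-KEY` is a placeholder you must replace with the key from your own YouTube Live dashboard, and the exact codec flags may differ depending on your input source; the ingest address is YouTube's standard RTMP endpoint:

```shell
# Sketch of an RTMP push to YouTube Live (YOUR-STREAM-KEY is a placeholder).
#   -re        : read the input at its native frame rate, pacing it like a live feed
#   -c:v/-c:a  : encode video as H.264 and audio as AAC, as YouTube expects
#   -f flv     : RTMP carries an FLV-muxed stream
key='YOUR-STREAM-KEY'
cmd="ffmpeg -re -i test.mp4 -c:v libx264 -preset veryfast -c:a aac -f flv rtmp://a.rtmp.youtube.com/live2/$key"
echo "$cmd"
```

To stream a camera instead of a file, replace `-i test.mp4` with your platform's capture input (e.g., an avfoundation, dshow, or v4l2 device).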

 

Figure 5: Interface of YouTube Live.

In this task, you are required to follow the above instruction to stream your camera[7] and answer the following questions.

During the live streaming,

1.     Which application layer protocol is used for upstreaming?

2.     Which protocol is used in the transport layer? Why?

3.     Record your command for pushing streams.

4.     Capture a screenshot of your live room clearly showing:

•     the video content

•     sending a message with your matric number in the chat room.

An example is given below.

 



Task 2: Inspecting a Simple WebRTC Application (0.5 mark)
To understand how WebRTC works in real-life applications, let’s run a simple experiment (similar to Exercise 1). For our WebRTC client in this exercise, we will use a browser-based video chat application provided by the WebRTC project authors at -

https://appr.tc/

Please access the application using any modern browser (we recommend Google Chrome) and create a chat room for your use. As this is a two-way peer-to-peer communication application, we need another device to join the same chat room created. We recommend using your mobile phone or working with another student to get the video chat running.

Once the video chat is running, open and view the browser’s network log and Wireshark’s capture log. Note that we can only view very limited information about the WebRTC packets in Wireshark because they run on additional security protocols that encrypt most of the stream information. (We prefer to keep it secure in this exercise as it streams from your camera feed.) Answer the following two discussion questions -

1.     You should notice that the browser’s network log here behaves differently from DASH’s case in Exercise 1. Specifically, the browser’s network log for WebRTC does not update with the video data retrieved, as seen in DASH. Why do you think this is so?


2.     Which transport protocol(s) do WebRTC and DASH generally use for their video streams? Give two reasons why they use the same/different transport protocol.

ENJOY
 
[1] If you get a URL "Not Found" error on your browser, please check that there is no additional space character in the link you pasted.
[2] For Google Chrome, right-click "Inspect" and click on the "Network" tab. For more details, refer to https://developers.google.com/web/tools/chrome-devtools/network.
[3] https://www.wireshark.org/download/docs/user-guide.pdf; see also https://wiki.wireshark.org/CaptureFilters
[4] See instructions in https://developers.google.com/web/tools/chrome-devtools/devicemode
[5] https://www.ffmpeg.org/ffmpeg.html
[6] https://support.google.com/youtube/answer/2474026?hl=en&ref_topic=9257984
[7] If you are testing your own file, please take note that you have to follow the copyright policy.
