The problem is to simulate a network node. Packets arrive at the node at random, on average one every ∆t seconds, and are forwarded by the node. If too many packets arrive, they are queued; at most N packets can be queued. To model the arrival of packets at the node, you can use the following function nextTime(), which returns the time until the next packet arrives for a Poisson process.
#include <math.h>
#include <stdlib.h>

/* Returns an exponentially distributed inter-arrival time (inverse-CDF method). */
int nextTime(float rateParameter)
{
    return (int)(-logf(1.0f - (float) random() / (RAND_MAX + 1.0f)) / rateParameter);
}
The argument rateParameter is 1/∆t, a parameter of your simulation. If the queue grows too large, packets are dropped.
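For example, a sequence of arrival times can be generated by accumulating successive return values of nextTime(). The snippet below is only an illustration, assuming it is compiled together with nextTime() above; the rate 0.45 and the loop bound are arbitrary.

#include <stdio.h>

int main(void)
{
    float lambda = 0.45f;          // example arrival rate, 1/∆t
    int   t = 0;                   // simulation clock, in ticks

    for (int i = 0; i < 10; i++) {
        t += nextTime(lambda);     // advance the clock by one inter-arrival time
        printf("packet %d arrives at t = %d\n", i, t);
    }
    return 0;
}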
A packet is an instance of a structure as follows:
typedef struct {
    int  id;            // packet id
    int  t0;            // arrival time of packet
    int  priority;      // higher means more important
    char contents[80];  // contents of packet
} PACKET;
We will not use priority here, but this can serve as a standard packet definition for later assignments. When a packet is to be added, you first need to allocate it via
PACKET *p;
p = (PACKET *)malloc(sizeof(PACKET));
and then add it to the queue. Note that the queue is a circular array of pointers, each pointing either to a packet or to NULL (if that queue slot is not in use).
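One possible sketch of such a circular array of packet pointers is shown below. It assumes the PACKET definition above; the capacity QSIZE and the names enqueue, dequeue, head, tail, and count are illustrative, not prescribed (in the assignment the capacity would be N).

#include <stdlib.h>            /* for NULL */

#define QSIZE 16               /* illustrative capacity; the assignment's limit is N */

PACKET *queue[QSIZE];          /* each slot points to a packet, or is NULL           */
int head  = 0;                 /* index of the oldest queued packet                   */
int tail  = 0;                 /* index of the next free slot                         */
int count = 0;                 /* number of packets currently queued                  */

/* Returns 0 on success, -1 if the queue is full (the packet must be dropped). */
int enqueue(PACKET *p)
{
    if (count == QSIZE)
        return -1;
    queue[tail] = p;
    tail = (tail + 1) % QSIZE;
    count++;
    return 0;
}

/* Returns the oldest packet, or NULL if the queue is empty. */
PACKET *dequeue(void)
{
    if (count == 0)
        return NULL;
    PACKET *p = queue[head];
    queue[head] = NULL;
    head = (head + 1) % QSIZE;
    count--;
    return p;
}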
1. Write a program that takes λ = 1/∆t, N, and µ (the rate at which packets are forwarded by the node) and simulates the queue. λ and µ are real while N is an integer. The program must collect a history of the time each packet spends in the queue and the percentage of packets dropped. (A rough skeleton of such a simulation loop is sketched after these questions.)
2. Vary the three parameters as shown below, generate histograms for each case, and fill in the missing data:
N     λ      µ     Avg Delay   % dropped packets   Time per packet
5     0.45   0.5   ?           ?                   ?
10    0.45   0.5   ?           ?                   ?
20    0.45   0.5   ?           ?                   ?
5     0.4    0.5   ?           ?                   ?
5     0.3    0.5   ?           ?                   ?
Use the results to explain how N and λ interact.
3. Is the per-packet time complexity of managing the queue O(1), or something else? Assume that all CPU and memory operations have the same cost (in reality they do not).
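As a starting point for part 1, one possible skeleton of the simulation loop is sketched below. It is only an illustration: SIM_LENGTH, simulate(), and recordDelay() are assumed names, enqueue() and dequeue() refer to the circular-array sketch above, and forwarding is modelled here as one packet every 1/µ ticks (treating the service time as exponential with rate µ is another reasonable reading of the assignment).

#include <stdio.h>
#include <stdlib.h>

#define SIM_LENGTH 100000                  /* assumed run length, in ticks        */

void simulate(float lambda, float mu, void (*recordDelay)(int))
{
    int t        = 0;                      /* simulation clock (ticks)            */
    int arriveAt = nextTime(lambda);       /* time of the next packet arrival     */
    int serveAt  = (int)(1.0f / mu);       /* time of the next forwarding slot    */
    int nextId   = 0, total = 0, dropped = 0;

    while (t < SIM_LENGTH) {
        if (t >= arriveAt) {               /* at most one arrival per tick in this sketch */
            PACKET *p = (PACKET *)malloc(sizeof(PACKET));
            p->id = nextId++;
            p->t0 = t;
            total++;
            if (enqueue(p) < 0) {          /* queue full: drop the packet         */
                dropped++;
                free(p);
            }
            arriveAt = t + nextTime(lambda);
        }
        if (t >= serveAt) {                /* the node forwards one packet        */
            PACKET *p = dequeue();
            if (p != NULL) {
                recordDelay(t - p->t0);    /* time the packet spent in the queue  */
                free(p);
            }
            serveAt = t + (int)(1.0f / mu);
        }
        t++;
    }
    printf("dropped %d of %d packets (%.1f%%)\n",
           dropped, total, total ? 100.0 * dropped / total : 0.0);
}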