DLP Lab 1: Back-Propagation

In this lab, you will understand and implement a simple neural network with a forward pass and backpropagation, using two hidden layers. Note that you may only use NumPy and the Python standard library; any other framework (e.g., TensorFlow, PyTorch) is not allowed in this lab.

   

Figure 1. Two-layer neural network

 

Requirements:  

1.     Implement a simple neural network with two hidden layers.

2.     You must train the network with backpropagation, implemented using only NumPy and the Python standard library.

3.     Plot a comparison figure that shows the predicted results and the ground truth.

 

Implementation Details:  

  

Figure 2. Forward pass 

•        In Figure 2, we use the following notation:

1.         $x_1, x_2$ : neural network inputs

2.         $X = [x_1, x_2]$

3.         $y$ : neural network output

4.         $\hat{y}$ : ground truth

5.         $L(\theta)$ : loss function

6.         $W_1, W_2, W_3$ : weight matrices of the network layers

 

•        The forward pass computes:

$Z_1 = \sigma(X W_1)$               $Z_2 = \sigma(Z_1 W_2)$               $y = \sigma(Z_2 W_3)$

 

•        In these equations, $\sigma$ is the sigmoid function, the special case of the logistic function defined by the formula:

$\sigma(x) = \frac{1}{1 + e^{-x}}$

 
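As a concrete illustration, here is a minimal NumPy sketch of this forward pass (the hidden-layer sizes h1 and h2 are free choices, not fixed by the lab):

```python
import numpy as np

def sigmoid(x):
    """Element-wise logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, W1, W2, W3):
    """Forward pass of the two-hidden-layer network in Figure 2.

    X  : (n, 2) input matrix, one sample per row.
    W1 : (2, h1), W2 : (h1, h2), W3 : (h2, 1) weight matrices.
    Returns the activations of every layer, which the backward
    pass will need.
    """
    Z1 = sigmoid(X @ W1)   # first hidden layer
    Z2 = sigmoid(Z1 @ W2)  # second hidden layer
    y  = sigmoid(Z2 @ W3)  # network output in (0, 1)
    return Z1, Z2, y
```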

•        Input / Test: 

There are two kinds of inputs, as shown below.

  

 

You need to use the following generator functions to create your inputs x and y.

 

  

 
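The generator listing itself did not survive extraction. Below is a minimal sketch under common assumptions for this lab: one linearly separable set and one XOR-style set. The function names generate_linear and generate_XOR_easy, the point counts, and the labeling rules are all assumptions.

```python
import numpy as np

def generate_linear(n=100):
    # n random points in the unit square, labeled by which side of
    # the line x1 = x2 they fall on (assumed labeling rule).
    pts = np.random.uniform(0, 1, (n, 2))
    labels = (pts[:, 0] <= pts[:, 1]).astype(int)
    return pts, labels.reshape(n, 1)

def generate_XOR_easy():
    # Points on the two diagonals of the unit square; the diagonals
    # carry opposite labels, giving an XOR-like pattern.
    inputs, labels = [], []
    for i in range(11):
        inputs.append([0.1 * i, 0.1 * i])
        labels.append(0)
        if i == 5:   # the center point appears only once
            continue
        inputs.append([0.1 * i, 1 - 0.1 * i])
        labels.append(1)
    return np.array(inputs), np.array(labels).reshape(-1, 1)
```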

During training, you need to print the loss values; during testing, you need to show your predictions, as in the example below.

  
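A minimal sketch of the test-time printout; the 0.5 decision threshold and the exact line format are assumptions, since the reference output did not survive extraction:

```python
import numpy as np

def show_predictions(y_true, y_pred):
    """Print one line per test sample plus the overall accuracy.

    y_true : (n, 1) ground-truth labels in {0, 1}
    y_pred : (n, 1) raw network outputs in (0, 1)
    """
    pred = (y_pred > 0.5).astype(int)   # assumed decision threshold
    for i, (t, p) in enumerate(zip(y_true.ravel(), y_pred.ravel())):
        print(f"Iter {i:3d} | ground truth: {int(t)} | prediction: {p:.4f}")
    print(f"accuracy: {(pred == y_true).mean():.2%}")
```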

Visualize the predictions and the ground truth at the end of the training process. The comparison figure should look like the example below.

 

You can refer to the following visualization code:

x: inputs (2-dimensional array)

y: ground truth labels (1-dimensional array)

pred_y: outputs of the neural network (1-dimensional array)
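The referenced listing did not survive extraction; below is a minimal sketch with the interface described above, assuming matplotlib is available for plotting:

```python
import matplotlib.pyplot as plt

def show_result(x, y, pred_y):
    """Scatter the ground truth next to the network's predictions.

    x      : (n, 2) inputs
    y      : (n,)   ground-truth labels
    pred_y : (n,)   predicted labels (already thresholded to {0, 1})
    """
    plt.subplot(1, 2, 1)
    plt.title('Ground truth', fontsize=18)
    for i in range(x.shape[0]):
        color = 'ro' if y[i] == 0 else 'bo'
        plt.plot(x[i][0], x[i][1], color)

    plt.subplot(1, 2, 2)
    plt.title('Predict result', fontsize=18)
    for i in range(x.shape[0]):
        color = 'ro' if pred_y[i] == 0 else 'bo'
        plt.plot(x[i][0], x[i][1], color)

    plt.show()
```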

 

 

 

•        Sigmoid functions: 

1.     A sigmoid function is a mathematical function with a characteristic "S"-shaped (sigmoid) curve. It is a bounded, differentiable, real function that is defined for all real input values and has a non-negative derivative at each point. In general, a sigmoid function is monotonic and has a bell-shaped first derivative.

2.     (hint) You may write the function like this:

  
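(The original snippet is an image that did not survive extraction; a minimal version, matching the sigmoid used in the forward-pass sketch above:)

```python
import numpy as np

def sigmoid(x):
    # Numerically fine for this lab's input range.
    return 1.0 / (1.0 + np.exp(-x))
```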

3.     (hint) The derivative of the sigmoid function is:

$\sigma'(x) = \sigma(x)\,(1 - \sigma(x))$
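A matching sketch; it follows the common convention of taking the already-computed activation $\sigma(x)$ as its argument rather than $x$ itself (an assumption worth noting):

```python
def derivative_sigmoid(s):
    # s is the sigmoid activation itself, i.e. s = sigmoid(x),
    # so no extra exponentials are needed.
    return s * (1.0 - s)
```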

 

 

•        Back-Propagation (gradient computation)

Backpropagation is a method used in artificial neural networks to compute the gradient of the loss with respect to each weight in the network. It is a generalization of the delta rule to multilayer feedforward networks, made possible by using the chain rule to compute the gradients layer by layer. The backpropagation learning algorithm can be divided into two parts: propagation and weight update.

 

Part 1: Propagation 

Each propagation involves the following steps:

1.         Forward propagation through the network to generate the output value

2.         Calculation of the loss $L(\theta)$ (the error term)

3.         Backward propagation of the output activations through the network, using the training target, to generate the deltas (the differences between the targeted and actual output values) of all output and hidden neurons.
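To make step 3 concrete, here is a minimal sketch of the deltas for the network above, assuming a mean-squared-error loss $L(\theta) = \mathrm{mean}((y - \hat{y})^2)$ (the lab does not fix the loss, so this is an assumption):

```python
import numpy as np

def backward(X, Z1, Z2, y, y_hat, W2, W3):
    """Deltas for the two-hidden-layer network (MSE loss assumed).

    X, Z1, Z2, y come from the forward pass; y_hat is the ground truth.
    Each delta is dL/d(pre-activation) of its layer; the chain rule
    carries it backward through W3 and W2.
    """
    n = X.shape[0]
    dL_dy  = 2.0 * (y - y_hat) / n           # derivative of the MSE loss
    delta3 = dL_dy * y * (1 - y)             # through the output sigmoid
    delta2 = (delta3 @ W3.T) * Z2 * (1 - Z2)
    delta1 = (delta2 @ W2.T) * Z1 * (1 - Z1)
    return delta1, delta2, delta3
```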

 

 

Part 2: Weight update 

For each weight, follow the steps below:

1.     Multiply its output delta and input activation to get the gradient of the weight.

2.     Subtract a ratio (percentage) of the gradient from the weight.

3.     This ratio (percentage) influences the speed and quality of learning; it is called the learning rate. The greater the ratio, the faster the network trains; the lower the ratio, the more accurate the training. The sign of a weight's gradient indicates the direction in which the error increases, which is why the weight must be updated in the opposite direction.
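Combining both steps, a minimal gradient-descent update using the deltas and activations from the sketches above (the learning rate lr is a free hyperparameter):

```python
def update_weights(X, Z1, Z2, delta1, delta2, delta3, W1, W2, W3, lr=0.1):
    """One gradient-descent step for all three weight matrices.

    Step 1: gradient = (input activation)^T @ (output delta).
    Step 2: move each weight against its gradient, scaled by lr.
    """
    W1 -= lr * (X.T  @ delta1)
    W2 -= lr * (Z1.T @ delta2)
    W3 -= lr * (Z2.T @ delta3)
    return W1, W2, W3
```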

 

Repeat Parts 1 and 2 until the performance of the network is satisfactory.

 

Pseudocode: 

  
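The original pseudocode is an image that did not survive extraction; the following is a minimal end-to-end sketch assembling the pieces above (the hidden sizes, learning rate, epoch count, and MSE loss are all assumptions). It uses sigmoid, forward, backward, update_weights, and generate_linear from the earlier sketches.

```python
import numpy as np

def train(epochs=10000, h1=4, h2=4, lr=0.5):
    X, y_hat = generate_linear(n=100)

    # Small random weights; the shapes follow Figure 2.
    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(2, h1))
    W2 = rng.normal(size=(h1, h2))
    W3 = rng.normal(size=(h2, 1))

    for epoch in range(epochs):
        Z1, Z2, y = forward(X, W1, W2, W3)                  # Part 1, step 1
        loss = np.mean((y - y_hat) ** 2)                    # Part 1, step 2
        d1, d2, d3 = backward(X, Z1, Z2, y, y_hat, W2, W3)  # Part 1, step 3
        W1, W2, W3 = update_weights(X, Z1, Z2, d1, d2, d3,
                                    W1, W2, W3, lr)         # Part 2
        if epoch % 1000 == 0:
            print(f"epoch {epoch:5d}  loss : {loss:.6f}")

    return W1, W2, W3
```

After training, threshold the outputs (for example, pred_y = (y > 0.5).astype(int)) and pass x, y, and pred_y to show_result to produce the comparison figure.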

 
