Introduction to Deep Learning
Homework Assignment 2
Problem 1 (Practice of scalar-based backpropagation). You are required to calculate the gradients of f(x, w) with respect to x_i and w_i.
(a) Use a computational graph to perform the calculation.
(b) Based on (a), write a program to implement the computational graph and verify your answer in (a). (A starting-point sketch follows the note below.)
Note: You are free to choose the inputs of your computational graph.
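As a rough starting point for Problem 1(b), the following is a minimal sketch of scalar backpropagation through a computational graph, checked against a centered-difference numerical gradient. Because the expression for f(x, w) is not reproduced in this handout, the sketch assumes a hypothetical example function f(x, w) = sigmoid(w0*x0 + w1*x1 + w2); substitute the actual f(x, w) from the assignment. The input values are chosen freely, as the note allows.

    # Sketch for Problem 1(b): scalar backpropagation on a computational graph.
    # ASSUMPTION: f(x, w) = sigmoid(w0*x0 + w1*x1 + w2) is a hypothetical stand-in
    # for the assignment's f(x, w); replace it with the actual function.
    import math

    # Freely chosen inputs (the note allows any inputs).
    w0, x0, w1, x1, w2 = 2.0, -1.0, -3.0, -2.0, -3.0

    # Forward pass: evaluate the graph node by node.
    a = w0 * x0                        # multiply node
    b = w1 * x1                        # multiply node
    c = a + b + w2                     # add node
    f = 1.0 / (1.0 + math.exp(-c))     # sigmoid node

    # Backward pass: start from df/df = 1 and apply the chain rule per node.
    df = 1.0
    dc = df * f * (1.0 - f)            # sigmoid gate: dsigma/dc = sigma(c)*(1 - sigma(c))
    da = db = dw2 = dc                 # add gate distributes the gradient unchanged
    dw0, dx0 = da * x0, da * w0        # multiply gate swaps its inputs
    dw1, dx1 = db * x1, db * w1

    # Numerical check for one of the gradients (centered differences).
    def f_of(w0, x0, w1, x1, w2):
        return 1.0 / (1.0 + math.exp(-(w0 * x0 + w1 * x1 + w2)))

    h = 1e-5
    num_dw0 = (f_of(w0 + h, x0, w1, x1, w2) - f_of(w0 - h, x0, w1, x1, w2)) / (2 * h)
    print("analytic dw0 =", dw0, "  numerical dw0 =", num_dw0)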
Problem 2 (Practice of vector-based backpropagation). You are required to calculate the gradients of f(x, W) = ||σ(Wx)|| with respect to x_i and W_i,j. Here ||·|| denotes the L2 loss, W is a 3-by-3 matrix, x is a 3-by-1 vector, and σ(·) is the sigmoid function applied element-wise.
(a) Use a computational graph to perform the calculation.
(b) Based on (a), write a program to implement the computational graph and verify your answer in (a). You can use a vectorized approach to simplify your code. (A starting-point sketch follows the note below.)
Note: You are free to choose the inputs of your computational graph.
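As a rough starting point for Problem 2(b), the following is a minimal vectorized sketch using NumPy, again checked against a centered-difference numerical gradient. It interprets the L2 loss as the squared L2 norm, i.e. f(x, W) = ||sigmoid(Wx)||^2; if your reading of ||·|| is the plain norm, adjust the final node and its backward step accordingly. The random W and x are freely chosen inputs, as the note allows.

    # Sketch for Problem 2(b): vectorized backpropagation.
    # ASSUMPTION: the L2 loss is taken as the squared L2 norm of sigmoid(W x).
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((3, 3))    # freely chosen 3-by-3 input
    x = rng.standard_normal((3, 1))    # freely chosen 3-by-1 input

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Forward pass through the computational graph.
    z = W @ x                          # (3, 1) matrix-vector product node
    s = sigmoid(z)                     # (3, 1) element-wise sigmoid node
    f = np.sum(s ** 2)                 # scalar squared-L2-norm node

    # Backward pass (vectorized chain rule).
    ds = 2.0 * s                       # df/ds for the squared-norm node
    dz = ds * s * (1.0 - s)            # element-wise sigmoid derivative
    dW = dz @ x.T                      # df/dW, shape (3, 3)
    dx = W.T @ dz                      # df/dx, shape (3, 1)

    # Numerical check of dW with centered differences.
    h = 1e-5
    num_dW = np.zeros_like(W)
    for i in range(3):
        for j in range(3):
            Wp, Wm = W.copy(), W.copy()
            Wp[i, j] += h
            Wm[i, j] -= h
            num_dW[i, j] = (np.sum(sigmoid(Wp @ x) ** 2)
                            - np.sum(sigmoid(Wm @ x) ** 2)) / (2 * h)

    print("max |analytic dW - numerical dW| =", np.max(np.abs(dW - num_dW)))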