CS433 Labs: Exercise Set 5

1 Classification Using Linear Regression

Exercise 1:

Classification using linear regression.

•    Use the least squares implementation finished in the previous exercise to compute a weight vector w. Please COPY your previous implementation into the template least_squares.py.

•    Visualize the data points and the decision boundary, as in Figure 1 (a minimal sketch follows this list).
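A minimal sketch of this step, assuming labels y in {0, 1}, a feature matrix tx that includes a constant column, and the least_squares() signature from the previous exercise; the 0.5 threshold and the predict_labels() helper are illustrative assumptions, not part of the template:

```python
import numpy as np

def least_squares(y, tx):
    """Least squares via the normal equations: solve (tx^T tx) w = tx^T y."""
    return np.linalg.solve(tx.T @ tx, tx.T @ y)

def predict_labels(tx, w):
    """Threshold the linear scores at 0.5 to obtain {0, 1} class predictions."""
    return (tx @ w > 0.5).astype(int)
```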

2 Logistic Regression
Exercise 2:

Implement logistic regression using gradient descent.

•    Fill in the notebook function sigmoid().

•    Fill in the two notebook functions calculate_loss() and calculate_gradient(). The first function should return the negative log-likelihood; the second should return the corresponding gradient.

•    Implement gradient descent for logistic regression in the notebook function learning_by_gradient_descent(). You should calculate the loss and the gradient, update the weight w, and return the loss and the updated weight (a sketch of these functions follows below).
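A minimal sketch of these functions, assuming labels y in {0, 1} and 1-D NumPy arrays for y and w; the names follow this lab, but the exact template signatures may differ:

```python
import numpy as np

def sigmoid(t):
    """Apply the logistic function element-wise."""
    return 1.0 / (1.0 + np.exp(-t))

def calculate_loss(y, tx, w):
    """Negative log-likelihood: sum_n log(1 + exp(x_n^T w)) - y_n x_n^T w."""
    xw = tx @ w
    return np.sum(np.log(1.0 + np.exp(xw)) - y * xw)

def calculate_gradient(y, tx, w):
    """Gradient of the negative log-likelihood: X^T (sigma(Xw) - y)."""
    return tx.T @ (sigmoid(tx @ w) - y)

def learning_by_gradient_descent(y, tx, w, gamma):
    """One gradient descent step; returns the loss and the updated weight."""
    loss = calculate_loss(y, tx, w)
    w = w - gamma * calculate_gradient(y, tx, w)
    return loss, w
```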

 

Figure 1: Classification by least squares.

•    Plot the predictions to get a visualization similar to the right panel of Figure 1. Check whether you get similar or different results (a plotting helper is sketched after this list).

•    Do your results make sense?
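For the visualization, a hypothetical helper (plot_decision_boundary() is not part of the template) that scatters the two classes and draws the line x^T w = threshold, assuming two raw features plus a constant column in tx; use threshold 0.5 for the least squares fit and 0 for logistic regression:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_decision_boundary(y, x, w, threshold=0.0):
    """Scatter the data and draw the line w0 + w1*x1 + w2*x2 = threshold."""
    plt.scatter(x[y == 0, 0], x[y == 0, 1], marker="o", label="class 0")
    plt.scatter(x[y == 1, 0], x[y == 1, 1], marker="x", label="class 1")
    x1 = np.linspace(x[:, 0].min(), x[:, 0].max(), 100)
    x2 = (threshold - w[0] - w[1] * x1) / w[2]  # solve for the second feature
    plt.plot(x1, x2, "k-", label="decision boundary")
    plt.legend()
    plt.show()
```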

Once you have gradient descent working, it is also straightforward to implement Newton's method.

Exercise 3:

Newton’s method for logistic regression.

•    Fill in the notebook function calculate_hessian(). Then integrate your implementations, i.e., calculate_loss(), calculate_gradient(), and calculate_hessian(), into the function logistic_regression() for future use. The function should return the loss, the gradient, and the Hessian together.

•    Your gradient descent code can now be turned into a Newton's method algorithm. Please fill in the notebook function learning_by_newton_method(). The function should return the loss and the updated weight (see the sketch after this list).
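A minimal sketch of the Newton step, reusing sigmoid(), calculate_loss(), and calculate_gradient() from above; the step size gamma is an assumption here (a pure Newton step uses gamma = 1):

```python
import numpy as np

def calculate_hessian(y, tx, w):
    """Hessian X^T S X with S_nn = sigma(x_n^T w) (1 - sigma(x_n^T w))."""
    s = sigmoid(tx @ w)
    # Scale each row of tx by s_n (1 - s_n) instead of forming the N x N diagonal.
    return tx.T @ (tx * (s * (1.0 - s))[:, np.newaxis])

def logistic_regression(y, tx, w):
    """Return the loss, gradient, and Hessian at w."""
    return calculate_loss(y, tx, w), calculate_gradient(y, tx, w), calculate_hessian(y, tx, w)

def learning_by_newton_method(y, tx, w, gamma):
    """One Newton step; returns the loss and the updated weight."""
    loss, grad, hess = logistic_regression(y, tx, w)
    w = w - gamma * np.linalg.solve(hess, grad)
    return loss, w
```

Solving the linear system with np.linalg.solve() avoids explicitly inverting the Hessian, which is both cheaper and numerically safer.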

Exercise 4:

Penalized logistic regression.

•    Fill in the notebook function penalized_logistic_regression(). Note that it can be done by adding the regularization term λ‖w‖² to the loss. Set λ to a small value and check that you get the same result as before. Once it is done, please fill in the notebook function learning_by_penalized_gradient(), then increase the value of λ and check whether the norm of w shrinks.

•    Check whether this gives the same answer as plain gradient descent. To debug, print the function value and the norm of the gradient in every iteration; all of these values should decrease in every iteration. (A sketch follows this list.)
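A minimal sketch of the penalized variant, again reusing the functions above; whether the λ‖w‖² term is included in the reported loss is a convention choice, and here it is:

```python
import numpy as np

def penalized_logistic_regression(y, tx, w, lambda_):
    """Return the penalized loss and gradient (regularizer: lambda * ||w||^2)."""
    loss = calculate_loss(y, tx, w) + lambda_ * np.sum(w ** 2)
    grad = calculate_gradient(y, tx, w) + 2.0 * lambda_ * w
    return loss, grad

def learning_by_penalized_gradient(y, tx, w, gamma, lambda_):
    """One penalized gradient descent step; returns the loss and the updated weight."""
    loss, grad = penalized_logistic_regression(y, tx, w, lambda_)
    w = w - gamma * grad
    # Debugging check from the exercise: print loss and np.linalg.norm(grad)
    # every iteration; both should decrease.
    return loss, w
```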
