IE684 Lab 4

Exercise 1: Newton’s method

Recall that in the last lab, we implemented Newton's method as a special case of gradient descent with scaling. In this lab, we focus on the performance of Newton's method on a few problems. We consider the implementation illustrated in Algorithm 1.

 

Input: starting point x₀, stopping tolerance τ
Initialize k = 0
while ∥∇f(xₖ)∥₂ > τ do
    ηₖ = argmin_{η ≥ 0} f(xₖ − η(∇²f(xₖ))⁻¹∇f(xₖ))
    xₖ₊₁ = xₖ − ηₖ(∇²f(xₖ))⁻¹∇f(xₖ)
    k = k + 1
end
Output: xₖ

Algorithm 1: Newton's method
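
For reference, a minimal NumPy sketch of Algorithm 1 is given below. This is one possible implementation, not the handout's official code: the helper names backtracking and newton are placeholders, and the exact line search ηₖ = argmin_{η ≥ 0} f(·) is approximated by backtracking with the Armijo condition, which is the variant the questions below refer to. Passing fixed_step=1.0 recovers the pure Newton iteration with ηₖ = 1.

import numpy as np

def backtracking(f, gradf, x, d, alpha0=1.0, rho=0.5, gamma=0.5):
    """Backtracking (Armijo) line search: shrink alpha by rho until
    f(x + alpha*d) <= f(x) + gamma * alpha * grad(x)^T d."""
    alpha, fx, slope = alpha0, f(x), gradf(x) @ d
    while f(x + alpha * d) > fx + gamma * alpha * slope and alpha > 1e-16:
        alpha *= rho
    return alpha

def newton(f, gradf, hessf, x0, tau=1e-9, fixed_step=None, max_iter=10_000):
    """Algorithm 1. With fixed_step=None, eta_k comes from backtracking;
    fixed_step=1.0 gives the pure Newton step eta_k = 1."""
    x, k = np.asarray(x0, dtype=float), 0
    while np.linalg.norm(gradf(x)) > tau and k < max_iter:
        # Newton direction: solve H d = -g instead of forming H^{-1}.
        d = np.linalg.solve(hessf(x), -gradf(x))
        eta = fixed_step if fixed_step is not None else backtracking(f, gradf, x, d)
        x = x + eta * d
        k += 1
    return x, k

Solving the linear system ∇²f(xₖ)d = −∇f(xₖ) rather than forming the inverse Hessian is the standard numerically stable choice; np.linalg.solve raises LinAlgError if the Hessian is singular at an iterate, which is worth watching for in the experiments below.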

 

1. Please reuse the code you wrote in the past labs to implement Newton's method in Algorithm 1.

2. Consider the function f(x). Write code implementing all relevant modules necessary for Newton's method and the gradient descent method to solve minₓ f(x).

3. [R] Consider ηₖ = 1 for all k = 1, 2, … in Algorithm 1. With starting point x₀ = (2,2) and a stopping tolerance τ = 10⁻⁹, find the number of iterations taken by Newton's method. Compare this with the number of iterations taken by Newton's method with backtracking line search in Algorithm 1. Note the minimizer and the minimum objective function value in each case. Comment on your observations.

4. [R] Compare the number of iterations obtained for the two variants of Newton's method in the previous question with those of the gradient descent algorithm (without scaling) with backtracking line search and the gradient descent algorithm (with scaling using a diagonal matrix) with backtracking line search (both implemented in previous labs), with starting point (2,2). For the backtracking line search, use α₀ = 1, ρ = 0.5, γ = 0.5. Also compare the minimizer and minimum objective function value in each case. Comment on your observations. (A sketch of such a comparison driver follows this list.)
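
The comparison asked for in questions 3–4 can be driven as sketched below, assuming the newton and backtracking helpers given after Algorithm 1. Gradient descent without scaling reuses the same backtracking routine with the parameters α₀ = 1, ρ = 0.5, γ = 0.5. The objective f here is only a stand-in quadratic, since the lab's actual f(x) must come from the handout; substitute its definition, gradient, and Hessian. The diagonally scaled variant from the previous labs slots into the same driver.

def gradient_descent(f, gradf, x0, tau=1e-9, alpha0=1.0, rho=0.5,
                     gamma=0.5, max_iter=100_000):
    """Gradient descent (no scaling) with backtracking line search."""
    x, k = np.asarray(x0, dtype=float), 0
    while np.linalg.norm(gradf(x)) > tau and k < max_iter:
        d = -gradf(x)
        x = x + backtracking(f, gradf, x, d, alpha0, rho, gamma) * d
        k += 1
    return x, k

# Stand-in objective only -- replace with the lab sheet's f(x).
f     = lambda x: (x[0] - 1.0)**2 + 10.0 * (x[1] - 1.0)**2
gradf = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] - 1.0)])
hessf = lambda x: np.array([[2.0, 0.0], [0.0, 20.0]])

x0 = np.array([2.0, 2.0])
results = {
    "Newton, eta_k = 1":    newton(f, gradf, hessf, x0, fixed_step=1.0),
    "Newton, backtracking": newton(f, gradf, hessf, x0),
    "GD, backtracking":     gradient_descent(f, gradf, x0),
}
for name, (xs, iters) in results.items():
    print(f"{name:22s} iters = {iters:6d}  x* = {xs}  f(x*) = {f(xs):.3e}")

On a strictly convex quadratic stand-in, the pure Newton step converges in a single iteration; the interesting differences between the methods appear once the lab's actual f(x) is plugged in.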

Exercise 2:

Consider the function q(x).

1. Write code implementing all relevant modules necessary for Newton's method and the gradient descent method to solve minₓ q(x).

2. [R] Consider ηₖ = 1 for all k = 1, 2, … in Algorithm 1. With starting point x₀ = (2,2) and a stopping tolerance τ = 10⁻⁹, find the number of iterations taken by Newton's method. Compare this with the number of iterations taken by Newton's method with backtracking line search in Algorithm 1. Note the minimizer and the minimum objective function value in each case. Comment on your observations.

3. [R] Compare the number of iterations obtained for the two variants of Newton's method in the previous question with that of the gradient descent algorithm (without scaling) with backtracking line search (implemented in previous labs), with starting point (2,2). For the backtracking line search, use α₀ = 1, ρ = 0.5, γ = 0.5. Also compare the minimizer and minimum objective function value in each case. Comment on your observations.

4. [R] Consider ηₖ = 1 for all k = 1, 2, … in Algorithm 1. With starting point x₀ = (8,8) and a stopping tolerance τ = 10⁻⁹, find the number of iterations taken by Newton's method. Compare this with the number of iterations taken by Newton's method with backtracking line search in Algorithm 1. Note the minimizer and the minimum objective function value in each case. Comment on your observations.

5. [R] Compare the number of iterations obtained for the two variants of Newton's method in the previous question with that of the gradient descent algorithm (without scaling) with backtracking line search (implemented in previous labs), with starting point (8,8). For the backtracking line search, use α₀ = 1, ρ = 0.5, γ = 0.5. Also compare the minimizer and minimum objective function value in each case. Comment on your observations. (A driver that sweeps both starting points is sketched after this list.)
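
Questions 2–5 repeat the same experiment from two starting points, so a single driver can sweep both x₀ = (2,2) and x₀ = (8,8), again assuming the newton, backtracking, and gradient_descent helpers sketched above. As before, q, gradq, and hessq are placeholders to be replaced with the actual q(x) and its derivatives.

# Stand-in objective only -- replace with the lab sheet's q(x).
q     = lambda x: 3.0 * x[0]**2 + x[1]**2
gradq = lambda x: np.array([6.0 * x[0], 2.0 * x[1]])
hessq = lambda x: np.array([[6.0, 0.0], [0.0, 2.0]])

for x0 in (np.array([2.0, 2.0]), np.array([8.0, 8.0])):
    for name, (xs, iters) in {
        "Newton, eta_k = 1":    newton(q, gradq, hessq, x0, fixed_step=1.0),
        "Newton, backtracking": newton(q, gradq, hessq, x0),
        "GD, backtracking":     gradient_descent(q, gradq, x0),
    }.items():
        print(f"x0 = {x0}  {name:22s} iters = {iters:6d}  q(x*) = {q(xs):.3e}")

Any qualitative difference between the two starting points (for example, the pure Newton run overshooting or hitting a singular Hessian where the backtracking variant does not) is the kind of behaviour the "comment on your observations" parts ask about.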
