Homework 4
Convex Optimization
1. Consider the optimization
\[ \min_{x}\; e^{x_1+3x_2-0.1} + e^{x_1-3x_2-0.1} + e^{-x_1-0.1} \]
Write code to solve this problem using the gradient method with backtracking parameters α = 0.1 and β = 0.6. Plot f(x^{(k)}) versus k for k = 0, 1, 2, ..., 50 on a log-linear plot.
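A minimal sketch of the gradient method with backtracking for this objective (the starting point x^{(0)} = (1, 1) is an assumption, since the handout does not specify one):

```python
import numpy as np

def f(x):
    # f(x) = e^{x1+3x2-0.1} + e^{x1-3x2-0.1} + e^{-x1-0.1}
    return (np.exp(x[0] + 3 * x[1] - 0.1)
            + np.exp(x[0] - 3 * x[1] - 0.1)
            + np.exp(-x[0] - 0.1))

def grad(x):
    a = np.exp(x[0] + 3 * x[1] - 0.1)
    b = np.exp(x[0] - 3 * x[1] - 0.1)
    c = np.exp(-x[0] - 0.1)
    return np.array([a + b - c, 3 * a - 3 * b])

def gradient_method(x0, alpha=0.1, beta=0.6, iters=50):
    x = np.asarray(x0, dtype=float)
    history = [f(x)]
    for _ in range(iters):
        g = grad(x)
        # backtracking line search: shrink t until the sufficient-decrease
        # condition f(x - t g) <= f(x) - alpha * t * ||g||^2 holds
        t = 1.0
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
        history.append(f(x))
    return x, history

x_star, hist = gradient_method([1.0, 1.0])
```

The required figure can then be produced with `matplotlib.pyplot.semilogy(hist)`, which plots f(x^{(k)}) against k on a log-linear scale.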
2. Consider the optimization
min
where n = 5000 and the a_i are randomly generated vectors. Solve this problem using Newton's method with a backtracking line search (α = 0.01 and β = 0.5). Plot f(x^{(k)}) versus k for k = 0, 1, 2, ..., 30 on a log-linear plot.
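The objective function did not survive in this copy of the handout, so the sketch below substitutes a hypothetical smooth convex objective, f(x) = Σ_i log(1 + exp(a_i^T x)), purely to illustrate the Newton iteration with the stated backtracking parameters; the dimension m = 50 of x is also an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5000, 50                      # n random vectors a_i in R^m (m assumed)
A = rng.standard_normal((n, m))

# Placeholder objective (the handout's f is not shown):
# f(x) = sum_i log(1 + exp(a_i^T x))
def f(x):
    return np.sum(np.logaddexp(0.0, A @ x))

def grad(x):
    s = 1.0 / (1.0 + np.exp(-(A @ x)))   # sigmoid(a_i^T x)
    return A.T @ s

def hess(x):
    s = 1.0 / (1.0 + np.exp(-(A @ x)))
    w = s * (1.0 - s)
    return (A * w[:, None]).T @ A        # sum_i w_i a_i a_i^T

def newton_method(x0, alpha=0.01, beta=0.5, iters=30):
    x = np.asarray(x0, dtype=float)
    history = [f(x)]
    for _ in range(iters):
        g = grad(x)
        dx = np.linalg.solve(hess(x), -g)    # Newton step
        # backtracking on f(x + t dx) <= f(x) + alpha * t * g^T dx
        t = 1.0
        while f(x + t * dx) > f(x) + alpha * t * (g @ dx):
            t *= beta
        x = x + t * dx
        history.append(f(x))
    return x, history

x_star, hist = newton_method(np.zeros(m))
```

As in problem 1, `matplotlib.pyplot.semilogy(hist)` gives the requested log-linear plot; only the definitions of `f`, `grad`, and `hess` need to change once the actual objective is substituted.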
3. Derive the distributed ADMM updates for the SVM problem.
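A sketch of one common derivation, using the consensus form of ADMM: assume the training examples are split row-wise into $N$ blocks, each block $i$ holding a matrix $A_i$ whose rows are $-y_j(a_j^T, 1)$ (so that the hinge loss of block $i$ is $\mathbf{1}^T (A_i x_i + \mathbf{1})_+$), with local variables $x_i = (w_i, b_i)$ tied to a global variable $z$. The notation $\lambda$ (regularization weight) and $\rho$ (ADMM penalty) is ours. The SVM then reads

\[
\min_{x_1,\dots,x_N,\,z}\;\; \sum_{i=1}^{N} \mathbf{1}^T (A_i x_i + \mathbf{1})_+ \;+\; \frac{\lambda}{2}\|z\|_2^2
\qquad \text{s.t.}\quad x_i = z,\;\; i = 1,\dots,N,
\]

and the scaled-form ADMM updates, with $u_i$ the scaled dual variable for the constraint $x_i = z$ and $\bar{x}, \bar{u}$ the block averages, are

\begin{align*}
x_i^{k+1} &= \operatorname*{argmin}_{x_i}\;\Bigl( \mathbf{1}^T (A_i x_i + \mathbf{1})_+ + \tfrac{\rho}{2}\|x_i - z^k + u_i^k\|_2^2 \Bigr),\\
z^{k+1} &= \frac{N\rho}{\lambda + N\rho}\,\bigl(\bar{x}^{k+1} + \bar{u}^k\bigr),\\
u_i^{k+1} &= u_i^k + x_i^{k+1} - z^{k+1}.
\end{align*}

The $x_i$-updates are independent hinge-loss proximal problems (small QPs) solvable in parallel, one per data block; the $z$-update follows from the quadratic minimization $\min_z \frac{\lambda}{2}\|z\|_2^2 + \frac{\rho}{2}\sum_i \|x_i^{k+1} - z + u_i^k\|_2^2$ in closed form.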