NCTU-CS Assignment #3 - Linear Regression & Logistic Regression

Introduction to Machine Learning Program Assignment #3 - Linear Regression & Logistic Regression 


This programming assignment aims to help you understand Linear Regression and Logistic Regression.

You can still use Discord for Q&A.

Join the Discord server for TA support.

It is the same server used for program assignments #1 and #2.
Ask questions on it, and we shall reply. (We won't respond to raised hands.)
Try not to ask for obvious answers or bug fixes.
Memes and chit-chat are welcome.
Objective
·       Linear Regression - 55% + (10%)

Data Generation - 15% 
Randomly generate 1000 (x_i, y_i) pairs which follow equation (1):

y_i = 3x_i^3 + 2x_i^2 − 3x_i + 1 + ε_i    (1)

where −1.5 < x_i < 1.0, ε_i ∼ N(0, 0.5), and N represents the Normal distribution.
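A minimal NumPy sketch of this generation step (the seed is an arbitrary choice for reproducibility, and N(0, 0.5) is read here as variance 0.5 — check which convention your course intends):

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed, assumed for reproducibility

n = 1000
x = rng.uniform(-1.5, 1.0, size=n)            # -1.5 < x_i < 1.0
eps = rng.normal(0.0, np.sqrt(0.5), size=n)   # eps_i ~ N(0, 0.5), 0.5 read as variance
y = 3 * x**3 + 2 * x**2 - 3 * x + 1 + eps     # equation (1)
```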
 

Data Preprocessing - 10% 
Generate degree-K polynomial features x̂ from x:

x̂_i = [1, x_i, x_i^2, …, x_i^K]^T

You must experiment with 4 different K settings: K = 1, 2, 3, 4.
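One way to build these features is NumPy's Vandermonde matrix; a sketch (`poly_features` is a hypothetical helper name, not part of the assignment):

```python
import numpy as np

def poly_features(x, K):
    """Map each scalar x_i to the row [1, x_i, x_i**2, ..., x_i**K]."""
    return np.vander(x, N=K + 1, increasing=True)

x = np.array([2.0, -1.0])
X_hat = poly_features(x, K=3)   # shape (2, 4): one feature row per sample
```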
Model Construction - 20% 
Linear Regression 
Which makes predictions ŷ = w · x̂, such that

w = argmin_{w'} ||y − w' x̂||^2

You must construct Linear Regression models to fit and predict data generated by equation (1).
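The least-squares weights can be found directly with `np.linalg.lstsq`; as a sanity check (a sketch, not required by the assignment), fitting noise-free data from equation (1) with K = 3 should recover the true coefficients [1, −3, 2, 3], ordered from the constant term up:

```python
import numpy as np

def fit_linear_regression(X_hat, y):
    """Least-squares weights w = argmin ||y - X_hat @ w||^2."""
    w, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return w

x = np.linspace(-1.5, 1.0, 50)
X_hat = np.vander(x, N=4, increasing=True)     # K = 3 features [1, x, x^2, x^3]
y = 3 * x**3 + 2 * x**2 - 3 * x + 1            # equation (1) without noise
w = fit_linear_regression(X_hat, y)            # should be close to [1, -3, 2, 3]
```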
Validation - 0% 
Due to the simplicity of Linear Regression, you are not required to implement validation methods.
Results - 10% + (10%) 
Show the fitted weights and the equations
Show the predicted ŷ for −1.5 < x < 1.0
Bonus - show the results in a single figure - (10%)
 
Legend equations must be written in LaTeX
Use × instead of ∗ to represent multiplication operations
Use x_i instead of x
Limit the floating-point numeric weights to 2 decimal places 
i.e. no 1.54323423456, but 1.54
There should be no redundant signs before weights, i.e. no 1 + −3.36 × x_i
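The single-figure bonus and the legend rules above can be sketched as follows; `equation_label` is a hypothetical helper, and the all-ones weights are placeholders — substitute the weights from your own fitted models:

```python
import numpy as np
import matplotlib.pyplot as plt

def equation_label(w):
    """Format weights as a LaTeX legend: 2 decimals, x_i terms, no '+-' signs."""
    terms = [f"{w[0]:.2f}"]
    for k in range(1, len(w)):
        sign = "-" if w[k] < 0 else "+"          # fold the sign into the operator
        power = "" if k == 1 else f"^{{{k}}}"
        terms.append(f" {sign} {abs(w[k]):.2f} \\times x_i{power}")
    return "$y_i = " + "".join(terms) + "$"

x = np.linspace(-1.5, 1.0, 200)
fig, ax = plt.subplots()
for K in (1, 2, 3, 4):
    w = np.ones(K + 1)                           # placeholder weights
    X_hat = np.vander(x, N=K + 1, increasing=True)
    ax.plot(x, X_hat @ w, label=equation_label(w))
ax.legend()
fig.savefig("linreg_results.png")
```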
·       Logistic Regression - 45% + (10%)

Data Generation - 15% 
Randomly generate 1000 (x_i0, x_i1, y_i) triplets which follow equation (2):

[x_i0, x_i1]^T ∼ N([y_i, y_i]^T, [[0.1, 0], [0, 0.1]])    (2)

where y_i is randomly assigned as 0 or 1.
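A sketch of this generation step, again reading 0.1 as the variance of each coordinate (the diagonal covariance means the two coordinates can be sampled independently):

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed, assumed for reproducibility

n = 1000
y = rng.integers(0, 2, size=n)                          # y_i randomly 0 or 1
# [x_i0, x_i1]^T ~ N([y_i, y_i]^T, 0.1 * I): both coords centred at y_i
X = y[:, None] + rng.normal(0.0, np.sqrt(0.1), size=(n, 2))
```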
 

Model Construction - 20% 
Logistic Regression 
Whose decision function M_w uses the Logistic function L to perform classification:

M_w(x_i) = L(w · x) = 1 / (1 + e^(−w · x))

Takes the L2-norm as the objective function to optimize the weights w:

w = argmin_{w'} ||y − M_{w'}(x)||^2

Construct a Logistic Regression model to predict y_i from [x_i0, x_i1]^T generated by equation (2).
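Because the objective here is the L2 norm rather than the usual cross-entropy, there is no closed form; a plain gradient-descent sketch (the learning rate, iteration count, and helper names are arbitrary choices, not part of the assignment):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_l2(X, y, lr=0.5, n_iter=5000):
    """Gradient descent on the assignment's objective ||y - sigmoid(X @ w)||^2."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])    # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = sigmoid(Xb @ w)
        grad = -2 * Xb.T @ ((y - p) * p * (1 - p))   # chain rule through the sigmoid
        w -= lr * grad / len(y)
    return w

# accuracy on data generated as in equation (2)
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)
X = y[:, None] + rng.normal(0.0, np.sqrt(0.1), size=(1000, 2))
w = fit_logistic_l2(X, y)
pred = (sigmoid(np.hstack([np.ones((1000, 1)), X]) @ w) >= 0.5).astype(int)
accuracy = (pred == y).mean()
```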
Validation - 0% 
Validation methods are not required in this assignment either.
Results - 10% + (10%) 
Show the model accuracy - 5%
Show the model weights and the corresponding terms - 5% 
e.g.
y_i = L(4.2 + 7.7 × x_i0 + 6.9 × x_i1)

Bonus - show the decision boundary with a figure - (10%)
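For the bonus figure, the boundary of y = L(w_0 + w_1 x_0 + w_2 x_1) is the line where the argument is zero, i.e. x_1 = −(w_0 + w_1 x_0) / w_2. A sketch with hypothetical weights — substitute your fitted ones:

```python
import numpy as np
import matplotlib.pyplot as plt

# hypothetical weights [w0, w1, w2] for y = L(w0 + w1*x0 + w2*x1)
w = np.array([-7.0, 7.0, 7.0])

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)
X = y[:, None] + rng.normal(0.0, np.sqrt(0.1), size=(1000, 2))

fig, ax = plt.subplots()
ax.scatter(X[y == 0, 0], X[y == 0, 1], s=8, label="$y_i = 0$")
ax.scatter(X[y == 1, 0], X[y == 1, 1], s=8, label="$y_i = 1$")
# decision boundary: w0 + w1*x0 + w2*x1 = 0  ->  x1 = -(w0 + w1*x0) / w2
x0 = np.linspace(X[:, 0].min(), X[:, 0].max(), 2)
ax.plot(x0, -(w[0] + w[1] * x0) / w[2], "k--", label="decision boundary")
ax.legend()
fig.savefig("logreg_boundary.png")
```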
 
·       Finish during class - 20%

Submit your report and source codes to the newE3 system before class ends.
Finish time will be determined by the submission time.
Submission & Scoring Policy
Please submit a zip file, which contains the following, to the newE3 system. 
Report 
Explanation of how your code works.
All the content mentioned above.
Your name and student ID at the very beginning - 10%
Accepted format: HTML
Source codes 
Accepted language: Python 3
Accepted format: .ipynb
Package-provided models are allowed
Your score will be determined mainly by the submitted report. 
If there’s any problem with your code, the TA might ask you (through email) to demo it. Otherwise, no demo is needed.
Scores will be adjusted at the end of the semester for them to fit the school regulations.
Plagiarizing is not allowed. 
You will get ZERO on that homework if you get caught the first time.
The second time, you’ll FAIL this class.
Tools that might be useful
Jupyter Lab, pre-installed in PC classrooms
Numpy - Math thingy
matplotlib - Plot thingy
pandas - Data thingy
scipy - Science thingy
scikit-learn - Machine learning thingy
