Machine Learning Homework 3 

TAs' email address: jhhlab.tw@gmail.com 

 

Description: 

 

1.    Random Data Generator 

a. Univariate Gaussian data generator 

Input: expectation value m and variance s. 

Output: one data point drawn from N(m, s). 

b. Polynomial basis linear model data generator: y = w^T phi(x) + e, with e ~ N(0, a) 

Input: basis number n, noise parameter a, and weight vector w. 

Output: one data point (x, y). 
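A minimal sketch of the two generators, assuming the Box-Muller transform for (1.a), a polynomial basis phi(x) = [1, x, ..., x^(n-1)] for (1.b), and x sampled uniformly from (-1, 1); the sampling range and function names are illustrative choices, not part of the spec.

import numpy as np

def gaussian_data_generator(m, s):
    # One sample from N(m, s) via Box-Muller; s is the VARIANCE.
    u1 = 1.0 - np.random.uniform()          # in (0, 1], avoids log(0)
    u2 = np.random.uniform()
    z = np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)  # z ~ N(0, 1)
    return m + np.sqrt(s) * z

def polynomial_basis_data_generator(n, a, w):
    # One (x, y) pair from y = w^T phi(x) + e, phi(x) = [1, x, ..., x^(n-1)],
    # e ~ N(0, a); x is assumed uniform on (-1, 1).
    x = np.random.uniform(-1.0, 1.0)
    phi = np.array([x ** i for i in range(n)])
    e = gaussian_data_generator(0.0, a)
    return x, float(np.dot(w, phi)) + e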

2.    Sequential Estimator 
Sequentially estimate the mean and variance. 

Data is given by the univariate Gaussian data generator (1.a). 

Input: same as in (1.a). 

Function: 

Call (1.a) to get a new data point from N(m, s). 

Use sequential estimation to find the current estimates of the mean m and the variance s. 
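One common way to do this is Welford's online algorithm, which updates the running mean and sample variance one point at a time. A minimal sketch follows; the eps threshold, max_iter, and the use of np.random.normal as a stand-in for your (1.a) generator are illustrative assumptions.

import numpy as np

def sequential_estimator(m, s, eps=1e-4, max_iter=100000):
    count, mean, M2 = 0, 0.0, 0.0            # M2 = running sum of squared deviations
    prev_mean, prev_var = None, None
    for _ in range(max_iter):
        x = np.random.normal(m, np.sqrt(s))  # stand-in for your (1.a) generator
        count += 1
        delta = x - mean
        mean += delta / count                # updated running mean
        M2 += delta * (x - mean)             # uses the updated mean
        var = M2 / (count - 1) if count > 1 else 0.0
        print(f"Add data point: {x:.5f}")
        print(f"Mean = {mean:.5f}   Variance = {var:.5f}")
        if (prev_mean is not None
                and abs(mean - prev_mean) < eps
                and abs(var - prev_var) < eps):
            break                            # estimates have stopped moving
        prev_mean, prev_var = mean, var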

3.    Bayesian Linear Regression 

Input 

The precision (i.e., b) for the initial prior. 

All other required inputs for the polynomial basis linear model generator (1.b). 

Function 

Call (1.b) to generate one data point. 

Update the prior, and calculate the parameters of the predictive distribution. 

Repeat the steps above until the posterior probability converges. 

Output 

Print the new data point and the current parameters for the posterior and the predictive distribution. 
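A minimal sketch of one update step, assuming a zero-mean prior w ~ N(0, b^-1 I), the design vector phi(x) = [1, x, ..., x^(n-1)], and a treated as the noise variance of (1.b), so the likelihood precision is 1/a; if your generator defines a as a precision instead, replace 1/a with a.

import numpy as np

def bayesian_update(x, y, m_prior, S_prior, a, n):
    # m_prior, S_prior: prior mean (n,) and prior PRECISION matrix (n, n).
    phi = np.array([x ** i for i in range(n)]).reshape(-1, 1)   # (n, 1) design vector
    S_post = S_prior + (1.0 / a) * phi @ phi.T                  # posterior precision
    m_post = np.linalg.solve(S_post,
                             S_prior @ m_prior + (1.0 / a) * y * phi.flatten())
    # Predictive distribution at this x: N(phi^T m_post, a + phi^T S_post^{-1} phi)
    pred_mean = float(phi.T @ m_post)
    pred_var = a + float(phi.T @ np.linalg.solve(S_post, phi))
    return m_post, S_post, pred_mean, pred_var

Start from m_prior = np.zeros(n) and S_prior = b * np.eye(n), feed every new point from (1.b) through this update, and stop once the posterior mean stops changing.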

After the probability has converged, do the visualization (a plotting sketch is given after the hint below): 

Ground truth function (from the linear model generator) 

Final prediction result 

At the time you have seen 10 data points 

At the time you have seen 50 data points 

Except for the ground truth, you also have to draw the data points you have seen so far. 

Draw a black line to represent the mean of the function at each point. 

Draw two red lines to represent the variance of the function at each point. 

In other words, the distance between a red line and the mean is ONE variance. 

 Hint: Online learning 
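A minimal plotting sketch for one panel of the figure, assuming matplotlib and the posterior (m_post, S_post) and noise variance a from the update sketch above; the x-range (-2, 2), grid size, and marker size are illustrative assumptions.

import numpy as np
import matplotlib.pyplot as plt

def plot_predictive(m_post, S_post, a, n, seen_points, title):
    # Black line: predictive mean; red lines: mean +/- one variance;
    # dots: the data points seen so far.
    xs = np.linspace(-2.0, 2.0, 200)
    means, variances = [], []
    for x in xs:
        phi = np.array([x ** i for i in range(n)]).reshape(-1, 1)
        means.append(float(phi.T @ m_post))
        variances.append(a + float(phi.T @ np.linalg.solve(S_post, phi)))
    means, variances = np.array(means), np.array(variances)

    plt.title(title)
    if seen_points:
        px, py = zip(*seen_points)
        plt.scatter(px, py, s=8)
    plt.plot(xs, means, color="black")
    plt.plot(xs, means + variances, color="red")
    plt.plot(xs, means - variances, color="red")
    plt.show()

For the ground-truth panel, use the true weights w in place of m_post and a constant variance a, with no data points drawn.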

Sample input & output (for reference only)

1.   b = 1, n = 4, a = 1, w = [1, 2, 3, 4] 

 

 
Predictive distribution ~ N(0.06869, 1.66008) 

Add data point (-0.19330, 0.24507): 

Posterior mean: 
  0.5760972313 
  0.2450231522 
  -0.0801842453 
  0.0504992402 

Posterior variance: 
  0.2867129751, 0.1311255325, -0.0767580827, 0.0438488542 
  0.1311255325, 0.7892001707, 0.1242887609, -0.0801412282 
  -0.0767580827, 0.1242887609, 0.9176812972, 0.0541575540 
  0.0438488542, -0.0801412282, 0.0541575540, 0.9642058389 

Predictive distribution ~ N(0.62305, 1.34848) 

... 

Add data point (-0.76990, -0.34768): 

Posterior mean: 
  0.9107496675 
  1.9265499885 
  3.1119297129 
  4.1312375189 

Posterior variance: 
  0.0051883836, -0.0004416700, -0.0086000319, 0.0008247001 
  -0.0004416700, 0.0401966605, 0.0012708906, -0.0554822477 
  -0.0086000319, 0.0012708906, 0.0265353911, -0.0031205875 
  0.0008247001, -0.0554822477, -0.0031205875, 0.0937197255 

Predictive distribution ~ N(-0.61566, 1.00921) 

Add data point (0.36500, 2.22705): 

Posterior mean: 
  0.9107404583 
  1.9265225090 
  3.1119408740 
  4.1312734131 

Posterior variance: 
  0.0051731092, -0.0004872471, -0.0085815201, 0.0008842340 
 


 

2.   b = 100, n = 4, a = 1, w = [1, 2, 3, 4] 

 

 


3.   b = 1, n = 3, a = 3, w = [1, 2, 3] 

