ML Homework 2: Naive Bayes Classifier

Create a Naive Bayes classifier for handwritten digits that supports both discrete and continuous features.

Input:

1.   Training image data from MNIST

You must download the MNIST data from this website and parse it yourself. (Please do not use the built-in dataset, or you will not get 100.) Please read the description in the link to understand the format.

Basically, each image consists of 28x28 pixels, one unsigned byte each (the whole binary file is in big-endian format, which you need to handle); you can use a char array to store an image.

There are also some headers you need to handle; please read the link for more details. (A parsing sketch is given after the tables below.)

2.   Training label data from MNIST

3.   Testing image data from MNIST

4.   Testing label data from MNIST

5.   Toggle option

 0: discrete mode

1: continuous mode

TRAINING SET IMAGE FILE (train-images-idx3-ubyte)

offset  type            value              description
0000    32 bit integer  0x00000803 (2051)  magic number
0004    32 bit integer  60000              number of images
0008    32 bit integer  28                 number of rows
0012    32 bit integer  28                 number of columns
0016    unsigned byte   ??                 pixel
0017    unsigned byte   ??                 pixel
...     ...             ...                ...
xxxx    unsigned byte   ??                 pixel
TRAINING SET LABEL FILE (train-labels-idx1-ubyte)

offset  type            value              description
0000    32 bit integer  0x00000801 (2049)  magic number
0004    32 bit integer  60000              number of items
0008    unsigned byte   ??                 label
0009    unsigned byte   ??                 label
...     ...             ...                ...
xxxx    unsigned byte   ??                 label
The label values are from 0 to 9.
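
As a minimal illustration of parsing these two files (a sketch only: the function names are placeholders, and numpy is used purely as convenient array storage, not to load any built-in dataset), the big-endian headers can be read with struct.unpack:

```python
import struct
import numpy as np

def read_idx_images(path):
    """Read an IDX3 image file: 16-byte big-endian header, then raw pixel bytes."""
    with open(path, "rb") as f:
        magic, n_images, n_rows, n_cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051                      # magic number from the table above
        pixels = np.frombuffer(f.read(n_images * n_rows * n_cols), dtype=np.uint8)
    return pixels.reshape(n_images, n_rows * n_cols)

def read_idx_labels(path):
    """Read an IDX1 label file: 8-byte big-endian header, then one byte per label."""
    with open(path, "rb") as f:
        magic, n_items = struct.unpack(">II", f.read(8))
        assert magic == 2049                      # magic number from the table above
        labels = np.frombuffer(f.read(n_items), dtype=np.uint8)
    return labels
```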

 Output:

Print out the posterior (in log scale to avoid underflow) of the ten categories (0-9) for each image in INPUT 3. Don't forget to marginalize them so that they sum to 1.

   For each test image, print out your prediction, which is the category with the highest posterior, and tally the predictions by comparing with INPUT 4 (a classification sketch follows this section).

 Print out the imagination of numbers in your Bayes classifier

For each digit, print a 28x28 binary image in which 0 represents a white pixel and 1 represents a black pixel.

A pixel is 0 when the Bayes classifier expects the value at that position to be less than 128 in the original image, and 1 otherwise (an imagination sketch is given below). Calculate and report the error rate at the end.
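
The following is a non-authoritative sketch of the classification output. It assumes hypothetical inputs `log_prior` (a length-10 array of log class priors) and `log_likelihood_fn(image, c)` (the sum of per-pixel log likelihoods for class c, from either mode described under Function). Dividing the log posteriors by their sum is one common reading of "marginalize so they sum to 1"; the prediction itself takes the largest unnormalized log posterior.

```python
import numpy as np

def classify_and_report(test_images, test_labels, log_prior, log_likelihood_fn):
    """Print normalized log posteriors and predictions; return the error rate."""
    errors = 0
    for image, truth in zip(test_images, test_labels):
        # Unnormalized log posterior: log P(c) + sum_i log P(x_i | c)
        log_post = np.array([log_prior[c] + log_likelihood_fn(image, c)
                             for c in range(10)])
        normalized = log_post / log_post.sum()   # printed values sum to 1
        prediction = int(np.argmax(log_post))    # category with highest posterior
        print("Posterior (in log scale):")
        for c in range(10):
            print(f"{c}: {normalized[c]}")
        print(f"Prediction: {prediction}, Ans: {int(truth)}\n")
        errors += prediction != truth
    return errors / len(test_labels)             # error rate reported at the end
```

For the imagination output, a minimal sketch assuming a hypothetical `expected_pixel` array of shape (10, 784) holding the pixel value the classifier expects at each position (for example, the fitted Gaussian mean in continuous mode, or an expectation over the 32 bins in discrete mode):

```python
import numpy as np

def print_imagination(expected_pixel, n_rows=28, n_cols=28):
    """Print a 28x28 binary image per digit: 1 where the expected value >= 128."""
    for digit in range(10):
        print(f"{digit}:")
        bits = (expected_pixel[digit] >= 128).astype(int)
        for r in range(n_rows):
            row = bits[r * n_cols:(r + 1) * n_cols]
            print(" ".join(str(b) for b in row))
        print()
```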

 Function:

1.   In Discrete mode:

    Tally the frequency of the values of each pixel into 32 bins. For example, gray levels 0 to 7 go to bin 0, gray levels 8 to 15 go to bin 1, and so on. Then perform the Naive Bayes classification. Note that to avoid empty bins, you can use a pseudocount (such as the minimum value in the other bins) instead. (A sketch follows this list.)

2.   In Continuous mode:

    Use MLE to fit a Gaussian distribution for the value of each pixel, then perform the Naive Bayes classification. (A sketch follows this list.)
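
A sketch of the discrete mode under the description above; the pseudocount choice here (the minimum nonzero count of the same pixel) is only one of the options the spec allows:

```python
import numpy as np

def train_discrete(images, labels, n_classes=10, n_bins=32):
    """Per-class, per-pixel bin frequencies; gray level g falls in bin g // 8."""
    n_pixels = images.shape[1]
    binned = images // 8                                  # map 0-255 to bins 0-31
    counts = np.zeros((n_classes, n_pixels, n_bins))
    for c in range(n_classes):
        for img in binned[labels == c]:
            counts[c, np.arange(n_pixels), img] += 1
    # Avoid empty bins with a pseudocount, here the minimum nonzero count
    # of that pixel (one possible choice; any small pseudocount works).
    for c in range(n_classes):
        for p in range(n_pixels):
            nonzero = counts[c, p][counts[c, p] > 0]
            if nonzero.size:
                counts[c, p][counts[c, p] == 0] = nonzero.min()
    return np.log(counts / counts.sum(axis=2, keepdims=True))  # log P(bin | pixel, c)
```

The per-image log likelihood in discrete mode is then the sum over pixels p of log_p[c, p, image[p] // 8].

For the continuous mode, a sketch of the per-pixel Gaussian MLE (sample mean and variance) and the corresponding log likelihood; the variance floor is an assumption added here to avoid division by zero on constant pixels:

```python
import numpy as np

def train_continuous(images, labels, n_classes=10, var_floor=1e-2):
    """MLE Gaussian fit (sample mean and variance) per class and pixel."""
    n_pixels = images.shape[1]
    mean = np.zeros((n_classes, n_pixels))
    var = np.zeros((n_classes, n_pixels))
    for c in range(n_classes):
        x = images[labels == c].astype(np.float64)
        mean[c] = x.mean(axis=0)
        var[c] = x.var(axis=0) + var_floor    # floor is an assumption, not in the spec
    return mean, var

def gaussian_log_likelihood(image, c, mean, var):
    """Sum over pixels of log N(x_i; mean[c, i], var[c, i])."""
    x = image.astype(np.float64)
    return np.sum(-0.5 * np.log(2.0 * np.pi * var[c])
                  - (x - mean[c]) ** 2 / (2.0 * var[c]))
```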
