CPSC 585: Project 1

Platforms
For this project (and, in general, for most machine learning or data science projects), you will need a Jupyter notebook running Python 3. Jupyter allows you to create documents mixing text, equations, code, and visualizations.

The Jupyter project itself recommends Anaconda if you intend to run notebooks locally on a laptop or desktop computer, but there are several cloud services that offer free Jupyter notebooks, including Microsoft Azure Notebooks and Google Colaboratory.

Libraries
You may write your code using standard Python 3, but you are encouraged to use NumPy to implement vector operations such as the dot product. Use the Matplotlib library to generate plots. You may not use machine learning libraries such as scikit-learn.
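
For instance, a perceptron's net input is just a dot product plus a bias; a minimal NumPy sketch (the values here are placeholders):

    import numpy as np

    x = np.array([1, 0, 1, 1, 0])                    # example input vector (placeholder values)
    w = np.random.uniform(-0.5, 0.5, size=x.shape)   # small random weights
    b = np.random.uniform(-0.5, 0.5)                 # small random bias

    net = np.dot(w, x) + b                           # net input to the perceptron
    output = 1 if net >= 0 else 0                    # step activation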

Dataset
The file dataset.py contains the letters A through Z as 5-by-7 dot-matrix fonts, in a format similar to the Hebb Net example for character recognition. There are two different fonts, one for training and one for testing.
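
The exact layout of dataset.py is not reproduced here, but as a rough illustration, a 5-by-7 glyph might be stored as rows of '#' and '.' characters and flattened into a 35-element vector (the glyph and helper below are hypothetical, not the actual dataset format):

    import numpy as np

    # Hypothetical 5-by-7 glyph for the letter A ('#' = on pixel, '.' = off pixel);
    # the actual encoding in dataset.py may differ.
    A = [
        "..#..",
        ".#.#.",
        "#...#",
        "#...#",
        "#####",
        "#...#",
        "#...#",
    ]

    def to_vector(glyph):
        """Flatten a list of 5-character rows into a 35-element 0/1 NumPy vector."""
        return np.array([1 if ch == "#" else 0 for row in glyph for ch in row])

    x = to_vector(A)   # shape (35,)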

Training Perceptrons
Implement one perceptron for each letter, for a total of 26. Each perceptron should learn to output 1 for its assigned letter and 0 for all other letters. Begin with weights and biases initialized to random values (rather than zero) for each perceptron, and apply the perceptron learning algorithm until all items in the training set are classified correctly or until it becomes clear that the weights will not converge.
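
A minimal sketch of a single learning step, assuming 0/1 targets, a step activation with threshold 0, and a small learning rate (the function names are placeholders, not a required interface):

    import numpy as np

    def predict(w, b, x):
        """Step activation: output 1 if the net input is non-negative, else 0."""
        return 1 if np.dot(w, x) + b >= 0 else 0

    def update(w, b, x, target, lr=0.1):
        """One perceptron learning step; the weights change only on a mistake."""
        error = target - predict(w, b, x)   # -1, 0, or +1
        w = w + lr * error * x              # nudge the decision boundary toward the target
        b = b + lr * error
        return w, b, error != 0             # also report whether this item was misclassified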

As each item in the training set is classified by each perceptron, record whether the output was correct or not. At the end of each pass through the training set by a perceptron, record the error rate (number of misclassified items) of that perceptron for that epoch.
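
One way to organize the outer loop and record the per-epoch error counts, building on the predict/update sketch above (training_set, max_epochs, and the return values are assumptions about how you might structure your own code):

    import numpy as np

    def train(training_set, lr=0.1, max_epochs=100):
        """Train one perceptron; return its weights, bias, and per-epoch error counts."""
        n_inputs = len(training_set[0][0])
        w = np.random.uniform(-0.5, 0.5, size=n_inputs)
        b = np.random.uniform(-0.5, 0.5)
        errors_per_epoch = []
        for epoch in range(max_epochs):
            errors = 0
            for x, target in training_set:    # training_set: list of (vector, 0/1 target) pairs
                w, b, missed = update(w, b, x, target, lr)
                errors += int(missed)
            errors_per_epoch.append(errors)   # error count for this epoch
            if errors == 0:                   # every item classified correctly
                break
        return w, b, errors_per_epoch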

If a letter is linearly separable from the others and the learning rate is small enough, you should see a downward trend in error rate until each point is classified correctly. You may wish to use Matplotlib to plot the error rate as a function of the number of epochs in order to visualize this trend.
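
A minimal Matplotlib sketch of that plot, assuming errors_per_epoch comes from the training loop above:

    import matplotlib.pyplot as plt

    plt.plot(range(1, len(errors_per_epoch) + 1), errors_per_epoch, marker="o")
    plt.xlabel("Epoch")
    plt.ylabel("Misclassified training items")
    plt.title("Training errors per epoch for one perceptron")
    plt.show()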

Q. Are all letters in the training set linearly separable?

Testing the Learned Weights
Once the perceptrons have been trained (in other words, once the perceptrons for letters that are linearly separable from the others have converged), test each trained perceptron against the letters in the test set.
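
A sketch of one possible testing routine; the perceptrons and test_set structures below are assumptions, not a required interface:

    import numpy as np

    def evaluate(perceptrons, test_set):
        """List the (perceptron letter, input letter) pairs that are misclassified.

        perceptrons: dict mapping each letter to its trained (weights, bias) pair.
        test_set: list of (letter, input_vector) pairs from the test font.
        """
        misclassified = []
        for target_letter, (w, b) in perceptrons.items():
            for letter, x in test_set:
                expected = 1 if letter == target_letter else 0
                actual = 1 if np.dot(w, x) + b >= 0 else 0
                if actual != expected:
                    misclassified.append((target_letter, letter))
        return misclassified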

Q. Are all letters in the test set correctly classified?

If any items in the test set are misclassified, compare them to the corresponding items in the training set.
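
One simple way to quantify the comparison is the Hamming distance, i.e. the number of pixels at which the test glyph and the training glyph differ (a sketch assuming both glyphs are 0/1 vectors of length 35):

    import numpy as np

    def hamming_distance(a, b):
        """Number of pixels at which two 0/1 glyph vectors differ."""
        return int(np.sum(np.asarray(a) != np.asarray(b)))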

Q. How similar are the misclassified items to the items in the training set?
