ML Assignment 3: SVM & ANN

Perform the following implementation and analysis of binary SVM classification on the provided data.
Randomly pick 80% of the data as a training set and the rest as a test set. Implement the binary SVM classifier using each of the following kernels:
Linear
Quadratic
Radial basis function
Report both the training and test set classification accuracies for the most suitable value of the regularization constant C for each of the three kernels. Choose C empirically: try several values, and treat the one that gives the best test set accuracy as the most suitable. Present the accuracies as a comparison table together with the corresponding C values. (A minimal workflow sketch is given below.)
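One possible workflow for this part is sketched below, assuming scikit-learn is used; the make_classification call is only a placeholder for loading the provided dataset, and the listed C values are arbitrary trial values, not prescribed ones.

    # Sketch of the SVM experiments with scikit-learn. The synthetic data below
    # is only a stand-in for the provided dataset.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)  # placeholder data
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)  # 80/20 split

    # The three kernels: linear, quadratic (degree-2 polynomial), and RBF.
    kernels = {
        "linear": dict(kernel="linear"),
        "quadratic": dict(kernel="poly", degree=2),
        "rbf": dict(kernel="rbf"),
    }
    C_values = [0.01, 0.1, 1, 10, 100]  # trial values, chosen empirically

    for name, params in kernels.items():
        best = None
        for C in C_values:
            clf = SVC(C=C, **params).fit(X_train, y_train)
            train_acc = clf.score(X_train, y_train)
            test_acc = clf.score(X_test, y_test)
            if best is None or test_acc > best[2]:
                best = (C, train_acc, test_acc)  # keep the C with the best test accuracy
        print(f"{name}: best C={best[0]}, train acc={best[1]:.3f}, test acc={best[2]:.3f}")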
Build an MLP classifier for the given dataset. Use the stochastic gradient descent (SGD) optimizer for all models. Determine the number of nodes in the input and output layers from the dataset and justify this choice in the report. Randomly pick 80% of the data as a training set and the rest as a test set. Use PyTorch packages.
Vary the number of hidden layers and the number of nodes in each hidden layer as follows:
0 hidden layers
1 hidden layer with 2 nodes
1 hidden layer with 6 nodes
2 hidden layers with 2 and 3 nodes respectively
2 hidden layers with 3 and 2 nodes respectively
For each architecture, vary the learning rate over 0.1, 0.01, 0.001, 0.0001, and 0.00001. Plot the results with respect to accuracy: learning rate vs. accuracy for each model, and model vs. accuracy for each learning rate. (A minimal training sketch is given below.)
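A minimal PyTorch training sketch for these experiments follows; the placeholder tensors, the 2-dimensional input / 2-class output, and the 200-epoch full-batch loop are assumptions standing in for the actual dataset and training schedule.

    # Sketch of the MLP experiments in PyTorch. Data and dimensions are placeholders.
    import torch
    import torch.nn as nn

    def make_mlp(in_dim, out_dim, hidden=()):
        # Feed-forward network; hidden=() means 0 hidden layers (a single linear map).
        layers, prev = [], in_dim
        for h in hidden:
            layers += [nn.Linear(prev, h), nn.ReLU()]
            prev = h
        layers.append(nn.Linear(prev, out_dim))
        return nn.Sequential(*layers)

    architectures = [(), (2,), (6,), (2, 3), (3, 2)]      # hidden-layer settings listed above
    learning_rates = [0.1, 0.01, 0.001, 0.0001, 0.00001]

    # Placeholder data: 2 input features, 2 classes (replace with the real dataset).
    X, y = torch.randn(500, 2), torch.randint(0, 2, (500,))
    perm = torch.randperm(len(X))                         # random 80/20 split
    split = int(0.8 * len(X))
    train_idx, test_idx = perm[:split], perm[split:]
    X_train, y_train, X_test, y_test = X[train_idx], y[train_idx], X[test_idx], y[test_idx]

    for hidden in architectures:
        for lr in learning_rates:
            model = make_mlp(in_dim=2, out_dim=2, hidden=hidden)
            opt = torch.optim.SGD(model.parameters(), lr=lr)
            loss_fn = nn.CrossEntropyLoss()
            for epoch in range(200):                      # simple full-batch training loop
                opt.zero_grad()
                loss_fn(model(X_train), y_train).backward()
                opt.step()
            with torch.no_grad():
                acc = (model(X_test).argmax(dim=1) == y_test).float().mean().item()
            print(f"hidden={hidden}, lr={lr}: test acc={acc:.3f}")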
Report the architecture and hyperparameters (including optimizer and learning rate) of the best model found, and justify why it performs best.
(Optional) Compare the execution time on CPU and GPU for each model and summarize your findings. (A rough timing sketch is given below.)
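For the optional timing comparison, one simple approach (sketched below, reusing the hypothetical make_mlp and split data from the previous snippet) is to run the same training loop on each available device and time it.

    # Rough CPU-vs-GPU timing sketch, reusing make_mlp, X_train, y_train from above.
    import time

    devices = ["cpu"] + (["cuda"] if torch.cuda.is_available() else [])
    for dev in devices:
        model = make_mlp(in_dim=2, out_dim=2, hidden=(6,)).to(dev)
        opt = torch.optim.SGD(model.parameters(), lr=0.01)
        loss_fn = nn.CrossEntropyLoss()
        Xd, yd = X_train.to(dev), y_train.to(dev)
        start = time.time()
        for epoch in range(200):
            opt.zero_grad()
            loss_fn(model(Xd), yd).backward()
            opt.step()
        if dev == "cuda":
            torch.cuda.synchronize()                      # flush queued GPU work before stopping the clock
        print(f"{dev}: {time.time() - start:.3f} s for 200 epochs")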
Compare the performance of the two classifiers (SVM and MLP).
