SMAI-Homework 8 Solved

1         Objective Question
Consider a K-class classification problem. The total number of binary classifiers trained in a one-vs-all setting is ______, and in a one-vs-one setting is ______.
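
As a quick sanity check on these counts (not part of the exercise itself), the sketch below simply enumerates the binary sub-problems each reduction induces; `count_binary_classifiers` is a hypothetical helper, not taken from the starter code.

```python
from itertools import combinations

def count_binary_classifiers(K):
    """Enumerate the binary sub-problems implied by each multi-class reduction."""
    # One-vs-all: one classifier per class, separating it from the other K - 1 classes.
    one_vs_all = [(k, "rest") for k in range(K)]
    # One-vs-one: one classifier per unordered pair of distinct classes.
    one_vs_one = list(combinations(range(K), 2))
    return len(one_vs_all), len(one_vs_one)

# With K = 10 (as in the MNIST exercise below): 10 one-vs-all classifiers,
# and 10 * 9 / 2 = 45 one-vs-one classifiers.
print(count_binary_classifiers(10))  # (10, 45)
```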

2         Objective Question
Which of the following methods do we use to best fit the data in Logistic Regression?

1.   Least Square Error

2.   Maximum Likelihood

3.   Jaccard Distance

4.   Both 1 and 2
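
As background for this question: logistic regression is fit by maximizing the Bernoulli log-likelihood of the observed labels (equivalently, minimizing the cross-entropy loss), not by least squares. With the sigmoid $g(z) = 1/(1 + e^{-z})$, a standard form of the objective is

$$\ell(\theta) = \sum_{i=1}^{n}\Big[\, y^{(i)} \log h_\theta\big(x^{(i)}\big) + \big(1 - y^{(i)}\big)\log\big(1 - h_\theta\big(x^{(i)}\big)\big)\Big], \qquad h_\theta(x) = g\big(\theta^\top x\big).$$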

3         Programming Question
In the tutorial, we saw how to build a multi-class classifier in a one-vs-all setting. In this problem, you are required to construct a one-vs-one classifier for the same task. The starter code is provided in ‘Logistic Regression Excercise 1.ipynb’.
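
A minimal sketch of the one-vs-one reduction is given below. It assumes scikit-learn's binary LogisticRegression as the base learner, whereas the starter notebook may use a hand-rolled binary classifier, so treat this only as an outline of the pairwise-training and voting scheme rather than the expected solution.

```python
from itertools import combinations

import numpy as np
from sklearn.linear_model import LogisticRegression


class OneVsOneLogistic:
    """One binary logistic classifier per unordered pair of classes, combined by voting."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = {}
        for a, b in combinations(self.classes_, 2):
            # Train on the samples of classes a and b only; label class b as 1.
            mask = (y == a) | (y == b)
            clf = LogisticRegression(max_iter=1000)
            clf.fit(X[mask], (y[mask] == b).astype(int))
            self.models_[(a, b)] = clf
        return self

    def predict(self, X):
        # Each pairwise model casts one vote per sample; the most-voted class wins.
        index = {c: i for i, c in enumerate(self.classes_)}
        votes = np.zeros((X.shape[0], len(self.classes_)), dtype=int)
        for (a, b), clf in self.models_.items():
            pred = clf.predict(X)  # 1 means class b, 0 means class a
            cols = np.where(pred == 1, index[b], index[a])
            votes[np.arange(X.shape[0]), cols] += 1
        return self.classes_[votes.argmax(axis=1)]
```

Fitting this on the training split and calling `predict` on the test split mirrors the one-vs-all interface from the tutorial, with the only difference being how the binary sub-problems are formed and combined.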

4         Subjective Question
Suppose you train a logistic regression classifier and your hypothesis function h is

$h_\theta(x) = g(\theta_0 + \theta_1 x_1 + \theta_2 x_2)$

where,

$\theta_0 = 6, \quad \theta_1 = 0, \quad \theta_2 = -1$

Draw the decision boundary for the given classifier (a rough sketch is sufficient). What would happen to the decision boundary if the coefficients of $x_1$ and $x_2$ were interchanged? Draw the decision boundary for this second case as well.
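
As a reminder of why the boundary is a straight line: with the sigmoid $g(z) = 1/(1 + e^{-z})$, the classifier predicts the positive class exactly when $g(\cdot) \ge 0.5$, i.e. when the argument of $g$ is non-negative, so the decision boundary is the set of points satisfying

$$\theta_0 + \theta_1 x_1 + \theta_2 x_2 = 0,$$

a line in the $(x_1, x_2)$ plane. Substituting the given values of $\theta_0, \theta_1, \theta_2$ (and then the interchanged values) yields the two boundaries to sketch.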


5         Programming Question
You have been given the MNIST dataset of handwritten digits. It consists of 28×28 pixel images, each showing one of the 10 digits (0, 1, ..., 9). Using logistic regression in both the one-vs-one and one-vs-all settings, construct a 10-class classifier, and report the test accuracy of each classifier. The starter code is provided in ‘Logistic Regression Excercise 2.ipynb’.
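
A minimal end-to-end sketch is shown below. It assumes scikit-learn's `fetch_openml` copy of MNIST and its `OneVsRestClassifier` / `OneVsOneClassifier` wrappers, whereas the starter notebook may load the data differently and expect the reductions to be implemented by hand; it is an illustration of the workflow, not the required solution.

```python
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier

# Load MNIST (70,000 images of 28x28 = 784 pixels) and scale pixels to [0, 1].
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10000, random_state=0)

base = LogisticRegression(max_iter=200)
for name, wrapper in [("one-vs-all", OneVsRestClassifier(base)),
                      ("one-vs-one", OneVsOneClassifier(base))]:
    # Each wrapper clones the base binary classifier for its sub-problems
    # (10 of them for one-vs-all, 45 for one-vs-one).
    wrapper.fit(X_train, y_train)
    acc = accuracy_score(y_test, wrapper.predict(X_test))
    print(f"{name} test accuracy: {acc:.4f}")
```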
