Q1 (6pts): A friend informs you that a casino is using loaded dice, such that:
P_loaded(x) = 1/12   if x ∈ {1, 2, 3}
            = 1/4    if x ∈ {4, 5, 6}
            = 0      otherwise
Q1a: What is the entropy of a roll at this casino? Please use log base 2
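Q1a can be checked numerically. The loaded-die probabilities below are reconstructed from the problem statement (1/12 on faces 1–3, 1/4 on faces 4–6, which sums to 1); a minimal sketch:

```python
from math import log2

# Reconstructed loaded-die pmf: 1/12 on faces 1-3, 1/4 on faces 4-6.
p_loaded = [1/12, 1/12, 1/12, 1/4, 1/4, 1/4]

def entropy(p):
    """Shannon entropy in bits; terms with p_i = 0 contribute nothing."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(round(entropy(p_loaded), 4))  # ≈ 2.3962 bits
```

For comparison, a fair die would give log2(6) ≈ 2.585 bits, so loading the dice lowers the entropy, as expected.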
Q1b: Imagine your friend is right, but you choose to give the casino the benefit of the doubt and assume fair dice. What's the KL divergence of fair dice (Q) from the true
distribution (P)? i.e. calculate D_KL(loaded dice || fair dice)
Please use log base 2
Q1c: Imagine you choose to believe your friend, but it turned out the casino has since switched back to fair dice. What's the KL divergence of the loaded dice (Q) from the true distribution (P)? i.e. calculate D_KL(fair dice || loaded dice)
Please use log base 2
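Both directions of the KL divergence can be checked with the same helper. The loaded-die pmf is the reconstruction used above (1/12 on faces 1–3, 1/4 on faces 4–6); note the two directions give different answers, since KL divergence is not symmetric:

```python
from math import log2

p_loaded = [1/12] * 3 + [1/4] * 3  # reconstructed loaded-die pmf
p_fair = [1/6] * 6                 # fair-die pmf

def kl(p, q):
    """D_KL(P || Q) in base-2; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(round(kl(p_loaded, p_fair), 4))  # Q1b: D_KL(loaded || fair) ≈ 0.1887
print(round(kl(p_fair, p_loaded), 4))  # Q1c: D_KL(fair || loaded) ≈ 0.2075
```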
Q2 (5pts): Given this ANN structure:
And the following parameter/function definitions:
W = [-15, -3, -2, 4, 1, 10] B = [4, 1, -0.5]
f1(x) = f2(x) = max(0.1x, x)     f3(x) = x^2
What are the intermediate and/or output values for the following data points?
Q2a: Data point: a = 0.5, c = 0.5
x′1 value:     x′2 value:
x′3 value:
Q2b: Data point: a = 1, c = 0
x′2 value:     f1 value:
f2 value:
Q2c: Data point: a = 0, c = 1
x′1 value:     x′2 value:
f3 value:
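The forward pass for Q2 can be sketched in a few lines. The network diagram is not reproduced above, so the wiring below is an assumption: inputs (a, c) feed two hidden units (leaky-ReLU activations f1, f2), which feed one output unit (f3(x) = x²), with W read in order (input→hidden weights first, then hidden→output) and B as (hidden bias 1, hidden bias 2, output bias). If the actual diagram routes the weights differently, the intermediate values will differ.

```python
# Assumed weight/bias ordering -- the original network diagram is not available.
W = [-15, -3, -2, 4, 1, 10]
B = [4, 1, -0.5]

def leaky_relu(x):
    """Hidden activation f1(x) = f2(x) = max(0.1x, x)."""
    return max(0.1 * x, x)

def forward(a, c):
    x1p = W[0] * a + W[1] * c + B[0]    # hidden pre-activation x'1 (assumed wiring)
    x2p = W[2] * a + W[3] * c + B[1]    # hidden pre-activation x'2
    f1, f2 = leaky_relu(x1p), leaky_relu(x2p)
    x3p = W[4] * f1 + W[5] * f2 + B[2]  # output pre-activation x'3
    f3 = x3p ** 2                       # output activation f3(x) = x^2
    return {"x'1": x1p, "x'2": x2p, "f1": f1, "f2": f2, "x'3": x3p, "f3": f3}

for a, c in [(0.5, 0.5), (1, 0), (0, 1)]:  # Q2a, Q2b, Q2c data points
    print((a, c), forward(a, c))
```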
Q3 (4pts): Given a test data point:
Height = 200
Weight = 200
And the training dataset in the table below, use kNN classification with k=1, k=3, and k=5 to label the test data point. Break ties by increasing k by 1.
Show your work by filling in the table and writing in the model’s class label predictions.
Class Height Weight Manhattan Distance from test sample
1 105 114
1 92 169
1 87 140
2 111 109
2 79 44
2 92 55
3 265 331
3 330 284
3 185 309
Model predictions for:
k = 1 _________ k = 3 _________ k = 5 _________
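The Q3 table can be verified programmatically. This sketch computes the Manhattan distance from the test point to each training row, then takes a majority vote over the k nearest neighbors, increasing k by 1 on a tie as the problem specifies:

```python
from collections import Counter

# Training data from the table: (class, height, weight)
TRAIN = [
    (1, 105, 114), (1, 92, 169), (1, 87, 140),
    (2, 111, 109), (2, 79, 44), (2, 92, 55),
    (3, 265, 331), (3, 330, 284), (3, 185, 309),
]

def manhattan(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

def knn_predict(test, k):
    # Rank training rows by Manhattan distance to the test point.
    ranked = sorted(TRAIN, key=lambda row: manhattan(row[1:], test))
    while True:
        votes = Counter(label for label, *_ in ranked[:k]).most_common()
        if len(votes) > 1 and votes[0][1] == votes[1][1]:
            k += 1  # tie: break it by increasing k by 1
            continue
        return votes[0][0]

test = (200, 200)
for k in (1, 3, 5):
    print(f"k = {k}: class {knn_predict(test, k)}")
```

With this data the nearest neighbor is the class-3 point (185, 309) at distance 124, but the next two (distances 139 and 173) are both class 1, so k = 3 and k = 5 flip the prediction to class 1; no ties arise.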