There is no reproducibility component to this homework, so you only need to upload this assignment to Gradescope. You do not need to submit your solution to the lab exercise since it’s not worth any points.
General instructions for homeworks: Please follow the file-upload instructions in the syllabus. Put the commands that answer each question in their own code block; any plots they produce will be embedded automatically in the output file. Each answer must be supported by written statements as well as any code used. Your code must be completely reproducible and must compile.
Commenting code: Code should be commented. See the Google R style guide (https://google.github.io/styleguide/Rguide.xml) for questions about commenting or how to write code. No late homeworks will be accepted.
1. Lab component (0 points total) Please refer to Module 2 and Lab 3 and complete Tasks 3-5. This will not be graded, as the entire solution is already posted. You will still be responsible for this material on the exam.
(a) (0) Task 3
(b) (0) Task 4
(c) (0) Task 5
2. (15 points total) The Uniform-Pareto
The goal of this problem is to continue getting more practice calculating the posterior distribution.
Suppose a < x < b. Consider the notation I(a,b)(x), where I denotes the indicator function. We define I(a,b)(x) to be the following:

I(a,b)(x) = 1 if a < x < b, and I(a,b)(x) = 0 otherwise.
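As an optional illustration (not required by the assignment), the indicator function above could be coded in R as follows; the function name indicator is just a placeholder chosen here.

# Indicator function I(a,b)(x): returns 1 if a < x < b and 0 otherwise
indicator <- function(x, a, b) {
  as.numeric(x > a & x < b)
}

indicator(0.5, a = 0, b = 1)  # returns 1
indicator(2.0, a = 0, b = 1)  # returns 0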
Let X be a random variable and let x be an observed value. Let
X = x | θ ∼ Uniform(0, θ),
θ ∼ Pareto(α, β),

where α > 0 and β > 0 are fixed, known hyperparameters. Write out the likelihood p(X = x | θ). Then calculate the posterior distribution of θ | X = x.
Hint 1: To set up this problem, write the density of the Uniform(0, θ) distribution with its indicator function, and then multiply it by the Pareto(α, β) prior density.
Hint 2: You cannot drop the indicators. You must write the indicators as functions of θ, since θ is the random variable.
Hint 3: It will end up being an updated Pareto.
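No code is required for this problem, but a quick grid approximation in R is one way to sanity-check the posterior you derive. The sketch below uses assumed illustrative values (x = 2, α = 3, β = 1, none of which come from the problem) and assumes the Pareto(α, β) density α β^α / θ^(α+1) for θ > β; if your course notes use a different Pareto parameterization, adjust accordingly. It multiplies the Uniform(0, θ) likelihood by the prior over a grid of θ values and normalizes, so the resulting curve can be compared against your closed-form answer.

# Grid approximation of the posterior p(theta | x) for the Uniform-Pareto model.
# Illustrative values only; these are NOT specified in the problem.
x     <- 2   # observed data point
alpha <- 3   # Pareto shape hyperparameter
beta  <- 1   # Pareto scale hyperparameter

theta_grid <- seq(0.01, 20, by = 0.01)

# Uniform(0, theta) likelihood at x, including the indicator that 0 < x < theta
likelihood <- (1 / theta_grid) * as.numeric(x < theta_grid)

# Assumed Pareto(alpha, beta) prior density: alpha * beta^alpha / theta^(alpha + 1), theta > beta
prior <- alpha * beta^alpha / theta_grid^(alpha + 1) * as.numeric(theta_grid > beta)

# Unnormalized posterior, then a discrete normalization over the grid
post_unnorm <- likelihood * prior
post <- post_unnorm / sum(post_unnorm * 0.01)

plot(theta_grid, post, type = "l",
     xlab = expression(theta), ylab = "approximate posterior density")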
3. (15 points total) The Bayes estimator or Bayes rule
The goal of this problem is to practice a problem similar to one we considered in Module 2, where we derived the Bayes rule under squared error loss and found that the result was the posterior mean.
(a) (5 pts) Find the Bayes estimator (or Bayes rule) when the loss function is L(θ, δ(x)) = c (θ − δ(x))², where c > 0 is a constant.
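As an optional numerical check (not a substitute for the derivation), the sketch below assumes a toy posterior, Beta(2, 5), which is not part of the problem, and minimizes the posterior expected loss for the loss c (θ − δ)² at two different values of c. Comparing the numerical minimizers with the estimator you derive is a simple way to verify your answer.

# Numerical check of a Bayes rule under the loss c * (theta - delta)^2.
# Toy posterior assumed only for illustration: theta | x ~ Beta(2, 5).
post_a <- 2
post_b <- 5

# Posterior expected loss as a function of the decision delta, for a given c
expected_loss <- function(delta, c) {
  integrand <- function(theta) c * (theta - delta)^2 * dbeta(theta, post_a, post_b)
  integrate(integrand, lower = 0, upper = 1)$value
}

# Minimize the posterior expected loss over delta for two values of c
opt_c1  <- optimize(expected_loss, interval = c(0, 1), c = 1)$minimum
opt_c10 <- optimize(expected_loss, interval = c(0, 1), c = 10)$minimum

# Compare the two numerical minimizers with the posterior mean of the toy posterior
c(opt_c1, opt_c10, post_a / (post_a + post_b))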