EECS491 Assignment 3

Exercise 1. MRFs and Image Denoising
In this problem, you will implement the image de-noising example using a Markov Random Field (MRF). This material on MRFs is covered in the textbook (Barber) in section 4.2.5. The lecture and this problem are
based on the presentation in Bishop, chapter 8.3, which is available online.
As discussed in class, the energy function for this MRF is

$$E(\mathbf{x}, \mathbf{y}) \;=\; h \sum_i x_i \;-\; \beta \sum_i \sum_{j \in \mathrm{ne}(i)} x_i x_j \;-\; \eta \sum_i x_i y_i,$$

where the binary variables $x_i \in \{-1, +1\}$ represent the unknown, noise-free image pixels, which are binary, i.e. black or white, and $\mathrm{ne}(i)$ indicates the neighbors of node $i$ (each neighboring pair is counted once). The variables $y_i \in \{-1, +1\}$ represent the observed noisy pixels, i.e. a pixel could randomly change from black ($-1$) to white ($+1$) or vice-versa.

The corresponding joint probability distribution over the variables is

$$p(\mathbf{x}, \mathbf{y}) \;=\; \frac{1}{Z} \exp\{-E(\mathbf{x}, \mathbf{y})\}.$$
1.1  Derive the equation that specifies the change in the energy equation when one variable changes state.
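For reference, a sketch of where this derivation leads (using the energy function above and writing the flipped pixel as $x_k$): only the terms that contain $x_k$ change when $x_k \to -x_k$, so

$$\Delta E \;=\; E_{\text{after}} - E_{\text{before}} \;=\; -2\,x_k \Big( h \;-\; \beta \sum_{j \in \mathrm{ne}(k)} x_j \;-\; \eta\, y_k \Big),$$

where $x_k$ is the value before the flip; the flip lowers the energy exactly when $\Delta E < 0$.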
1.2  Write a program to iteratively update the state variables (in sequential or random order) to minimize the energy (maximize the probability). Explain the design of your code.
1.3 Show that your update algorithm minimizes the energy function and converges by plotting the energy vs the number of passes through the set of pixels.
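As a point of reference for 1.2 and 1.3, here is a minimal sketch of one possible implementation (the names `denoise_icm`, `h`, `beta`, and `eta` are illustrative, not prescribed by the assignment). It performs greedy coordinate-wise minimization: each pass sweeps over the pixels, flips a pixel whenever the flip lowers the energy, and records the total energy after every pass so the convergence curve can be plotted.

```python
import numpy as np

def local_energy(x, y, i, j, h, beta, eta):
    """Terms of the energy that involve pixel (i, j); x and y hold values in {-1, +1}."""
    s = x[i, j]
    nbrs = 0.0  # sum over 4-connected neighbors, clipped at the image border
    if i > 0:               nbrs += x[i - 1, j]
    if i < x.shape[0] - 1:  nbrs += x[i + 1, j]
    if j > 0:               nbrs += x[i, j - 1]
    if j < x.shape[1] - 1:  nbrs += x[i, j + 1]
    return h * s - beta * s * nbrs - eta * s * y[i, j]

def total_energy(x, y, h, beta, eta):
    """Full energy E(x, y); each neighboring pair is counted once via array slicing."""
    e = h * x.sum() - eta * (x * y).sum()
    e -= beta * (x[:-1, :] * x[1:, :]).sum()   # vertical neighbor pairs
    e -= beta * (x[:, :-1] * x[:, 1:]).sum()   # horizontal neighbor pairs
    return e

def denoise_icm(y, h=0.0, beta=1.0, eta=2.0, max_passes=20):
    """Greedy coordinate-wise minimization of the MRF energy, starting from the observation."""
    x = y.copy()
    energies = [total_energy(x, y, h, beta, eta)]
    for _ in range(max_passes):
        changed = False
        for i in range(x.shape[0]):
            for j in range(x.shape[1]):
                e_old = local_energy(x, y, i, j, h, beta, eta)
                x[i, j] *= -1                      # tentatively flip the pixel
                if local_energy(x, y, i, j, h, beta, eta) >= e_old:
                    x[i, j] *= -1                  # revert: the flip did not lower the energy
                else:
                    changed = True
        energies.append(total_energy(x, y, h, beta, eta))
        if not changed:                            # no pixel changed during a full pass
            break
    return x, energies
```

Because every accepted flip strictly lowers the energy and the energy is bounded below, plotting the returned `energies` against the pass index gives the monotone convergence curve asked for in 1.3.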
1.4  Illustrate the model by showing the state of the denoised image at three points: at the start before updating, when it is about 50% converged in terms of energy minimization, and at the end when it
converges. Choose images that aren't too high resolution so that the individual pixels are visible as squares. You may also do a live plot in a notebook to show it updating continuously, but make sure you have
the static plots too in case the dynamic plot has portability issues.
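One way to produce the static snapshots (a sketch assuming the `denoise_icm` code above, with `x_start`, `x_half`, and `x_final` being hypothetical copies of `x` saved at the start, at roughly 50% of the total energy decrease, and at convergence):

```python
import matplotlib.pyplot as plt

def show_snapshots(snapshots, titles):
    """Display a few low-resolution binary images side by side with visible square pixels."""
    fig, axes = plt.subplots(1, len(snapshots), figsize=(4 * len(snapshots), 4))
    for ax, img, title in zip(axes, snapshots, titles):
        ax.imshow(img, cmap="gray", vmin=-1, vmax=1, interpolation="nearest")
        ax.set_title(title)
        ax.axis("off")
    plt.show()

# show_snapshots([x_start, x_half, x_final], ["start", "~50% converged", "converged"])
```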
1.5  Experiment with different settings of the energy equation parameters and explain your results in terms of their effect on the energy equation.
1.6  Generalize the energy equation so that the model better captures different structure. Explain your rationale behind this new model (i.e. the terms in the equation). Illustrate it with denoising examples
(other types of images) that are not well-handled by the previous model.
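As one illustrative (assumed, not prescribed) direction for 1.6, the pairwise structure can be enriched, for example with diagonal-neighbor couplings for images containing thin diagonal strokes, or a spatially varying bias for images whose black/white proportions differ across regions:

$$E(\mathbf{x}, \mathbf{y}) \;=\; \sum_i h_i x_i \;-\; \beta \sum_{\langle i,j \rangle} x_i x_j \;-\; \beta_d \sum_{\langle i,j \rangle_{\mathrm{diag}}} x_i x_j \;-\; \eta \sum_i x_i y_i,$$

where the second sum runs over horizontally/vertically adjacent pairs and the third over diagonally adjacent pairs.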
Exercise 2. Graphical Representation 
2.1  For the Bayesian network shown above, draw the corresponding Markov Random Field (MRF), and write out the joint probability using potential functions. You do not need to specify the functions
themselves, only which arguments they take. What are the potential functions in terms of the Bayes net?
2.2  Now specify the Bayes net as a factor graph. Again write the expression for the joint probability, but using factor functions.
2.3  Express the following Bayes net (from the sprinkler example) in two different factor graphs. For each network, write the factors as a function of the conditional probabilities and specify the joint
probability.
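For reference (assuming the standard sprinkler network with variables Cloudy $C$, Sprinkler $S$, Rain $R$, and WetGrass $W$, since the figure is not reproduced here), the joint factorizes as $p(C, S, R, W) = p(C)\,p(S|C)\,p(R|C)\,p(W|S, R)$. One factor graph uses one factor per conditional, e.g. $f_1(C) = p(C)$, $f_2(C, S) = p(S|C)$, $f_3(C, R) = p(R|C)$, $f_4(S, R, W) = p(W|S, R)$, while a second groups several conditionals into a single factor, e.g. $g_1(C, S, R) = p(C)\,p(S|C)\,p(R|C)$ and $g_2(S, R, W) = p(W|S, R)$; both represent the same joint probability.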
Exercise 3. The Sum Product Algorithm 
Consider the following factor graph.
3.1  Apply the sum-product algorithm to compute all messages when none of the variables are known. In your answers, you do not need to substitute in the values of other messages, i.e. your answers
should be in terms of local factors and other messages.
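For reference, the general message definitions (as in Bishop 8.4, where $x$ denotes a variable node, $f$ a factor node, and $\mathrm{ne}(\cdot)$ the neighbors of a node in the factor graph) are

$$\mu_{x \to f}(x) \;=\; \prod_{g \in \mathrm{ne}(x) \setminus \{f\}} \mu_{g \to x}(x), \qquad
\mu_{f \to x}(x) \;=\; \sum_{x_1, \ldots, x_M} f(x, x_1, \ldots, x_M) \prod_{x_m \in \mathrm{ne}(f) \setminus \{x\}} \mu_{x_m \to f}(x_m),$$

and the marginal at a variable node is proportional to the product of its incoming messages, $p(x) \propto \prod_{f \in \mathrm{ne}(x)} \mu_{f \to x}(x)$.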
3.2  Compute the marginal probability, expressing it in terms of the messages you derived in the previous question.
3.3  Verify that the marginal is the correct expression by substituting in the message definitions.
Now consider adding a loop to the graph.
3.4  Explore the consequences of applying the sum-product algorithm to this graph. Can the algorithm still be applied?
Exploration 
Like in previous assignments, in this exercise you are meant to do creative exploration. You don't need to write a book chapter, but the intention is for you to go beyond what's been covered above or explore a new
topic altogether.
One suggestion: Select a probabilistic programming package and work through and explain in your own words an example of interest to you from the documentation or tutorial. Many of the simple tutorials will
cover models similar to what we have discussed in class. Be sure your explanation is from your own perspective and understanding; do not simply copy the tutorial. You should add your own unique variations and
modifications to better present the ideas or other things you found interesting. The purpose of this exercise is simply to make you aware of some of the modern tools that are available for probabilistic modeling.
There are several packages available to choose from, and they are all under active development, as this is an active field of research. In Python, popular packages are PyMC3 and TensorFlow Probability. In Julia,
popular packages are Gen and Turing. There are many other choices, so feel free to choose something else if you find it interesting.
