
Markov Chain

The questions below are due on Sunday April 28, 2019; 11:00:00 PM.
 

In this problem, we will define several methods that operate on instances of the class MarkovChain, which represent Markov chains.

The MarkovChain class has an __init__ method that takes two arguments: an instance of dist.DDist representing the distribution over states at time 0, \Pr(S_0), and a function representing the transition model \Pr(S_{t+1}~|~S_t) (that is, a function that maps a state to a DDist over successor states). We have provided this method for you.
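
For example, a toy chain could be constructed as follows. This is a hypothetical illustration only; it assumes DDist can be built from a dictionary mapping states to probabilities (with DDist imported from lib601.dist as described below), and the names start_dist, flip_transition, and mc are made up for this sketch.

# Start in state 'a' with probability 1; from any state, move to 'a' or 'b' with equal probability.
start_dist = DDist({'a': 1.0})

def flip_transition(state):
    return DDist({'a': 0.5, 'b': 0.5})

mc = MarkovChain(start_dist, flip_transition)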

Define two methods that operate on a Markov chain:

  • state_sequence_prob(seq) should take as its argument a list of states starting at time 0, and should return the probability that the system's states at times 0, 1, 2, ... will form the sequence seq (see the first identity sketched after this list).
  • occupation_dist(T) should take as its argument a time T, and should return an instance of DDist representing the distribution over states at that time, \Pr(S_T) (see the second identity below). Hint: What should the method return if T is 0?
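
Two identities may be useful here (a sketch using the Markov property and the law of total probability; s_0, ..., s_k denote the entries of seq and s' ranges over the possible states):

\Pr(S_0 = s_0, S_1 = s_1, \ldots, S_k = s_k) = \Pr(S_0 = s_0) \prod_{t=0}^{k-1} \Pr(S_{t+1} = s_{t+1} ~|~ S_t = s_t)

\Pr(S_T = s) = \sum_{s'} \Pr(S_{T-1} = s') \, \Pr(S_T = s ~|~ S_{T-1} = s')

The first expresses the probability of an entire state sequence as the start probability times the one-step transition probabilities; the second says the occupation distribution at time T can be obtained from the one at time T-1 by pushing it through the transition model, so occupation_dist(T) can start from \Pr(S_0) and apply this update T times.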

Enter your definitions for these procedures below. You may assume that all classes and functions from the lib601.dist module have been defined for you with:

from lib601.dist import *

Code Skeleton
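The actual skeleton file is only available when logged in. Below is one possible sketch of the full class, not the official solution: it assumes DDist can be constructed from a dictionary of state-to-probability entries and provides prob(elt) and support() methods, and the attribute names start_dist and transition_model are assumptions for the values passed to __init__.

from lib601.dist import *

class MarkovChain:
    def __init__(self, start_dist, transition_model):
        self.start_dist = start_dist              # DDist over states at time 0, Pr(S_0)
        self.transition_model = transition_model  # function: state -> DDist over next states

    def state_sequence_prob(self, seq):
        # Chain rule: Pr(S_0 = seq[0]) times the one-step transition probabilities.
        if len(seq) == 0:
            return 1.0
        p = self.start_dist.prob(seq[0])
        for s, s_next in zip(seq, seq[1:]):
            p *= self.transition_model(s).prob(s_next)
        return p

    def occupation_dist(self, T):
        # At T = 0 this is just the start distribution; otherwise push it
        # through the transition model T times (law of total probability).
        d = self.start_dist
        for _ in range(T):
            new_probs = {}
            for s_prev in d.support():
                step = self.transition_model(s_prev)
                for s in step.support():
                    new_probs[s] = new_probs.get(s, 0.0) + d.prob(s_prev) * step.prob(s)
            d = DDist(new_probs)
        return d

With the toy chain constructed earlier, mc.state_sequence_prob(['a', 'b']) would be 0.5, mc.occupation_dist(0) would return the start distribution, and mc.occupation_dist(1) would assign probability 0.5 to each of 'a' and 'b'.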