Posted by: ardaayvaz | October 18, 2008

Markov Chains

Since many people are posting interesting entries about probability, I thought it might be interesting to introduce the topic of Markov Chains (the discrete-time version).

Markov Chains take their name from Andrey Markov, who first studied these stochastic processes and, according to Joe Blitzstein, first applied them to guess which letters are most likely to come after each other by analyzing existing Chekhov stories.

A Markov Chain basically uses the Markov property: given the present state, future states are independent of the past states.
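
To make the Markov property concrete, here is a minimal sketch in Python of a two-state chain with made-up "weather" states and invented transition probabilities. Notice that the next state is drawn using only the current state.

```python
import random

# Hypothetical two-state chain with made-up transition probabilities.
# From each state, we list (next state, probability) pairs.
transition = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in transition[state]:
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
```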

Therefore, for a given set of states, we have state transition probabilities at each time step. If there are n states, we can represent these probabilities as an n-by-n transition matrix. And by using matrices, we can reduce complex stochastic problems to simple linear algebra problems.
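
In code, the matrix view looks something like the sketch below (reusing the same invented two-state example): the k-step transition probabilities are just the k-th power of the matrix.

```python
import numpy as np

# Transition matrix for the hypothetical two-state chain above.
# Row i holds the probabilities of moving from state i to each state.
P = np.array([[0.8, 0.2],    # from "sunny"
              [0.4, 0.6]])   # from "rainy"

# Every row of a transition matrix sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Probability of going from state i to state j in k steps is entry (i, j) of P^k.
k = 5
print(np.linalg.matrix_power(P, k))
```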

I will skip the technical explanation of Markov Chains, because I can’t type mathematical notation here, but I can refer you to a couple of websites where you can read more about them.

http://en.wikipedia.org/wiki/Markov_chain

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf

PlanetMath and Wolfram MathWorld also have brief explanations.

At a basic level it is a very simple mathematical concept, and it lets us solve problems in a very elegant way. Google’s PageRank is basically a large-scale Markov Chain (a rough sketch of the idea is below). Markov Chains are also used extensively in finance, economics, physics, and other fields.
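
The real PageRank is more involved, but a toy sketch of the basic idea, with a made-up four-page link graph and the usual 0.85 damping factor, looks roughly like this: build a transition matrix for a "random surfer" and iterate until the distribution over pages settles down.

```python
import numpy as np

# Made-up link graph: links[i] lists the pages that page i links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = 4
damping = 0.85  # the usual damping factor

# Random-surfer transition matrix: from page i, follow a random outgoing link.
P = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        P[i, j] = 1.0 / len(outs)

# With probability 1 - damping, jump to a uniformly random page instead.
G = damping * P + (1 - damping) * np.ones((n, n)) / n

# Power iteration: the stationary distribution of this chain is the ranking.
rank = np.ones(n) / n
for _ in range(100):
    rank = rank @ G

print(rank)  # pages linked to by "important" pages end up with higher scores
```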

Another good example of Markov Chains is the Gambler’s Ruin problem, which is a simple random walk: at each step, given that we are in state i, we move to state i-1 or i+1 with certain probabilities. The transition matrix therefore has nonzero entries only for these neighboring transitions, and the rest are 0. We can find the probability of going from state i to state j in a given number of steps by raising the transition matrix to that power and reading off the corresponding entry.
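
As a sketch, here is the Gambler’s Ruin transition matrix for a total fortune of N = 5 and a fair bet with p = 0.5 (both values chosen purely for illustration); states 0 and N are absorbing.

```python
import numpy as np

# Gambler's Ruin with total fortune N and win probability p per bet
# (both values are just illustrative).
N, p = 5, 0.5

# States 0..N; states 0 (ruin) and N (victory) are absorbing.
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0
P[N, N] = 1.0
for i in range(1, N):
    P[i, i + 1] = p        # win the bet: move from i to i + 1
    P[i, i - 1] = 1 - p    # lose the bet: move from i to i - 1

# (P^k)[i, j] is the probability of being in state j after k steps from state i.
k = 20
P_k = np.linalg.matrix_power(P, k)
print(P_k[2, 0])  # probability of having been ruined within 20 steps, starting at 2
```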

Again, it is hard to explain all of this in mathematical detail in a blog post, but I really think this is a fascinating part of probability theory, so definitely take a look at the links I provided.


Responses

  1. Great! I explain Google’s PageRank when I teach linear algebra, because Markov chains are one of the important applications of that subject.

    By the way, it’s fine to leave the details to Mathworld or Planet Math, since it would take a long time, but you can type mathematics into these blog posts! Click on LaTeX at the top of the blog (one of the static pages) to see how.

  2. I love Markov chains. They are adorable and versatile and you can do pretty much anything with them – stocks, gambles, frisky reproductive rabbits, computer life, nuclear reactions, neurotransmitters and other drugs…

    In fact, if your back is up against the wall and you discover that you find yourself during Reading Period with exactly 42 hours left to write (from scratch) your Math Modeling paper which falls under one of the requirements for Applied Math concentrators at Harvard (not that this EVER happens, of course), a Markov chain model is the easiest thing you can spin out of thin air! An absolute miracle worker.

    Just thought I’d get that out in the open. Please don’t tell Prof. Fudenberg.

