Uncertainty - The Markov Chain
A Markov chain is a mathematical system, usually defined as a collection of random variables, that transitions from one state to another according to certain probabilistic rules. A Markov chain is a random process with the Markov property. A random process, often called a stochastic process, is a mathematical object defined as a collection of random variables. A Markov chain has either a discrete state space (the set of possible values of the random variables) or a discrete index set (often representing time) - given this, many variations of a Markov chain exist. Usually the term "Markov chain" is reserved for a process with a discrete set of times, that is, a Discrete Time Markov Chain (DTMC).
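The Markov property described above can be stated precisely: the distribution of the next state depends only on the current state, not on the earlier history. In symbols (using a generic DTMC with states \(X_0, X_1, X_2, \dots\)):

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]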
In this blog, we will discover when you can use Markov chains. I am Happy Khatun, a student of City University. This blog is a part of our AI (Artificial Intelligence) Lab Course conducted by our most honourable teacher Nuruzzaman Faruqui.
Here we will implement and discuss a Markov chain model to predict the weather.
For this, we will need a transition model like the following:
It specifies the probability distribution of the next event based on the possible values of the current event. From this model we will implement a Markov model to predict tomorrow's weather based on today's weather. Here we can see that the probability of tomorrow being sunny, given that today is sunny, is 0.8. This is reasonable because a sunny day is likely to be followed by another sunny day.
However, if it is rainy today, the probability of rain tomorrow is 0.7, since rainy days are more likely to follow each other. If we draw these probabilities as a Markov chain, the chain looks like this:
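Before turning to the library code, the transition model above can also be used directly by hand: multiplying today's weather distribution by the transition matrix gives tomorrow's distribution, and repeating the multiplication looks further ahead. The following sketch (plain Python, no libraries, written just for this blog) computes the weather distribution two days after a sunny day:

```python
# Transition matrix from the model above.
# Row = today's state, column = tomorrow's state, order: [sun, rain]
P = [
    [0.8, 0.2],  # today sun  -> P(sun), P(rain)
    [0.3, 0.7],  # today rain -> P(sun), P(rain)
]

def step(dist, P):
    """Advance the chain one day: tomorrow's distribution from today's."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

today = [1.0, 0.0]            # it is sunny today
tomorrow = step(today, P)     # [0.8, 0.2]
day_after = step(tomorrow, P) # [0.7, 0.3]
print(day_after)
```

So even starting from a certain sunny day, the chance of sun two days later drops to 0.7; iterating `step` further would converge toward the chain's long-run distribution.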
According to these probabilities, we will now write Python code to implement the Markov chain.
from pomegranate import *

# Define starting probabilities
start = DiscreteDistribution({
    "sun": 0.5,
    "rain": 0.5
})

# Define transition model
transitions = ConditionalProbabilityTable([
    ["sun", "sun", 0.8],
    ["sun", "rain", 0.2],
    ["rain", "sun", 0.3],
    ["rain", "rain", 0.7]
], [start])

# Create Markov chain
model = MarkovChain([start, transitions])

# Sample 50 states from the chain
print(model.sample(50))
After running this code in PyCharm, we get the following result.
We can sample as many states as we want; we just have to change the number passed to sample().
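The code above relies on pomegranate's older API (DiscreteDistribution, ConditionalProbabilityTable, MarkovChain), which may not be available in newer releases of the library. As a fallback, the same sampling can be sketched in plain Python with the standard library's random module; the state names and probabilities below are taken from the model in this blog, while the helper function sample_chain is just an illustrative name:

```python
import random

# Transition probabilities from the blog's model
transitions = {
    "sun":  {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def sample_chain(n, start_probs, transitions, seed=None):
    """Sample n states from the Markov chain defined by the tables above."""
    rng = random.Random(seed)
    # Draw the initial state from the starting distribution
    state = rng.choices(list(start_probs), weights=list(start_probs.values()))[0]
    states = [state]
    # Each subsequent state depends only on the previous one (Markov property)
    for _ in range(n - 1):
        nxt = transitions[state]
        state = rng.choices(list(nxt), weights=list(nxt.values()))[0]
        states.append(state)
    return states

print(sample_chain(50, {"sun": 0.5, "rain": 0.5}, transitions, seed=0))
```

Fixing the seed makes the sample reproducible; dropping it gives a fresh random sequence on every run, like model.sample(50) does.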
Markov chains have prolific usage in mathematics. They are widely employed in economics, game theory, communication theory, genetics and finance. They arise broadly in statistical, especially Bayesian, and information-theoretical contexts. When it comes to real-world problems, they are used to postulate solutions for studying cruise control systems in motor vehicles, queues of customers arriving at an airport, exchange rates of currencies, etc. The PageRank algorithm, originally proposed for the internet search engine Google, is based on a Markov process.


