Is a finite state machine just an implementation of a Markov chain? What are the differences between the two?
- You may think of a Markov chain as an FSM in which transitions are probability-driven – Dr. belisarius Feb 02 '11 at 21:54
- Please read these papers: "Links between Probabilistic Automata and Hidden Markov Models" (by Pierre Dupont) http://www.info.ucl.ac.be/~pdupont/pdupont/pdf/HMM_PA_pres_n4.pdf and "Hidden Markov Models and other Finite State Automata for Sequence Processing" (The Handbook of Brain Theory and Neural Networks) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.85.3344&rep=rep1&type=pdf – Juan Carlos Kuri Pinto Apr 18 '14 at 07:57
6 Answers
Markov chains can be represented by finite state machines. The idea is that a Markov chain describes a process in which the transition to a state at time t+1 depends only on the state at time t. The main thing to keep in mind is that the transitions in a Markov chain are probabilistic rather than deterministic, which means that you can't always say with perfect certainty what will happen at time t+1.
The Wikipedia article on finite-state machines has a subsection on finite Markov-chain processes; I'd recommend reading it for more information. Also, the Wikipedia article on Markov chains has a brief sentence describing the use of finite state machines to represent a Markov chain. It states:
A finite state machine can be used as a representation of a Markov chain. Assuming a sequence of independent and identically distributed input signals (for example, symbols from a binary alphabet chosen by coin tosses), if the machine is in state y at time n, then the probability that it moves to state x at time n + 1 depends only on the current state.
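The idea above can be sketched in a few lines of Python. This is a minimal illustration, not a library API: the two "weather" states and their probabilities are invented for the example, and each step samples the next state from a distribution that depends only on the current state.

```python
import random

# Hypothetical two-state "weather" chain (names and numbers are made up):
# the distribution over the next state depends only on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state from the current state's transition distribution."""
    dist = TRANSITIONS[state]
    return random.choices(list(dist), weights=list(dist.values()))[0]

# A short sample trajectory. Rerunning it can give a different path:
# that is exactly the "probabilistic rather than deterministic" point.
state = "sunny"
trajectory = [state]
for _ in range(5):
    state = step(state)
    trajectory.append(state)
```

The state diagram of this chain is an ordinary two-state FSM; the only difference is that each outgoing edge carries a probability instead of an input symbol.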

- Actually, what you are claiming here about a Markov chain is not 100% correct. What you are referring to is a first-order Markov process. For a second-order Markov process, the next state depends on the states of the latest two time steps, and so on. A state machine is a special case of a Markov chain, since a Markov chain is stochastic in nature, whereas a state machine, as far as I know, is deterministic. – A. Isaac Feb 23 '13 at 19:32
- Unqualified, the term Markov chain means a discrete-time stochastic process with the Markov property, which means that it doesn't depend on past states. The original poster didn't ask about higher-order Markov processes, so they aren't really relevant here. "Finite state machine" is generally a catch-all term for finite automata, which can be either deterministic or non-deterministic in nature. – Tim Seguine Mar 08 '13 at 16:10
Whilst a Markov chain is a finite state machine, it is distinguished by its transitions being stochastic, i.e. random, and described by probabilities.

The two are similar, but the other explanations here are slightly wrong. Only FINITE Markov chains can be represented by an FSM: Markov chains allow for an infinite state space. As was pointed out, the transitions of a Markov chain are described by probabilities, but it is also important to mention that the transition probabilities can only depend on the current state. Without this restriction, it would just be called a "discrete time stochastic process".
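The infinite-state point is easy to see with a toy example. The sketch below (illustrative only) is a simple random walk on the integers: the transition law depends only on the current state, so it is a perfectly good Markov chain, yet its state space is countably infinite, so no finite state machine can enumerate its states.

```python
import random

def walk_step(n):
    """One step of a simple random walk on the integers: from state n,
    move to n - 1 or n + 1 with probability 1/2 each. The transition law
    depends only on the current state (Markov property), but the state
    space is countably infinite, so no FSM can represent this chain."""
    return n + random.choice([-1, 1])

position = 0
for _ in range(100):
    position = walk_step(position)
```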

- @Michael I might be wrong, because I have been away from this topic for a while, but I thought "stationary" was about time dependence. I might be mistaken, but that seems orthogonal. – Tim Seguine Jul 03 '18 at 20:33
- "Process" is typically used for the continuous-time version of the term "chain" (reference: Probability Theory: A Concise Course, Rozanov), and an FSM can be [infinite](https://en.wikipedia.org/wiki/Transition_system) or [event-driven](https://en.wikipedia.org/wiki/Event-driven_finite-state_machine) or [non-deterministic](https://en.wikipedia.org/wiki/Nondeterministic_finite_automaton). The only other dependence I can imagine besides state would be time. – Michael Tamillow Jul 05 '18 at 01:52
- @Michael "process" is a generic term; it can be continuous time or discrete time. An FSM can't be infinite: it has the word *finite* in its name. The link you provided even says that it is not a finite state machine. You brought up time dependence, not me, but in discrete-time processes the sequence index is normally considered the time. In that sense, yes, a discrete-time stochastic process would be non-stationary, but that is not descriptive enough, since it could also potentially be continuous time. I was going for a superset, not the complement, in my naming. – Tim Seguine Jul 05 '18 at 07:23
I believe this should answer your question:
https://en.wikipedia.org/wiki/Probabilistic_automaton
And you are on to the right idea - they are almost the same: subsets, supersets, and modifications, depending on which adjective describes the chain or the automaton. Automata typically take an input as well, but I am sure there have been papers utilizing "Markov chains" with inputs.
Think Gaussian distribution vs. normal distribution - same idea, different fields. Automata belong to computer science; Markov belongs to probability and statistics.
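The input-taking distinction can be sketched concretely. In the toy probabilistic automaton below (state names, alphabet, and probabilities are all invented for illustration), each step consumes an input symbol, and the pair (state, symbol) selects a distribution over next states; a Markov chain, by contrast, has no input at all.

```python
import random

# Hypothetical probabilistic automaton over the input alphabet {"a", "b"}:
# each (state, input symbol) pair maps to a distribution over next states.
PA = {
    ("q0", "a"): {"q0": 0.5, "q1": 0.5},
    ("q0", "b"): {"q0": 1.0},
    ("q1", "a"): {"q1": 1.0},
    ("q1", "b"): {"q0": 0.2, "q1": 0.8},
}

def pa_step(state, symbol):
    """Consume one input symbol and sample the next state."""
    dist = PA[(state, symbol)]
    return random.choices(list(dist), weights=list(dist.values()))[0]

def pa_run(state, word):
    """Run the automaton over a whole input word."""
    for symbol in word:
        state = pa_step(state, symbol)
    return state
```

Fixing a single input distribution (e.g. feeding i.i.d. random symbols) collapses this back into an ordinary Markov chain over the states, which is the construction the Wikipedia quote above describes.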

I think most of the answers are not appropriate. A Markov process is generated by a (probabilistic) finite state machine, but not every process generated by a probabilistic finite state machine is a Markov process. E.g. Hidden Markov Processes are basically the same as processes generated by probabilistic finite state machines, but not every Hidden Markov Process is a Markov Process.

Leaving the inner working details aside, a finite state machine is like a plain value, while a Markov chain is like a random variable (probability added on top of the plain value). So the answer to the original question is no, they are not the same. In the probabilistic sense, a Markov chain is an extension of a finite state machine.
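The "plain value vs. random variable" contrast can be made concrete. In the sketch below (the turnstile-style states, inputs, and probabilities are invented for illustration), the FSM's next state is a fixed function of the current state and input, while the Markov chain's next state is sampled from a distribution over states.

```python
import random

# Deterministic FSM: the next state is a plain value, fully determined
# by (current state, input symbol).
FSM_DELTA = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
    ("unlocked", "push"): "locked",
}

def fsm_step(state, symbol):
    """Look up the unique next state: same inputs, same answer, every time."""
    return FSM_DELTA[(state, symbol)]

# Markov chain over the same states: the next state is a random variable
# whose distribution depends only on the current state.
MC = {
    "locked": {"locked": 0.7, "unlocked": 0.3},
    "unlocked": {"locked": 0.5, "unlocked": 0.5},
}

def mc_step(state):
    """Sample the next state; repeated calls can give different answers."""
    dist = MC[state]
    return random.choices(list(dist), weights=list(dist.values()))[0]
```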
