Questions tagged [markov-chains]

Markov chains are systems which transition from one state to another based only upon their current state. They are used widely in various statistical domains to generate sequences based upon probabilities.

Markov chains (named after the mathematician Andrey Markov) are systems which transition from one state to another based only upon their current state. They are memoryless stochastic processes in which each state change has an associated probability.
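As an illustration, here is a minimal Python sketch of such a process: a hypothetical two-state weather chain in which the next state is drawn using only the probabilities attached to the current state.

    import random

    # Hypothetical two-state chain: the next state depends only on the current
    # one, and each transition carries an associated probability.
    transitions = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def step(state):
        next_states, weights = zip(*transitions[state])
        return random.choices(next_states, weights=weights)[0]

    state = "sunny"
    walk = [state]
    for _ in range(10):
        state = step(state)
        walk.append(state)
    print(walk)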

Due to their statistical nature, Markov chains are suitable for simulating complex real-life processes whose transition probabilities are well known. They are used in a wide variety of fields, with uses too in-depth to list here; an exhaustive list can be found on the associated Wikipedia page.

In programming, they are especially popular for working with human languages: Markov text generators are among the most common applications of Markov chains.
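A Markov text generator of this kind is typically built by mapping each word to the words observed to follow it, then walking that table at random. A minimal sketch (the training sentence below is made up for illustration):

    import random
    from collections import defaultdict

    def build_table(text):
        # Map each word to the list of words observed to follow it.
        table = defaultdict(list)
        words = text.split()
        for current, nxt in zip(words, words[1:]):
            table[current].append(nxt)
        return table

    def generate(table, start, length=10):
        word, output = start, [start]
        for _ in range(length - 1):
            followers = table.get(word)
            if not followers:
                break
            word = random.choice(followers)
            output.append(word)
        return " ".join(output)

    table = build_table("the cat sat on the mat and the cat slept on the sofa")
    print(generate(table, "the"))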

577 questions
96
votes
6 answers

Is a Markov chain the same as a finite state machine?

Is a finite state machine just an implementation of a Markov chain? What are the differences between the two?
Carson
  • 17,073
  • 19
  • 66
  • 87
75
votes
3 answers

How do Markov Chain Chatbots work?

I was thinking of creating a chatbot using something like Markov chains, but I'm not entirely sure how to get it to work. From what I understand, you build a table from the data that maps a given word to the words which follow it. Is it possible to attach…
Jordan
  • 9,014
  • 8
  • 37
  • 47
62
votes
5 answers

What is the difference between Markov chains and hidden Markov models?

What is the difference between Markov chain models and hidden Markov models? I've read the Wikipedia articles, but couldn't understand the differences.
good_evening
  • 21,085
  • 65
  • 193
  • 298
57
votes
2 answers

Issues implementing the "Wave Collapse Function" algorithm in Python

In a nutshell: My implementation of the Wave Collapse Function algorithm in Python 2.7 is flawed, but I'm unable to identify where the problem is located. I need help finding out what I might be missing or doing wrong. What is the Wave…
solub
  • 1,291
  • 17
  • 40
24
votes
5 answers

Generating Markov transition matrix in Python

Imagine I have a series of 4 possible Markovian states (A, B, C, D): X = [A, B, B, C, B, A, D, D, A, B, A, D, ....] How can I generate a Markov transition matrix using Python? The matrix must be 4 by 4, showing the probability of moving from…
st19297
  • 521
  • 1
  • 5
  • 18
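A minimal sketch of the counting approach the question above describes, assuming numpy and using the example sequence from the excerpt: count the observed transitions between consecutive states, then normalise each row to probabilities.

    import numpy as np

    states = ["A", "B", "C", "D"]
    X = ["A", "B", "B", "C", "B", "A", "D", "D", "A", "B", "A", "D"]

    index = {s: i for i, s in enumerate(states)}
    counts = np.zeros((4, 4))
    for current, nxt in zip(X, X[1:]):
        counts[index[current], index[nxt]] += 1

    # Normalise each row so it sums to 1; rows with no observations stay zero.
    row_sums = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    print(P)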
23
votes
1 answer

When to use a certain Reinforcement Learning algorithm?

I'm studying Reinforcement Learning and reading Sutton's book for a university course. Besides the classic DP, MC, TD and Q-Learning algorithms, I'm reading about policy gradient methods and genetic algorithms for the resolution of decision…
20
votes
3 answers

Simple random english sentence generator

I need a simple random English sentence generator. I need to populate it with my own words, but it needs to be capable of making longer sentences that at least follow the rules of English, even if they don't make sense. I expect there are millions…
Adam Davis
  • 91,931
  • 60
  • 264
  • 330
20
votes
3 answers

Using Markov chains (or something similar) to produce an IRC-bot

I tried Google and found little that I could understand. I understand Markov chains at a very basic level: it's a mathematical model that depends only on the previous input to change states, so it's sort of an FSM with weighted random chances instead of…
PrettyPrincessKitty FS
  • 6,117
  • 5
  • 36
  • 51
20
votes
1 answer

What is a chain in PyMC3?

I am learning PyMC3 for Bayesian modeling. You can create a model and sample with:

    import pandas as pd
    import pymc3 as pm
    # obs is a DataFrame with a single column, containing
    # the observed values for variable height
    obs = pd.DataFrame(...)
    # we…
gc5
  • 9,468
  • 24
  • 90
  • 151
20
votes
2 answers

Can an author's unique "literary style" be used to identify him/her as the author of a text?

Let's imagine I have two English-language texts written by the same person. Is it possible to apply some Markov chain algorithm to analyse each: create some kind of fingerprint based on statistical data, and compare the fingerprints obtained from…
user313885
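One way to approach the fingerprinting idea in the question above (an illustration only, not a validated authorship-attribution method) is to treat each text's word-bigram counts as its fingerprint and compare the two with cosine similarity; the sample texts below are made up.

    import math
    from collections import Counter

    def bigram_profile(text):
        # Counts of consecutive word pairs act as a crude stylistic fingerprint.
        words = text.lower().split()
        return Counter(zip(words, words[1:]))

    def cosine_similarity(p, q):
        dot = sum(p[k] * q[k] for k in set(p) & set(q))
        norm = math.sqrt(sum(v * v for v in p.values())) * \
               math.sqrt(sum(v * v for v in q.values()))
        return dot / norm if norm else 0.0

    profile_a = bigram_profile("it was the best of times it was the worst of times")
    profile_b = bigram_profile("it was the age of wisdom it was the age of foolishness")
    print(cosine_similarity(profile_a, profile_b))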
18
votes
14 answers

Any business examples of using Markov chains?

What business cases are there for using Markov chains? I've seen the toy example of a Markov chain applied to someone's blog to write a fake post. I'd like some practical examples, though, e.g. ones useful in business or for prediction of stock…
torial
  • 13,085
  • 9
  • 62
  • 89
16
votes
1 answer

Decoding sequences in a GaussianHMM

I'm playing around with Hidden Markov Models for a stock market prediction problem. My data matrix contains various features for a particular security:

    01-01-2001, .025, .012, .01
    01-02-2001, -.005, -.023, .02

I fit a simple GaussianHMM: from…
datasci
  • 1,019
  • 2
  • 12
  • 29
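A minimal sketch of fitting and decoding along the lines of the question above, assuming the hmmlearn library and using random data in place of the security's real feature matrix:

    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    # Stand-in feature matrix: one row per day, one column per feature.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))

    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100)
    model.fit(X)

    # Viterbi decoding: the most likely hidden state for each observation.
    hidden_states = model.predict(X)
    print(hidden_states[:20])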
15
votes
4 answers

Directed probability graph - algorithm to reduce cycles?

Consider a directed graph which is traversed from first node 1 to some final nodes (which have no more outgoing edges). Each edge in the graph has a probability associated with it. Summing up the probabilities to take each possible path towards all…
Kagaratsch
  • 963
  • 8
  • 18
15
votes
2 answers

How can I make a discrete state Markov model with pymc?

I am trying to figure out how to properly make a discrete state Markov chain model with pymc. As an example (view in nbviewer), let's make a chain of length T=10 where the Markov state is binary, the initial state distribution is [0.2, 0.8] and that…
shpigi
  • 185
  • 2
  • 8
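Leaving the pymc modelling question itself aside, a plain-numpy sketch of the generative process described above (a chain of length T=10 with binary states and initial distribution [0.2, 0.8]; the transition matrix below is an assumed placeholder) can help make the model concrete:

    import numpy as np

    rng = np.random.default_rng(1)

    T = 10                      # chain length, as in the question
    p0 = np.array([0.2, 0.8])   # initial state distribution
    P = np.array([[0.9, 0.1],   # placeholder transition probabilities
                  [0.3, 0.7]])

    states = [rng.choice(2, p=p0)]
    for _ in range(T - 1):
        states.append(rng.choice(2, p=P[states[-1]]))
    print(states)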
14
votes
2 answers

Best way to calculate the fundamental matrix of an absorbing Markov Chain?

I have a very large absorbing Markov chain (it scales with the problem size, from 10 states to millions) that is very sparse (most states can transition to only 4 or 5 other states). I need to calculate one row of the fundamental matrix of this chain (the…
Ryan Marcus
  • 966
  • 8
  • 21
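For an absorbing chain with transient-to-transient block Q, the fundamental matrix is N = (I - Q)^-1, and a single row of N can be obtained without forming the dense inverse by solving one sparse linear system. A sketch under those assumptions, using scipy.sparse and a small made-up Q:

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spsolve

    def fundamental_matrix_row(Q, i):
        # Row i of N = (I - Q)^-1 is the solution x of (I - Q)^T x = e_i.
        n = Q.shape[0]
        A = (sp.identity(n, format="csc") - Q.tocsc()).tocsc()
        e_i = np.zeros(n)
        e_i[i] = 1.0
        return spsolve(A.T.tocsc(), e_i)

    # Tiny example: three transient states with sparse transitions among them.
    Q = sp.csr_matrix(np.array([[0.0, 0.5, 0.0],
                                [0.2, 0.0, 0.3],
                                [0.0, 0.4, 0.0]]))
    print(fundamental_matrix_row(Q, 0))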