Markov chains

From Wikiversity

A Markov chain is a model of a random process in which the probability of the next state depends only on the current state. Applied to text, the states can be n-grams taken from the text, together with the probability that each n-gram follows another. For example, take the text

  This page was edited by William. Who is he?

Take the letter e. It appears four times: once followed by a space, twice followed by a "d", and once followed by a question mark.
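The counting described above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original article: it builds a character-level transition table (n-grams of length 1) by recording, for each character, how often each other character follows it.

```python
from collections import Counter, defaultdict

# Example text from the article.
text = "This page was edited by William. Who is he?"

# For each character, count which characters immediately follow it.
# These counts are the (unnormalized) transition probabilities of a
# character-level Markov chain.
transitions = defaultdict(Counter)
for current, following in zip(text, text[1:]):
    transitions[current][following] += 1

# The letter "e" is followed once by a space, twice by "d",
# and once by a question mark.
print(dict(transitions["e"]))  # → {' ': 1, 'd': 2, '?': 1}
```

Dividing each count by the total (here, 4 occurrences of "e") turns the table into probabilities: after an "e", the chain moves to "d" with probability 1/2, and to a space or "?" with probability 1/4 each.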