📚 noun • entry_id 11983
Markov chain
Meanings (ES + gloss)
cadena de Márkov
A discrete-time stochastic process that satisfies the Markov property: the probability of the next state depends only on the current state, not on the earlier history.
The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.
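A minimal sketch of the definition above: a two-state chain with a hypothetical transition matrix (the states `"A"`/`"B"` and probabilities are illustrative, not from the entry), where each step samples the next state from the current state alone.

```python
import random

# Hypothetical transition probabilities: next state depends only on the
# current state (the Markov property), never on earlier history.
P = {
    "A": [("A", 0.9), ("B", 0.1)],
    "B": [("A", 0.5), ("B", 0.5)],
}

def step(state, rng):
    """Sample the next state given only the current state."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state]:
        cum += p
        if r < cum:
            return nxt
    return P[state][-1][0]  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Run the chain for n steps and return the visited states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("A", 10))
```

A fixed seed makes the run reproducible; the realized path is one sample from the process, while the matrix `P` defines the process itself.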
Phrases
No phrases
Word forms