DTMC {DTMCPack}          R Documentation

Simulation of Discrete-Time/State Markov Chain

Description

This function simulates iterations through a discrete-time Markov chain. A Markov chain is a discrete-time Markov process with a state space that usually consists of positive integers. The advantage of a Markov process in a stochastic modeling context is that conditional dependencies over time remain manageable, because the probabilistic future of the process depends only on the present state, not on the past. Therefore, once an initial distribution and a transition matrix are specified, the process can be simulated many periods into the future with no further information. Future transition probabilities can be computed by raising the transition matrix to higher and higher powers, but this approach is not numerically tractable for large matrices. This function instead uses a uniform random variable to step through a user-specified number of iterations of the chain, based on the transition probabilities and the initial distribution. A graphical output is also available in the form of a trace plot.
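
The core of this approach can be sketched in a few lines of R. The sketch below only illustrates the idea under the input conventions described on this page (a row-stochastic tmat, an initial distribution io, and N steps); simulate_dtmc_sketch is a hypothetical helper, not the package's implementation.

simulate_dtmc_sketch <- function(tmat, io, N) {
  # Draw the starting state from the initial distribution io
  state <- sample(seq_len(nrow(tmat)), size = 1, prob = as.vector(io))
  path <- integer(N)
  for (i in seq_len(N)) {
    u <- runif(1)                                        # one uniform draw per step
    state <- findInterval(u, cumsum(tmat[state, ])) + 1  # invert the cumulative row probabilities
    path[i] <- state
  }
  path                                                   # indices of the visited states
}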

Usage

DTMC(tmat, io, N, trace)

Arguments

tmat

Transition matrix; each row must sum to 1 and the number of rows and columns must be equal (a valid example is sketched after this list).

io

Initial distribution over the states: a single column that sums to 1, with the same length as the number of rows of the transition matrix.

N

Number of iterations to simulate.

trace

Logical; if TRUE, a trace plot of the simulated path through the states is drawn.
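
For illustration, inputs satisfying these constraints might be constructed as follows. The 2-state matrix, the initial vector, and the commented-out call are a hypothetical sketch, not data shipped with the package.

# A square transition matrix whose rows each sum to 1
tmat <- matrix(c(0.7, 0.3,
                 0.4, 0.6), nrow = 2, byrow = TRUE)
# An initial distribution with one entry per state, summing to 1
io <- c(1, 0)
stopifnot(isTRUE(all.equal(rowSums(tmat), rep(1, nrow(tmat)))),
          length(io) == nrow(tmat), sum(io) == 1)
# DTMC(tmat, io, 25, trace = FALSE)   # example call with these inputs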

Value

Trace

Trace plot of the iterations through the states (if trace = TRUE)

State

An N x nrow(tmat) matrix detailing the iterations through each state of the Markov chain (one row per iteration, one column per state)
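
For example, assuming the State component marks the occupied state in each row (an assumption based on the dimensions described above, not a guarantee of the package's exact output format), the sequence of visited state indices could be recovered like this:

data(gr); data(id)                       # example data used in the Examples section
out <- DTMC(gr, id, 10, trace = FALSE)
visited <- max.col(out$State)            # column of the largest entry in each row
visited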

Author(s)

Will Nicholson

References

"Adventures in Stochastic Processes" by Sidney Resnick

See Also

MultDTMC

Examples

data(gr)
data(id)
DTMC(gr, id, 10, trace = TRUE)
# 10 iterations through "Gambler's ruin"
