carfima {carfima} | R Documentation
Fitting a CARFIMA(p, H, q) model via frequentist or Bayesian machinery
Description
A general-order CARFIMA(p, H, q) model for p > q is

$$Y_t^{(p)} - \alpha_p Y_t^{(p-1)} - \cdots - \alpha_1 Y_t = \sigma\bigl(B_{t,H}^{(1)} + \beta_1 B_{t,H}^{(2)} + \cdots + \beta_q B_{t,H}^{(q+1)}\bigr),$$

where B_{t,H} is the standard fractional Brownian motion, H is the Hurst parameter, and the superscript (j) indicates j-fold differentiation with respect to t; see Equation (1) of Tsai and Chan (2005) for details. The model has p + q + 2 unknown model parameters: the p alpha_j's, the q beta_j's, H, and sigma. Also, the model can account for heteroscedastic measurement errors if the information about the measurement error standard deviations is known.
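As a concrete special case that follows directly from the display above (no new notation is introduced), the CARFIMA(1, H, 0) model simulated in the Examples section reduces to

$$Y_t^{(1)} - \alpha_1 Y_t = \sigma B_{t,H}^{(1)}, \qquad \text{i.e.,} \qquad dY_t = \alpha_1 Y_t\,dt + \sigma\,dB_{t,H},$$

a fractional Ornstein-Uhlenbeck-type equation; a negative alpha_1 (e.g., alpha_1 = -0.4 in the Examples) keeps the process stationary, and 0.5 < H < 1 gives long memory.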
The function carfima fits the model, producing either the maximum likelihood estimates (MLEs) of the model parameters with their asymptotic uncertainties or their posterior samples, according to its argument method. The MLEs except sigma are obtained from a profile likelihood by a global optimizer called the differential evolution algorithm on restricted ranges, i.e., (-0.99, -0.01) for each alpha_j, (0, 100) for each beta_j, and (0.51, 0.99) for H; the MLE of sigma is then deterministically computed. The corresponding asymptotic uncertainties are based on a numerical Hessian matrix calculation at the MLEs (see function hessian in pracma). It also computes the Akaike Information Criterion (AIC), that is, -2(log likelihood - p - q - 2). The function carfima becomes numerically unstable if p > 2, and thus it may produce numerical errors.
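The following is a minimal sketch, not the internals of carfima, of how asymptotic standard errors can be read off a numerical Hessian of the negative log likelihood (via hessian in pracma) and how the AIC above is formed; the toy Gaussian likelihood and all object names (y.toy, neg.loglik, fit) are placeholders.

## Minimal sketch (toy Gaussian likelihood, NOT the CARFIMA likelihood):
## asymptotic standard errors from a numerical Hessian, and the AIC form above.
set.seed(1)
y.toy <- rnorm(50, mean = 1, sd = 2)
# Negative log likelihood of the toy model; par = c(mean, log(sd)).
neg.loglik <- function(par) {
  -sum(dnorm(y.toy, mean = par[1], sd = exp(par[2]), log = TRUE))
}
# Any optimizer works for the toy; carfima itself uses differential evolution.
fit <- optim(c(0, 0), neg.loglik, method = "BFGS")
# Asymptotic covariance matrix = inverse Hessian of the negative log likelihood.
H.mat <- pracma::hessian(neg.loglik, fit$par)
se <- sqrt(diag(solve(H.mat)))
# AIC in the form used above: -2 * (log likelihood - number of parameters).
AIC.value <- -2 * (-fit$value - length(fit$par))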
The Bayesian approach uses independent prior distributions for the unknown model parameters: a Uniform(-0.9999, -0.0001) prior for each alpha_j, a Uniform(0, 100) prior for each beta_j, a Uniform(0.5001, 0.9999) prior for H to enforce a long-memory process, and finally an inverse-Gamma(shape = 2.01, scale = 10^3) prior for sigma^2. Posterior propriety holds because the prior distributions are jointly proper. It also adopts appropriate proposal density functions: a truncated Normal(current state, proposal scale) between -0.9999 and -0.0001 for each alpha_j, a truncated Normal(current state, proposal scale) between 0 and 100 for each beta_j, a truncated Normal(current state, proposal scale) between 0.5001 and 0.9999 for H, and finally a Normal(log(current state), proposal scale on a log scale) for sigma^2, i.e., sigma^2 is updated on a log scale. The full posterior is sampled via a Metropolis-Hastings within Gibbs sampler. It also adopts adaptive Markov chain Monte Carlo (MCMC) that updates the proposal scales every 100 iterations; if the acceptance rate of the most recent 100 proposals (at the end of the i-th batch of 100 iterations) is smaller than 0.3, the current proposal scale is multiplied by exp(-min(0.01, 1/sqrt(i))); if it is larger than 0.3, the current proposal scale is multiplied by exp(min(0.01, 1/sqrt(i))). The resulting Markov chain with this adaptive scheme converges to the stationary distribution because the adjustment factor, exp(±min(0.01, 1/sqrt(i))), approaches unity as i goes to infinity, satisfying the diminishing adaptation condition. The function carfima becomes numerically unstable if p > 2, and thus it may produce numerical errors. The output returns the AIC for which the log likelihood is evaluated at the posterior medians of the unknown model parameters.
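As a rough illustration of the adaptive scheme and the truncated Normal proposals described above (this is not the package's internal code; update.scale and rtruncnorm1 are hypothetical helpers written for this sketch):

## Sketch of the proposal-scale update applied after each batch of 100 iterations;
## batch size (100), target rate (0.3), and factor exp(+/- min(0.01, 1/sqrt(i)))
## follow the text above.
update.scale <- function(scale, accept.rate, i) {
  adj <- min(0.01, 1 / sqrt(i))   # diminishing adaptation
  if (accept.rate < 0.3) scale * exp(-adj) else scale * exp(adj)
}
## Sketch of a truncated Normal proposal for H on (0.5001, 0.9999) via inverse-CDF sampling.
rtruncnorm1 <- function(mean, sd, lower, upper) {
  u <- runif(1, pnorm(lower, mean, sd), pnorm(upper, mean, sd))
  qnorm(u, mean, sd)
}
H.prop <- rtruncnorm1(mean = 0.8, sd = 0.2, lower = 0.5001, upper = 0.9999)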
Usage
carfima(Y, time, measure.error, ar.p, ma.q, method = "mle",
bayes.param.ini, bayes.param.scale, bayes.n.warm, bayes.n.sample)
Arguments
Y: A vector of length k for the observed data.
time: A vector of length k for the observation times.
measure.error: (Optional) A vector of length k for the known measurement error standard deviations. If nothing is specified, a vector of zeros is automatically assigned, i.e., no measurement error.
ar.p: A positive integer for the order of the AR part; ar.p must be greater than ma.q. If ar.p is greater than 2, numerical errors may occur.
ma.q: A non-negative integer for the order of the MA part; ma.q must be smaller than ar.p.
method: Either "mle" or "bayes". Method "mle" conducts the MLE-based inference, producing MLEs and asymptotic uncertainties of the model parameters. Method "bayes" draws posterior samples of the model parameters.
bayes.param.ini: Only if method is "bayes". A vector of length p + q + 2 for the initial values of the p alpha_j's, q beta_j's, H, and sigma used to initialize the Markov chain (see the sketch after this argument list).
bayes.param.scale: Only if method is "bayes". A vector of length p + q + 2 for the proposal scales of the Metropolis-Hastings steps. The last element is the proposal scale used to update sigma^2 (not sigma) on a log scale; the adaptive MCMC adjusts all of these scales during the run.
bayes.n.warm: Only if method is "bayes". A scalar for the number of burn-in iterations, i.e., the number of initial iterations to be discarded.
bayes.n.sample: Only if method is "bayes". A scalar for the number of posterior samples to be drawn for each model parameter.
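For example, the Bayesian arguments for a hypothetical CARFIMA(2, H, 1) fit (p = 2, q = 1, hence p + q + 2 = 5 parameters ordered as the alpha's, the beta's, H, and sigma, following the Description) could be assembled as below; the numbers are illustrative only, and the carfima call is left commented out because y and time are not defined here.

## Illustrative only: argument vectors for a CARFIMA(2, H, 1) fit.
p <- 2; q <- 1
ini <- c(-0.5, -0.3,   # initial values for the p = 2 AR parameters (alpha's)
         1.0,          # initial value for the q = 1 MA parameter (beta)
         0.75,         # initial value for the Hurst parameter H
         0.2)          # initial value for sigma
scales <- rep(0.2, p + q + 2)   # last element updates sigma^2 on a log scale
# res <- carfima(Y = y, time = time, method = "bayes", ar.p = p, ma.q = q,
#                bayes.param.ini = ini, bayes.param.scale = scales,
#                bayes.n.warm = 100, bayes.n.sample = 1000)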
Details
The function carfima
produces MLEs, their asymptotic uncertainties, and AIC if method
is "mle". It produces the posterior samples of the model parameters, acceptance rates, and AIC if method
is "bayes".
Value
The outcome of carfima is composed of:
mle: If method is "mle". A vector of the maximum likelihood estimates of the model parameters, i.e., the p alpha_j's, q beta_j's, H, and sigma.
se: If method is "mle". A vector of the asymptotic uncertainties (standard errors) of the MLEs.
param: If method is "bayes". A matrix with bayes.n.sample rows and p + q + 2 columns containing the posterior samples of the model parameters (see the sketch after this list).
accept: If method is "bayes". A vector of length p + q + 2 for the acceptance rates of the Metropolis-Hastings steps for each model parameter.
AIC: For both methods. Akaike Information Criterion, -2(log likelihood - p - q - 2), where the log likelihood is evaluated at the MLEs if method is "mle" and at the posterior medians of the model parameters if method is "bayes".
fitted.values: For both methods. A vector of length k for the fitted values of the observed data, based on the MLEs if method is "mle" and on the posterior medians of the model parameters if method is "bayes".
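A brief sketch of summarizing a "bayes" fit, assuming res is the list returned by carfima with method = "bayes" (as in the Examples below); the names post, post.median, and post.ci are placeholders.

## Sketch: summarizing the "bayes" output stored in res.
post <- res$param                      # bayes.n.sample x (p + q + 2) matrix
post.median <- apply(post, 2, median)  # posterior medians of the parameters
post.ci <- apply(post, 2, quantile, probs = c(0.025, 0.975))  # 95% intervals
res$accept                             # Metropolis-Hastings acceptance rates
res$AIC                                # AIC evaluated at the posterior medians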
Author(s)
Hyungsuk Tak, Henghsiu Tsai, Kisung You
References
H. Tsai and K.S. Chan (2005) "Maximum Likelihood Estimation of Linear Continuous Time Long Memory Processes with Discrete Time Data," Journal of the Royal Statistical Society (Series B), 67 (5), 703-716. DOI: 10.1111/j.1467-9868.2005.00522.x
H. Tsai and K.S. Chan (2000) "A Note on the Covariance Structure of a Continuous-time ARMA Process," Statistica Sinica, 10, 989-998.
Link: http://www3.stat.sinica.edu.tw/statistica/j10n3/j10n317/j10n317.htm
Examples
##### Irregularly spaced observation time generation.
length.time <- 30
time.temp <- rexp(length.time, rate = 2)
time <- rep(NA, length.time + 1)
time[1] <- 0
for (i in 2 : (length.time + 1)) {
time[i] <- time[i - 1] + time.temp[i - 1]
}
time <- time[-1]
##### Data generation for CARFIMA(1, H, 0) based on the observation times.
parameter <- c(-0.4, 0.8, 0.2)
# AR parameter alpha = -0.4
# Hurst parameter = 0.8
# Process uncertainty (standard deviation) sigma = 0.2
me.sd <- rep(0.05, length.time)
# Known measurement error standard deviations 0.05 for all observations
# If not known, remove the argument "measure.error = me.sd" in the following code,
# so that the default values (zero) are automatically assigned.
y <- carfima.sim(parameter = parameter, time = time,
measure.error = me.sd, ar.p = 1, ma.q = 0)
##### Fitting the CARFIMA(1, H, 0) model on the simulated data for MLEs.
res <- carfima(Y = y, time = time, measure.error = me.sd,
method = "mle", ar.p = 1, ma.q = 0)
# It takes a long time due to the differential evolution algorithm (global optimizer).
# res$mle; res$se; res$AIC; res$fitted.values
##### Fitting the CARFIMA(1, H, 0) model on the simulated data for Bayesian inference.
res <- carfima(Y = y, time = time, measure.error = me.sd,
method = "bayes", ar.p = 1, ma.q = 0,
bayes.param.ini = parameter,
bayes.param.scale = c(rep(0.2, length(parameter))),
bayes.n.warm = 100, bayes.n.sample = 1000)
# It takes a long time because the likelihood evaluation is computationally heavy.
# The last number of bayes.param.scale is to update sigma2 (not sigma) on a log scale.
# hist(res$param[, 1]); res$accept; res$AIC; res$fitted.values
##### Computing the log likelihood of the CARFIMA(1, H, 0) model given the parameters.
loglik <- carfima.loglik(Y = y, time = time, ar.p = 1, ma.q = 0,
measure.error = me.sd,
parameter = parameter, fitted = FALSE)
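##### (Sketch) Comparing candidate orders by AIC with the same interface.
# The CARFIMA(2, H, 1) alternative below is illustrative only and, like the
# fits above, slow to run; smaller AIC is preferred.
# res10 <- carfima(Y = y, time = time, measure.error = me.sd,
#                  method = "mle", ar.p = 1, ma.q = 0)
# res21 <- carfima(Y = y, time = time, measure.error = me.sd,
#                  method = "mle", ar.p = 2, ma.q = 1)
# c(CARFIMA10 = res10$AIC, CARFIMA21 = res21$AIC)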