GOF_far {wwntests} | R Documentation |
Goodness-of-fit test for FAR(1)
Description
The "GOF_far" test fits a FAR(1) model and then assesses the cumulative significance of lagged autocovariance operators computed from the model residuals, up to a user-selected maximum lag K. More specifically, it tests the null hypothesis that the first K lag-h autocovariance operators (h ranging from 1 to K) of the model residuals are all equal to 0.
Usage
GOF_far(
f_data,
lag = 5,
M = 10000,
alpha = 0.05,
suppress_raw_output = FALSE,
suppress_print_output = FALSE
)
Arguments
f_data |
The functional data matrix with observed functions in the columns. |
lag |
Positive integer value. A user-selected maximum lag. 5 by default. |
M |
Positive integer value. Number of Monte Carlo simulations used for the Welch-Satterthwaite approximation. 10000 by default. |
alpha |
Numeric value between 0 and 1 specifying the significance level to be used in the hypothesis test. The default value is 0.05. Note that the significance level is only used to compute the 1-alpha quantile of the limiting distribution of the test statistic. |
suppress_raw_output |
Boolean value, FALSE by default. If TRUE, the function will not return the list containing the p-value, quantile, and statistic. |
suppress_print_output |
Boolean value, FALSE by default. If TRUE, the function will not print any output to the console. |
Details
'GOF_far' computes the goodness-of-fit test for a FAR(1) model, based on residual autocovariance operators up to the user-specified maximum lag.
Value
If suppress_raw_output = FALSE, a list containing the test statistic, the 1-alpha quantile of the limiting distribution, and the p-value computed from the specified hypothesis test. If suppress_print_output = FALSE, the function also prints a short description of the test, the p-value, and additional information about the test.
References
[1] Kim, M., Kokoszka, P., &amp; Rice, G. (2023). White noise testing for functional time series. Statistics Surveys, 17, 119-168.
Examples
f <- far_1_S(100, 50, 0.75)
GOF_far(f, lag=5)
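A slightly fuller sketch of typical usage, assuming the wwntests package is installed (the simulation settings below are illustrative, not prescriptive; far_1_S generates functional data from a FAR(1,S) process as in the example above):

```r
# Sketch only: assumes the wwntests package is available.
library(wwntests)

set.seed(123)
# Simulate 100 functional observations, each evaluated on a grid of
# 50 points, from a FAR(1,S) process with dependence parameter 0.75.
f <- far_1_S(100, 50, 0.75)

# Run the goodness-of-fit test up to lag 5, suppressing console output
# and capturing the raw results list instead.
res <- GOF_far(f, lag = 5, M = 10000, alpha = 0.05,
               suppress_print_output = TRUE)

# Inspect the returned list (test statistic, quantile, p-value).
str(res)
```

A small p-value suggests the FAR(1) model does not adequately capture the serial dependence in the data; a large p-value is consistent with an adequate fit.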