calculate_lhsOpt {sgsR} | R Documentation
Analyze optimal Latin hypercube sample number
Description
Population-level analysis of metric raster data to determine the optimal Latin hypercube sample size.
Usage
calculate_lhsOpt(
mats,
PCA = TRUE,
quant = TRUE,
KLdiv = TRUE,
minSamp = 10,
maxSamp = 100,
step = 10,
rep = 10,
iter = 10000
)
Arguments
mats
List. Output from calculate_pop().

PCA
Logical. Calculates principal component loadings of the population for PCA similarity factor testing.

quant
Logical. Perform quantile comparison testing.

KLdiv
Logical. Perform Kullback–Leibler divergence testing.

minSamp
Numeric. Minimum sample size to test.

maxSamp
Numeric. Maximum sample size to test.

step
Numeric. Sample step size for each iteration.

rep
Numeric. Internal repetitions for each sample size.

iter
Positive numeric. Number of iterations for the Metropolis-Hastings annealing process. Defaults to 10000.
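Details

The arguments above imply a sweep over candidate sample sizes (seq(minSamp, maxSamp, by = step), each repeated rep times), with each candidate sample compared against the population. The sketch below is illustrative only and does not reproduce sgsR internals: it shows one plausible form of the Kullback–Leibler divergence test, binning a sample and the population on the population's quantile breaks. The kl_div function and the rnorm stand-in for a raster metric are assumptions for demonstration.

```r
#--- Illustrative sketch (NOT sgsR internals): KL divergence between a
#--- sample and the population for a single metric ---#
set.seed(42)
pop <- rnorm(10000) # stand-in for one raster metric (population values)

kl_div <- function(smp, population, n_bins = 10) {
  # bin both distributions on the population's quantile breaks
  breaks <- quantile(population, probs = seq(0, 1, length.out = n_bins + 1))
  breaks[1] <- -Inf
  breaks[n_bins + 1] <- Inf
  p <- as.numeric(table(cut(population, breaks))) / length(population)
  q <- as.numeric(table(cut(smp, breaks))) / length(smp)
  q <- pmax(q, 1e-10) # guard against log(0) in empty sample bins
  sum(p * log(p / q)) # KL(P || Q); 0 when sample matches population
}

#--- candidate sample sizes under the default minSamp/maxSamp/step ---#
sizes <- seq(10, 100, by = 10)
divs <- sapply(sizes, function(n) kl_div(sample(pop, n), pop))
```

Larger samples generally yield divergences closer to zero; plotting divs against sizes is one way to judge where adding samples stops paying off, which is the intuition behind the sample-size testing this function performs.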
Value
data.frame with summary statistics.
Note
Special thanks to Dr. Brendan Malone for the original implementation of this algorithm.
Author(s)
Tristan R.H. Goodbody
References
Malone BP, Minasny B, Brungard C. 2019. Some methods to improve the utility of conditioned Latin hypercube sampling. PeerJ 7:e6451 DOI 10.7717/peerj.6451
Examples
## Not run:
#--- Load raster and access files ---#
r <- system.file("extdata", "mraster.tif", package = "sgsR")
mr <- terra::rast(r)
#--- calculate lhsPop details ---#
mats <- calculate_pop(mraster = mr)
calculate_lhsOpt(mats = mats)
calculate_lhsOpt(
mats = mats,
PCA = FALSE,
iter = 200
)
## End(Not run)