optim.basis {fda.usc}        R Documentation

Select the number of basis functions using the GCV method.

Description

Functional data estimation via basis representation, selecting the number of basis functions by cross-validation (CV) or generalized cross-validation (GCV) with a roughness penalty.

Usage

optim.basis(
  fdataobj,
  type.CV = GCV.S,
  W = NULL,
  lambda = 0,
  numbasis = floor(seq(ncol(fdataobj)/16, ncol(fdataobj)/2, len = 10)),
  type.basis = "bspline",
  par.CV = list(trim = 0, draw = FALSE),
  verbose = FALSE,
  ...
)

Arguments

fdataobj

fdata class object.

type.CV

Type of cross-validation criterion. By default, the generalized cross-validation (GCV) method GCV.S is used.

W

Matrix of weights for the discretization points.

lambda

A roughness penalty. By default, no penalty is applied (lambda = 0).

numbasis

Vector with the numbers of basis functions to consider.

type.basis

Character string that determines the type of basis. By default, "bspline".

par.CV

List of parameters passed to type.CV: trim, the trimming proportion (alpha), and draw, a logical indicating whether to plot the results (see the sketch after these arguments).

verbose

If TRUE, information about the GCV values and the input parameters is printed. The default is FALSE.

...

Further arguments passed to or from other methods; by default, they are passed to create.basis.
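
As an illustration of the type.CV and par.CV arguments, here is a short sketch (not part of the package's own examples; it assumes the tecator data shipped with fda.usc, and the 10% trimming proportion is chosen only for illustration):

library(fda.usc)
data(tecator)
X <- tecator$absorp.fdata
# trim = 0.1 is the trimming proportion described under par.CV;
# draw = FALSE suppresses the diagnostic plot
res <- optim.basis(X, type.CV = GCV.S,
                   par.CV = list(trim = 0.1, draw = FALSE),
                   numbasis = c(7, 11, 15), lambda = 2^(-2:2))
res$gcv   # criterion values over the (numbasis, lambda) grid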

Details

Finds the combination with the smallest GCV value for the functional data over the supplied grid of numbers of basis functions (numbasis) and penalty values (lambda). The cross-validation criterion can be changed through type.CV; the default is GCV.S.

The smoothing matrix is computed by S.basis. W is the matrix of weights of the discretization points.
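
A minimal sketch of this grid search (not from the package documentation; the tecator data is used as an assumed input, and only the gcv component, which also appears in the examples below, is relied upon):

library(fda.usc)
data(tecator)
X <- tecator$absorp.fdata
nb <- seq(7, 31, by = 4)   # candidate numbers of basis functions
l <- 2^(-5:5)              # candidate roughness penalties
res <- optim.basis(X, numbasis = nb, lambda = l)
# grid position (row/column) of the smallest criterion value
which(res$gcv == min(res$gcv, na.rm = TRUE), arr.ind = TRUE)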

Value

A list whose components include (as used in the examples below): gcv, the matrix of criterion values over the grid of numbasis and lambda values; fdata.est, the smoothed functional data obtained with the selected parameters; and S.opt, the corresponding smoothing matrix.

Note

The function min.basis is deprecated; optim.basis replaces it.

Author(s)

Manuel Febrero-Bande, Manuel Oviedo de la Fuente manuel.oviedo@udc.es

References

Ramsay, J. O. and Silverman, B. W. (2006). Functional Data Analysis, 2nd ed. Springer, New York.

Wasserman, L. (2006). All of Nonparametric Statistics. Springer Texts in Statistics. Springer, New York.

Hardle, W. (1994). Applied Nonparametric Regression. Cambridge University Press.

Febrero-Bande, M., Oviedo de la Fuente, M. (2012). Statistical Computing in Functional Data Analysis: The R Package fda.usc. Journal of Statistical Software, 51(4), 1-28. https://www.jstatsoft.org/v51/i04/

See Also

See also S.basis.
Alternative method: optim.np.
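
For comparison with the kernel-based alternative, a minimal sketch using the default arguments of optim.np (again assuming the tecator data; names() is used instead of assuming particular component names):

library(fda.usc)
data(tecator)
out.np <- optim.np(tecator$absorp.fdata)   # bandwidth selected by GCV
names(out.np)                              # inspect the returned components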

Examples

## Not run:  
# simulate nc noisy sine curves observed at np points of [0, 1]
a1 <- seq(0, 1, by = 0.01)
np <- length(a1)
nc <- 50
mdata <- matrix(NA, ncol = np, nrow = nc)
for (i in 1:nc) mdata[i, ] <- sin(2 * pi * a1) + rnorm(np, sd = 0.2)
mdata <- fdata(mdata)
# grids of candidate numbers of basis functions and roughness penalties
nb <- floor(seq(5, 29, len = 5))
l <- 2^(-5:15)
out <- optim.basis(mdata, lambda = l, numbasis = nb, type.basis = "fourier")
matplot(t(out$gcv), type = "l", main = "GCV with fourier basis")

# out1<-optim.basis(mdata,type.CV = CV.S,lambda=l,numbasis=nb)
# out2<-optim.basis(mdata,lambda=l,numbasis=nb)

# variance calculations
y <- mdata
i <- 3
z <- qnorm(0.025 / np)   # Bonferroni-adjusted normal quantile (negative)
fdata.est <- out$fdata.est
var.e <- Var.e(mdata, out$S.opt)            # variance of the error
var.y <- Var.y(mdata, out$S.opt)            # variance of the smoothed response
var.y2 <- Var.y(mdata, out$S.opt, var.e)    # same, using the error variance

# estimated fdata and pointwise confidence intervals
upper.var.e <- out$fdata.est[["data"]][i, ] - z * sqrt(diag(var.e))
lower.var.e <- out$fdata.est[["data"]][i, ] + z * sqrt(diag(var.e))
dev.new()
plot(y[i, ], lwd = 1, ylim = c(min(lower.var.e), max(upper.var.e)))
lines(out$fdata.est[["data"]][i, ], col = gray(0.1), lwd = 1)
lines(out$fdata.est[["data"]][i, ] + z * sqrt(diag(var.y)), col = gray(0.7), lwd = 2)
lines(out$fdata.est[["data"]][i, ] - z * sqrt(diag(var.y)), col = gray(0.7), lwd = 2)
lines(upper.var.e, col = gray(0.3), lwd = 2, lty = 2)
lines(lower.var.e, col = gray(0.3), lwd = 2, lty = 2)
legend("top", legend = c("Var.y", "Var.error"),
       col = c(gray(0.7), gray(0.3)), lty = c(1, 2))

## End(Not run)

