sensbayes {ICAOD}		R Documentation
Verifying Optimality of Bayesian D-optimal Designs
Description
Plots the sensitivity (derivative) function and calculates the efficiency lower bound (ELB) for a given design with respect to the Bayesian D-optimality criterion.
Let \boldsymbol{x} belong to \chi, where \chi denotes the design space.
Based on the general equivalence theorem, a design \xi^* is optimal if and only if the value of the sensitivity (derivative) function is non-positive for all \boldsymbol{x} in \chi and is zero when \boldsymbol{x} belongs to the support of \xi^* (i.e., when \boldsymbol{x} is equal to one of the design points).
For an approximate (continuous) design, when the design space is one or two-dimensional, the user can visually verify the optimality of the design by observing the sensitivity plot. Furthermore, the proximity of the design to the optimal design can be measured by the ELB without knowing the latter.
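As a guide to interpretation (a standard efficiency bound, stated here as a sketch rather than a restatement of the package internals): if p denotes the number of model parameters and c(\boldsymbol{x}, \xi) the sensitivity (derivative) function defined in 'Details', the ELB of a design \xi can be obtained as

ELB = p/(p + \max_{\boldsymbol{x} \in \chi} c(\boldsymbol{x}, \xi)),

so an ELB close to one indicates that \xi is close to the optimal design.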
Usage
sensbayes(
formula,
predvars,
parvars,
family = gaussian(),
x,
w,
lx,
ux,
fimfunc = NULL,
prior = list(),
sens.control = list(),
sens.bayes.control = list(),
crt.bayes.control = list(),
plot_3d = c("lattice", "rgl"),
plot_sens = TRUE,
npar = NULL,
calculate_criterion = TRUE,
silent = FALSE,
crtfunc = NULL,
sensfunc = NULL
)
Arguments
formula: A linear or nonlinear model formula.
predvars: A vector of characters. Denotes the predictors in the formula.
parvars: A vector of characters. Denotes the unknown parameters in the formula.
family: A description of the response distribution and the link function to be used in the model. This can be a family function, a call to a family function, or a character string naming the family. Every family function has a link argument that allows the link function applied to the response variable to be specified. If not specified, default links are used. For details see family.
x: A vector of candidate design (support) points.
w: Vector of the corresponding design weights for x.
lx: Vector of lower bounds for the predictors. Should be in the same order as predvars.
ux: Vector of upper bounds for the predictors. Should be in the same order as predvars.
fimfunc: A function. Returns the FIM as a matrix.
prior: An object of class cprior, created by one of the prior functions such as uniform or skewnormal (see 'Examples').
sens.control: Control parameters for calculating the ELB. For details, see the function sens.control.
sens.bayes.control: A list. Control parameters used to verify the general equivalence theorem. For details, see the function sens.bayes.control.
crt.bayes.control: A list. Control parameters used to approximate the integral in the Bayesian criterion at a given design over the parameter space. For details, see the function crt.bayes.control.
plot_3d: Which package should be used to plot the sensitivity (derivative) function over a two-dimensional design space. Defaults to "lattice".
plot_sens: Plot the sensitivity (derivative) function? Defaults to TRUE.
npar: Number of model parameters. Used when fimfunc is given instead of formula.
calculate_criterion: Calculate the optimality criterion? Defaults to TRUE.
silent: Do not print anything? Defaults to FALSE.
crtfunc: (Optional) A function that specifies an arbitrary criterion. It must have special arguments and output.
sensfunc: (Optional) A function that specifies the sensitivity function for crtfunc.
Details
Let \Xi be the space of all approximate designs with k design points (support points) x_1, x_2, ..., x_k from the design space \chi and corresponding weights w_1, ..., w_k. Let M(\xi, \theta) be the Fisher information matrix (FIM) of a k-point design \xi and \pi(\theta) be a user-given prior distribution for the vector of unknown parameters \theta. A design \xi^* is Bayesian D-optimal among all designs on \chi if and only if the following inequality holds for all \boldsymbol{x} \in \chi:

c(\boldsymbol{x}, \xi^*) = \int_{\theta \in \Theta} \left\{tr M^{-1}(\xi^*, \theta) I(\boldsymbol{x}, \theta) - p\right\} \pi(\theta) d\theta \leq 0,

with equality at all support points of \xi^*. Here, p is the number of model parameters and I(\boldsymbol{x}, \theta) is the FIM of a single observation taken at \boldsymbol{x}. The function c(\boldsymbol{x}, \xi^*) is called the sensitivity or derivative function.
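As an illustration only (a minimal sketch independent of the package internals; the function name sens_2pl_mc is made up and a hypothetical uniform prior on (a, b) is assumed), the sensitivity function of the two-parameter logistic model used in 'Examples' can be approximated by Monte Carlo:

# Hedged sketch, not ICAOD's implementation: Monte Carlo approximation of
# c(x, xi) for the 2PL model 1/(1 + exp(-b*(x - a))) under Bayesian D-optimality,
# assuming a uniform prior on (a, b) over [-3, 3] x [0.1, 2].
sens_2pl_mc <- function(x, xs, ws, nsim = 1000) {
  # single-observation FIM: I(x, theta) = p(1 - p) * v v', with v = (-b, x - a)
  fim1 <- function(x, a, b) {
    p <- 1 / (1 + exp(-b * (x - a)))
    v <- c(-b, x - a)
    p * (1 - p) * tcrossprod(v)
  }
  vals <- replicate(nsim, {
    a <- runif(1, -3, 3); b <- runif(1, 0.1, 2)     # draw from the assumed prior
    M <- Reduce(`+`, Map(function(xi, wi) wi * fim1(xi, a, b), xs, ws))
    sum(diag(solve(M) %*% fim1(x, a, b))) - 2       # tr(M^-1 I) - p, with p = 2
  })
  mean(vals)
}
# For a design that is Bayesian D-optimal under this prior, the value is <= 0
# everywhere on [-3, 3] and equals 0 at the support points, e.g.:
# sens_2pl_mc(x = 1.29227, xs = c(-2.50914, -1.16780, -0.36904, 1.29227),
#             ws = c(0.35767, 0.11032, 0.15621, 0.37580))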
Depending on the complexity of the problem at hand, the CPU time can sometimes be reduced considerably by choosing a set of less conservative values for the tuning parameters tol and maxEval in sens.bayes.control when sens.bayes.control$method = "cubature". The same applies to the number of quadrature nodes (level) when sens.bayes.control$method = "quadrature"; see the settings sketched below. In general, if the CPU time matters, the user should find an appropriate speed-accuracy trade-off for their own problem. See 'Examples' for more details.
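For instance, reusing the settings that appear in 'Examples' below (illustrative values only):

sens.bayes.control = list(cubature = list(tol = 1e-4, maxEval = 300))
# or, for the quadrature method, fewer nodes:
sens.bayes.control = list(method = "quadrature", quadrature = list(level = 4))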
Note
The default values of the tuning parameters in sens.bayes.control are chosen so that the primary goals are an accurate plot of the sensitivity (derivative) function and a high-precision ELB, even though the computation may then take a long time. When the CPU time matters or becomes too long, the user should pass a set of less conservative values via the argument sens.bayes.control.
Examples
##################################################################
# Checking the Bayesian D-optimality of a design for the 2Pl model
##################################################################
skew2 <- skewnormal(xi = c(0, 1), Omega = matrix(c(1, -0.17, -0.17, .5), nrow = 2),
alpha = c(-1, 0), lower = c(-3, .1), upper = c(3, 2))
## Not run:
sensbayes(formula = ~1/(1 + exp(-b *(x - a))),
predvars = "x", parvars = c("a", "b"),
family = binomial(),
x = c(-2.50914, -1.16780, -0.36904, 1.29227),
w = c(0.35767, 0.11032, 0.15621, 0.37580),
lx = -3, ux = 3,
prior = skew2)
# took 29 seconds on my system!
## End(Not run)
# That took quite long.
# Re-adjust the tuning parameters in sens.bayes.control to speed it up:
# note how we drastically reduce maxEval and increase the tolerance tol
## Not run:
sensbayes(formula = ~1/(1 + exp(-b *(x - a))),
predvars = "x", parvars = c("a", "b"),
family = binomial(),
x = c(-2.50914, -1.16780, -0.36904, 1.29227),
w = c(0.35767, 0.11032, 0.15621, 0.37580),
lx = -3, ux = 3, prior = skew2,
sens.bayes.control = list(cubature = list(tol = 1e-4, maxEval = 300)))
# took 5 seconds on my system!
## End(Not run)
# Compare it with the following:
sensbayes(formula = ~1/(1 + exp(-b *(x - a))),
predvars = "x", parvars = c("a", "b"),
family = binomial(),
x = c(-2.50914, -1.16780, -0.36904, 1.29227),
w = c(0.35767, 0.11032, 0.15621, 0.37580),
lx = -3, ux = 3, prior = skew2,
sens.bayes.control = list(cubature = list(tol = 1e-4, maxEval = 200)))
# Look at the plot!
# took 3 seconds on my system
########################################################################################
# Checking the Bayesian D-optimality of a design for the 4-parameter sigmoid emax model
########################################################################################
lb <- c(4, 11, 100, 5)
ub <- c(9, 17, 140, 10)
## Not run:
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
lx = .001, ux = 500, prior = uniform(lb, ub))
# took 200 seconds on my system
## End(Not run)
# Re-adjust the tuning parameters to make it faster
## Not run:
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
lx = .001, ux = 500, prior = uniform(lb, ub),
sens.bayes.control = list(cubature = list(tol = 1e-3, maxEval = 300)))
# took 4 seconds on my system. See how much of a difference it makes
## End(Not run)
## Not run:
# Now we try it with quadrature. Default is 6 nodes
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
sens.bayes.control = list(method = "quadrature"),
lx = .001, ux = 500, prior = uniform(lb, ub))
# 166.519 s
# use fewer nodes to see whether the CPU time can be reduced
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
sens.bayes.control = list(method = "quadrature",
quadrature = list(level = 3)),
lx = .001, ux = 500, prior = uniform(lb, ub))
# with only 3 nodes the plot is no longer accurate
# use 4 nodes instead (still fewer than the default 6)
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
sens.bayes.control = list(method = "quadrature",
quadrature = list(level = 4)),
lx = .001, ux = 500, prior = uniform(lb, ub))
## End(Not run)