partitionMCMC {BiDAG}		R Documentation

DAG structure sampling with partition MCMC


Description

This function implements the partition MCMC algorithm for structure learning of Bayesian networks. The procedure provides an unbiased sample from the posterior distribution of DAGs given the data. The search space can be defined either by a preliminary run of the function iterativeMCMC or by a given adjacency matrix (which can be the full matrix with zeros on the diagonal, to consider the entire space of DAGs; this is feasible only for a limited number of nodes).


Usage

partitionMCMC(
  scorepar,
  moveprobs = NULL,
  iterations = NULL,
  stepsave = NULL,
  gamma = 1,
  verbose = FALSE,
  scoreout = FALSE,
  startspace = NULL,
  blacklist = NULL,
  scoretable = NULL,
  startDAG = NULL
)

## S3 method for class 'partitionMCMC'
plot(
  x,
  ...,
  burnin = 0.2,
  main = "DAG logscores",
  xlab = "iteration",
  ylab = "logscore",
  type = "l",
  col = "#0c2c84"
)

## S3 method for class 'partitionMCMC'
print(x, ...)

## S3 method for class 'partitionMCMC'
summary(object, ...)



Arguments

scorepar

an object of class scoreparameters, containing the data and scoring parameters; see the constructor function scoreparameters.


moveprobs

(optional) a numerical vector of 5 values in [0,1], summing to one, corresponding to the probabilities of the following MCMC moves in the space of partitions:

  • swap any two elements from different partition elements

  • swap any two elements in adjacent partition elements

  • split a partition element or join one

  • move a single node into another partition element or into a new one

  • stay still
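The five probabilities map one-to-one onto the moves listed above. As a sketch (the vector below is illustrative, not a recommended default; myScore is an assumed scoreparameters object):

```r
# Illustrative move probabilities; positions follow the move list above:
# swap across partition elements, swap between adjacent elements,
# split/join, relocate a single node, stay still.
# Assumption: the five values must sum to one.
myProbs <- c(0.40, 0.15, 0.15, 0.25, 0.05)
stopifnot(isTRUE(all.equal(sum(myProbs), 1)))

# myScore <- scoreparameters("bge", Boston)        # assumed score object
# fit <- partitionMCMC(myScore, moveprobs = myProbs)
```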


iterations

integer, the number of MCMC steps; the default value is 20n^2 log(n), where n is the number of nodes


stepsave

integer, thinning interval for the MCMC chain, indicating the number of steps between two output iterations; the default is iterations/1000


gamma

tuning parameter which transforms the score by raising it to this power; 1 by default


verbose

logical, if TRUE messages about the algorithm's progress will be printed; FALSE by default


scoreout

logical, if TRUE the search space and score tables are returned; FALSE by default


startspace

(optional) a square matrix, of dimensions equal to the number of nodes, which defines the search space for the partition MCMC in the form of an adjacency matrix; if NULL, the skeleton obtained from the PC algorithm will be used. If startspace[i,j] equals 1 (0), the edge from node i to node j is included in (excluded from) the search space. To include an edge in both directions, both startspace[i,j] and startspace[j,i] should be 1.
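For a small network, the entire space of DAGs can be searched by passing the full matrix with a zero diagonal, as described above. A minimal sketch (n and the variable names are illustrative):

```r
n <- 5                          # number of nodes (illustrative)
fullSpace <- matrix(1L, n, n)   # allow every directed edge...
diag(fullSpace) <- 0L           # ...except self-loops (zero diagonal)

# myScore is assumed to be a scoreparameters object built on n variables:
# fit <- partitionMCMC(myScore, startspace = fullSpace)
```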


blacklist

(optional) a square matrix, of dimensions equal to the number of nodes, which defines edges to exclude from the search space; if blacklist[i,j] equals 1, the edge from node i to node j is excluded from the search space


scoretable

(optional) object of class scorespace containing a list of score tables calculated, for example, by the last iteration of the function iterativeMCMC. When not NULL, the parameter startspace is ignored.


startDAG

(optional) an adjacency matrix of dimensions equal to the number of nodes, representing a DAG in the search space defined by startspace. If startspace is defined but startDAG is not, an empty DAG is used by default.


x

object of class 'partitionMCMC'




burnin

number between 0 and 1, the fraction of the samples which will be discarded as 'burn-in' of the MCMC chain; the remaining samples are used to calculate the posterior probabilities; 0.2 by default


main

name of the graph; "DAG logscores" by default


xlab

name of the x-axis; "iteration" by default


ylab

name of the y-axis; "logscore" by default


type

type of line in the plot; "l" by default


col

colour of line in the plot; "#0c2c84" by default


object

object of class 'partitionMCMC'


Value

Object of class partitionMCMC, which contains the log-score trace as well as the adjacency matrix of the maximum scoring DAG, its score and the order score. Additionally, all sampled DAGs (represented by their adjacency matrices), their scores, orders and partitions are returned. See the partitionMCMC class.


Note

See also the extractor functions getDAG, getTrace, getSpace and getMCMCscore.
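A sketch of how the extractors might be used on a fitted object (the calls are commented out because they require the BiDAG package and a completed run; fit and myScore are assumed names):

```r
# myScore <- scoreparameters("bge", Boston)  # assumed score object
# fit <- partitionMCMC(myScore, scoreout = TRUE)
#
# getDAG(fit)        # adjacency matrix of the maximum scoring DAG
# getTrace(fit)      # log-score trace of the chain
# getMCMCscore(fit)  # score attained by the chain
# getSpace(fit)      # search space used by the sampler
```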


Author(s)

Polina Suter, Jack Kuipers; the code is partly derived from the partition MCMC implementation from Kuipers J, Moffa G (2017) <doi:10.1080/01621459.2015.1133426>


References

Kuipers J and Moffa G (2017). Partition MCMC for inference on acyclic digraphs. Journal of the American Statistical Association 112, 282-299.

Geiger D and Heckerman D (2002). Parameter priors for directed acyclic graphical models and the characterization of several probability distributions. The Annals of Statistics 30, 1412-1440.

Heckerman D and Geiger D (1995). Learning Bayesian networks: A unification for discrete and Gaussian domains. In Eleventh Conference on Uncertainty in Artificial Intelligence, pages 274-284.

Kalisch M, Maechler M, Colombo D, Maathuis M and Buehlmann P (2012). Causal inference using graphical models with the R package pcalg. Journal of Statistical Software 47, 1-26.

Kuipers J, Moffa G and Heckerman D (2014). Addendum on the scoring of Gaussian directed acyclic graphical models. The Annals of Statistics 42, 1689-1691.


Examples

## Not run: 
myScore <- scoreparameters("bge", Boston)
partfit <- partitionMCMC(myScore)
plot(partfit)

## End(Not run)

[Package BiDAG version 2.0.4 Index]