A Markov chain Monte Carlo algorithm for approximating exact conditional probabilities

Brian S. Caffo, James G. Booth

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


Conditional inference eliminates nuisance parameters by conditioning on their sufficient statistics. For contingency tables, conditional inference entails enumerating all tables with the same sufficient statistics as the observed data. For moderately sized tables and/or complex models, the computing time to enumerate these tables is often prohibitive. Monte Carlo approximations offer a viable alternative, provided it is possible to obtain samples from the correct conditional distribution. This article presents an MCMC extension of the importance sampling algorithm, using a rounded normal candidate to update randomly chosen cells while leaving the remainder of the table fixed. This local approximation can greatly increase the efficiency of the rounded normal candidate. By choosing the number of cells to be updated at random, a balance is struck between dependency in the Markov chain and the accuracy of the candidate.
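The abstract's idea can be illustrated on the simplest case: a 2×2 table with both margins fixed, where one free cell determines the whole table and its exact conditional distribution is hypergeometric. The sketch below is an assumption-laden simplification of the paper's method, not its implementation: it runs a Metropolis–Hastings chain whose candidate is a normal draw centred at the current cell count and rounded to the nearest integer (this discretised proposal is symmetric, so the proposal density cancels in the acceptance ratio). All function names and parameter values are illustrative.

```python
import math
import random

def log_target(x, r1, c1, n):
    # Hypergeometric kernel (up to a constant): the exact conditional
    # distribution of the (1,1) cell of a 2x2 table given both margins.
    return -(math.lgamma(x + 1) + math.lgamma(r1 - x + 1)
             + math.lgamma(c1 - x + 1) + math.lgamma(n - r1 - c1 + x + 1))

def rounded_normal_mh(r1, c1, n, n_iter=20000, sigma=2.0, seed=1):
    """Metropolis-Hastings with a rounded-normal candidate for the free
    cell of a 2x2 table; the other three cells follow from the margins."""
    rng = random.Random(seed)
    lo, hi = max(0, r1 + c1 - n), min(r1, c1)  # support of the free cell
    x = lo  # any table in the reference set is a valid starting state
    samples = []
    for _ in range(n_iter):
        # Candidate: round a normal centred at the current count.
        # Rounding a N(x, sigma) draw gives a symmetric proposal,
        # so the Hastings correction cancels.
        y = round(rng.gauss(x, sigma))
        if lo <= y <= hi:
            log_ratio = log_target(y, r1, c1, n) - log_target(x, r1, c1, n)
            if rng.random() < math.exp(min(0.0, log_ratio)):
                x = y
        samples.append(x)
    return samples
```

For margins r1 = 7, c1 = 8, n = 20, the exact conditional mean of the free cell is r1*c1/n = 2.8, and the chain's long-run average should settle near that value; in larger tables the same rounded-normal update would be applied to a randomly chosen subset of cells, as the abstract describes.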

Original language: English (US)
Pages (from-to): 730-745
Number of pages: 16
Journal: Journal of Computational and Graphical Statistics
Issue number: 4
State: Published - Dec 2001


Keywords

  • Batch means
  • Log-linear model
  • Metropolis–Hastings
  • Reference set
  • Regeneration

ASJC Scopus subject areas

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty


