Abstract
Conditional inference eliminates nuisance parameters by conditioning on their sufficient statistics. For contingency tables, conditional inference entails enumerating all tables with the same sufficient statistics as the observed data. For moderately sized tables and/or complex models, the computing time to enumerate these tables is often prohibitive. Monte Carlo approximations offer a viable alternative, provided it is possible to obtain samples from the correct conditional distribution. This article presents an MCMC extension of the importance sampling algorithm, using a rounded normal candidate to update randomly chosen cells while leaving the remainder of the table fixed. This local approximation can greatly increase the efficiency of the rounded normal candidate. By choosing the number of cells to be updated at random, a balance is struck between dependency in the Markov chain and accuracy of the candidate.
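To illustrate the idea of updating a few cells while holding the rest of the table (and hence the sufficient statistics) fixed, the following is a minimal sketch for a two-way table under the independence log-linear model, where the sufficient statistics are the row and column margins. It uses a simple ±1 "basic move" on a randomly chosen 2×2 subtable as the candidate, not the rounded normal candidate of the article, and targets the conditional (multiple hypergeometric) distribution, which is proportional to 1/∏ n_ij!. All function names here are hypothetical.

```python
import math
import random

def mh_step(table, rng):
    """One Metropolis-Hastings update that preserves all row/column margins.

    Picks two rows and two columns at random and proposes adding +1/-1
    along the diagonal of the 2x2 subtable (and -1/+1 off-diagonal),
    which leaves every margin, and hence the sufficient statistics of
    the independence model, unchanged. Simplified sketch: the article's
    rounded normal candidate is replaced by a symmetric +/-1 move.
    """
    r, c = len(table), len(table[0])
    i1, i2 = rng.sample(range(r), 2)
    j1, j2 = rng.sample(range(c), 2)
    eps = rng.choice([-1, 1])
    # Proposed values for the four affected cells.
    prop = {
        (i1, j1): table[i1][j1] + eps,
        (i2, j2): table[i2][j2] + eps,
        (i1, j2): table[i1][j2] - eps,
        (i2, j1): table[i2][j1] - eps,
    }
    if min(prop.values()) < 0:
        return table  # reject: counts must stay nonnegative
    # Target is proportional to 1/prod(n_ij!), so the log acceptance
    # ratio over the four changed cells is sum log(old!) - log(new!).
    log_alpha = sum(
        math.lgamma(table[i][j] + 1) - math.lgamma(new + 1)
        for (i, j), new in prop.items()
    )
    if math.log(rng.random()) < log_alpha:
        for (i, j), new in prop.items():
            table[i][j] = new
    return table

# Usage: run the chain and confirm the margins never change.
rng = random.Random(0)
table = [[3, 1, 2], [2, 4, 1], [1, 2, 5]]
row_margins = [sum(row) for row in table]
col_margins = [sum(col) for col in zip(*table)]
for _ in range(1000):
    mh_step(table, rng)
```

Updating only a small subtable per step keeps each proposal cheap and keeps the sufficient statistics fixed by construction; the trade-off, as the abstract notes, is between per-step dependence in the chain and the quality of the candidate.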
Original language | English (US)
--- | ---
Pages (from-to) | 730-745
Number of pages | 16
Journal | Journal of Computational and Graphical Statistics
Volume | 10
Issue number | 4
State | Published - Dec 2001
Keywords
- Batch means
- Log-linear model
- Metropolis–Hastings
- Reference set
- Regeneration
ASJC Scopus subject areas
- Statistics and Probability
- Discrete Mathematics and Combinatorics
- Statistics, Probability and Uncertainty