Generalized Bayes Quantification Learning under Dataset Shift

Research output: Contribution to journal › Article › peer-review



Quantification learning is the task of prevalence estimation for a test population using predictions from a classifier trained on a different population. Quantification methods assume that the sensitivities and specificities of the classifier are either perfect or transportable from the training to the test population. These assumptions are inappropriate in the presence of dataset shift, when the misclassification rates in the training population are not representative of those in the test population. Quantification under dataset shift has been addressed only for single-class (categorical) predictions, and only assuming perfect knowledge of the true labels on a small subset of the test population. We propose generalized Bayes quantification learning (GBQL), which uses the entire compositional predictions from probabilistic classifiers and allows for uncertainty in the true class labels of the limited labeled test data. Instead of positing a full model, we use a model-free Bayesian estimating-equation approach to compositional data based on Kullback–Leibler loss functions that rely only on a first-moment assumption. The idea is useful for Bayesian compositional data analysis in general, as it is robust to different generating mechanisms for compositional data and allows 0's and 1's in the compositional outputs, thereby including categorical outputs as a special case. We show how our method yields existing quantification approaches as special cases. We also discuss an extension to an ensemble GBQL that uses predictions from multiple classifiers, yielding inference robust to the inclusion of a poor classifier. We outline a fast and efficient Gibbs sampler using a rounding and coarsening approximation to the loss functions. We establish posterior consistency, asymptotic normality, and valid coverage of interval estimates from GBQL, which to our knowledge are the first theoretical results for a quantification approach in the presence of local labeled data, and we also establish a finite-sample posterior concentration rate. Empirical performance of GBQL is demonstrated through simulations and analysis of real data with evident dataset shift. Supplementary materials for this article are available online.
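As context for the baseline assumption the abstract highlights (that sensitivities and specificities transport from training to test data), a minimal sketch of the classic adjusted classify-and-count correction in the binary case. This is not the GBQL method itself; the function name and the numbers in the example are illustrative only:

```python
import numpy as np

def adjusted_count(pred_labels, sensitivity, specificity):
    """Classic adjusted classify-and-count for binary labels.

    Corrects the raw predicted prevalence using the classifier's
    sensitivity and specificity -- precisely the quantities that are
    assumed transportable from the training to the test population,
    and that dataset shift invalidates.
    """
    raw = float(np.mean(pred_labels))  # fraction predicted positive
    # Invert E[raw] = sens * pi + (1 - spec) * (1 - pi) for the
    # true prevalence pi, then clip to the valid [0, 1] range.
    pi = (raw - (1.0 - specificity)) / (sensitivity + specificity - 1.0)
    return float(np.clip(pi, 0.0, 1.0))

# Example: classifier flags 40% of the test set as positive,
# with sensitivity 0.9 and specificity 0.8 (illustrative values).
est = adjusted_count(np.array([1] * 40 + [0] * 60), 0.9, 0.8)  # ≈ 0.286
```

When the assumed sensitivity and specificity are wrong for the test population, this correction is biased; GBQL instead learns the misclassification rates from a small labeled subset of the test data.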

Original language: English (US)
Pages (from-to): 2163-2181
Number of pages: 19
Journal: Journal of the American Statistical Association
Issue number: 540
State: Published - 2022


Keywords

  • Bayesian
  • Compositional data
  • Estimating equations
  • Machine learning
  • Quantification

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


