Multi-class confidence weighted algorithms

Koby Crammer, Mark Dredze, Alex Kulesza

Research output: Contribution to conference › Paper › peer-review

Abstract

The recently introduced online confidence-weighted (CW) learning algorithm for binary classification performs well on many binary NLP tasks. However, for multi-class problems CW learning updates and inference cannot be computed analytically or solved as convex optimization problems as they are in the binary case. We derive learning algorithms for the multi-class CW setting and provide extensive evaluation using nine NLP datasets, including three derived from the recently released New York Times corpus. Our best algorithm outperforms state-of-the-art online and batch methods on eight of the nine tasks. We also show that the confidence information maintained during learning yields useful probabilistic information at test time.
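The abstract describes the approach only at a high level. As a rough illustration of the confidence-weighted flavor it refers to (a Gaussian distribution over weight vectors whose per-feature variances shrink as evidence accumulates, with prediction by the mean weights), the sketch below keeps per-class mean and diagonal-variance vectors and applies a simplified AROW-style step against the highest-scoring wrong class. The class name, the regularization constant `r`, and the update rule itself are illustrative assumptions; they are not the closed-form multi-class CW updates derived in the paper.

```python
import numpy as np

class DiagonalCWSketch:
    """Illustrative multi-class learner with per-class Gaussian weights
    (diagonal covariance). The update is a simplified AROW-style step,
    not the exact confidence-weighted update derived in the paper."""

    def __init__(self, n_features, n_classes, r=1.0):
        self.mu = np.zeros((n_classes, n_features))    # mean weight vector per class
        self.sigma = np.ones((n_classes, n_features))  # diagonal variances (confidence)
        self.r = r                                     # assumed regularization constant

    def scores(self, x):
        return self.mu @ x                             # mean score for each class

    def predict(self, x):
        return int(np.argmax(self.scores(x)))

    def update(self, x, y):
        s = self.scores(x)
        s_wrong = s.copy()
        s_wrong[y] = -np.inf
        z = int(np.argmax(s_wrong))                    # highest-scoring wrong class
        margin = s[y] - s[z]
        v = np.dot(self.sigma[y] + self.sigma[z], x * x)  # variance of the margin (diagonal approx.)
        loss = max(0.0, 1.0 - margin)
        if loss > 0.0:
            alpha = loss / (v + self.r)                # step size scaled down by confidence
            self.mu[y] += alpha * self.sigma[y] * x
            self.mu[z] -= alpha * self.sigma[z] * x
            # shrink variances along x so confident directions change less later
            for k in (y, z):
                self.sigma[k] -= (self.sigma[k] ** 2) * (x * x) / (self.r + v)
                self.sigma[k] = np.maximum(self.sigma[k], 1e-8)

# Toy usage on synthetic data with one class-indicative feature per class.
rng = np.random.default_rng(0)
clf = DiagonalCWSketch(n_features=5, n_classes=3)
for _ in range(200):
    y = int(rng.integers(3))
    x = rng.normal(size=5)
    x[y] += 2.0
    clf.update(x, y)
```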

Original language: English (US)
Pages: 496-504
Number of pages: 9
State: Published - 2009
Externally published: Yes
Event: 2009 Conference on Empirical Methods in Natural Language Processing, EMNLP 2009, Held in Conjunction with ACL-IJCNLP 2009 - Singapore, Singapore
Duration: Aug 6, 2009 - Aug 7, 2009

Conference

Conference: 2009 Conference on Empirical Methods in Natural Language Processing, EMNLP 2009, Held in Conjunction with ACL-IJCNLP 2009
Country/Territory: Singapore
City: Singapore
Period: 8/6/09 - 8/7/09

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Information Systems

