Learning nested concept classes with limited storage

David Heath, Simon Kasif, Rao Kosaraju, Steven Salzberg, Gregory Sullivan

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Resource limitations have played an important role in the development and verification of theories about intelligent behaviour. This paper is a step towards answering the question of what effect limited memory has on the ability of intelligent machines to learn from data. Our analysis is applicable to many existing learning methods, especially those that incrementally construct a generalization by making repeated passes through a set of training data (e.g. some implementations of perceptrons, neural nets, and decision trees). Most of these methods do not store the entire training set, since they allow themselves only limited storage, a restriction that forces them to produce a compressed representation. The question we address is: how much (if any) additional processing time is required for methods with limited storage? We measure processing time for learning algorithms by the number of passes through a data set necessary to obtain a correct generalization. Researchers have observed that for some learning methods (e.g. neural nets) the number of passes through a data set gets smaller as the size of the network is increased; however, no analytical study that explains this behaviour has been published. We examine limited storage algorithms for a particular concept class, nested hyperrectangles. We prove bounds that illustrate the fundamental trade-off between storage requirements and the processing time required to learn an optimal structure. It turns out that our lower bounds apply to other algorithms and concept classes as well (e.g. decision trees). We discuss applications of our analysis to learning and to problems in human perception. We also briefly discuss parallel learning algorithms.
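The storage-versus-passes trade-off the abstract describes can be made concrete with a toy analogue (this is an illustrative sketch, not the paper's algorithm for nested hyperrectangles): a bounded-memory learner that must extract the k largest values from a data set, holding only `storage` candidates at a time, needs roughly ceil(k / storage) passes — more working memory means fewer passes, and vice versa.

```python
import math

def k_largest_limited_storage(data, k, storage):
    """Return the k largest values using a working buffer of at most
    `storage` items, counting how many full passes over `data` are needed.
    Illustrates the storage/passes trade-off; not the paper's algorithm."""
    found = []             # values extracted so far (the output)
    threshold = math.inf   # only consider values strictly below this
    passes = 0
    while len(found) < k:
        passes += 1
        buffer = []                    # bounded working memory
        for x in data:                 # one full pass over the data set
            if x < threshold:
                buffer.append(x)
                buffer.sort(reverse=True)
                del buffer[storage:]   # keep only the top `storage` items
        take = min(storage, k - len(found))
        found.extend(buffer[:take])
        threshold = found[-1]          # next pass looks below this value
    return found, passes

data = [7, 3, 11, 5, 2, 9, 8, 1, 10, 4]
top, passes = k_largest_limited_storage(data, k=4, storage=2)
# storage 2 -> ceil(4/2) = 2 passes; storage 4 would need only 1 pass
```

With `storage=4` the same call finishes in a single pass, and with `storage=1` it needs four — the inverse relationship between memory and number of passes that the paper's bounds formalize for nested concept classes.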

Original language: English (US)
Pages (from-to): 129-147
Number of pages: 19
Journal: Journal of Experimental and Theoretical Artificial Intelligence
Volume: 8
Issue number: 2
DOIs
State: Published - Apr 1 1996

Keywords

  • Incremental learning
  • Learnability

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Artificial Intelligence
