Categorization as nonparametric Bayesian density estimation

Thomas L. Griffiths, Adam N. Sanborn, Kevin R. Canini and Daniel J. Navarro

In: The Probabilistic Mind

Published in print March 2008 | ISBN: 9780199216093
Published online March 2012 | e-ISBN: 9780191695971


The authors apply state-of-the-art techniques from machine learning and statistics to reconceptualize the problem of unsupervised category learning and to relate it to previous psychologically motivated models, especially Anderson's rational analysis of categorization. The resulting analysis provides a deeper understanding of the motivations underlying the classic models of category representation, based on prototypes or exemplars, and sheds new light on the empirical data. Exemplar models assume that a category is represented by a set of stored exemplars; categorizing a new stimulus involves comparing it to the stored exemplars of each category. Prototype models assume that each category is associated with a single prototype and that categorization involves comparing new stimuli to these prototypes. These two approaches correspond to different strategies for density estimation in statistics: nonparametric and parametric density estimation, respectively.
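The correspondence described above can be sketched in a few lines of code. This is a minimal illustration, not the chapter's model: it assumes hypothetical one-dimensional stimuli, a Gaussian similarity kernel for the exemplar strategy, and a single fitted Gaussian for the prototype strategy, with categorization by Bayes' rule under equal priors.

```python
import math

# Hypothetical 1-D stimuli: stored exemplars for two categories.
cat_a = [1.0, 1.2, 0.8, 1.1]
cat_b = [3.0, 2.8, 3.2, 3.1]

def exemplar_density(x, exemplars, bandwidth=0.5):
    """Nonparametric (kernel) estimate: average Gaussian similarity
    of the new stimulus x to every stored exemplar."""
    total = sum(math.exp(-((x - e) ** 2) / (2 * bandwidth ** 2))
                for e in exemplars)
    return total / (len(exemplars) * bandwidth * math.sqrt(2 * math.pi))

def prototype_density(x, exemplars):
    """Parametric estimate: a single Gaussian 'prototype' summarizing
    the category by its mean and variance."""
    mu = sum(exemplars) / len(exemplars)
    var = sum((e - mu) ** 2 for e in exemplars) / len(exemplars)
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def categorize(x, density):
    """Posterior probability that x belongs to category A,
    assuming equal prior probability for the two categories."""
    pa, pb = density(x, cat_a), density(x, cat_b)
    return pa / (pa + pb)

# A stimulus near category A's exemplars is assigned to A under
# both strategies; the strategies differ in how the category
# density is represented, not in the Bayesian decision rule.
categorize(0.9, exemplar_density)   # exemplar / nonparametric
categorize(0.9, prototype_density)  # prototype / parametric
```

The two strategies diverge for irregular category structures: the kernel estimate can track multimodal exemplar distributions, while the single-Gaussian prototype cannot, which is the statistical root of the empirical contrasts between the model classes.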

Keywords: machine learning; statistics; categorization; prototype models; exemplar models; density estimation; category learning

Chapter.  11148 words.  Illustrated.

Subjects: Cognitive Psychology
