Can we do without distributed models? Not in artificial grammar learning

Behavioral and Brain Sciences 23 (4):484-484 (2000)

Abstract

Page argues that localist models can be applied to a number of problems that are difficult for distributed models. However, it is easy to find examples where the opposite is true. This commentary illustrates the superiority of distributed models in the domain of artificial grammar learning, a paradigm widely used to investigate implicit learning.
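For context, artificial grammar learning experiments typically train participants on letter strings produced by a finite-state grammar and then test whether they can classify novel strings as grammatical or ungrammatical, without ever being told the rules. The Python sketch below illustrates the stimulus-generation side of such a study; the transition table is a hypothetical, Reber-style grammar chosen purely for illustration, not the grammar used in the work discussed here.

import random

# Hypothetical finite-state grammar, loosely modeled on the Reber-style
# grammars common in artificial grammar learning experiments. Each state
# maps to a list of (emitted letter, next state) transitions; None marks
# a legal termination. The specific transitions are illustrative only.
GRAMMAR = {
    0: [("T", 1), ("P", 2)],
    1: [("S", 1), ("X", 3)],
    2: [("T", 2), ("V", 4)],
    3: [("X", 2), ("S", None)],
    4: [("P", 3), ("V", None)],
}

def generate_string(max_len=12):
    """Walk the grammar from state 0, emitting one letter per transition,
    until a terminating transition is taken. Walks that exceed max_len
    without terminating are discarded and retried."""
    while True:
        state, letters = 0, []
        while state is not None and len(letters) < max_len:
            letter, state = random.choice(GRAMMAR[state])
            letters.append(letter)
        if state is None:  # reached a legal termination
            return "".join(letters)

if __name__ == "__main__":
    # Grammatical training strings; a test phase would mix grammatical
    # strings with rule violations to probe implicit learning.
    for _ in range(5):
        print(generate_string())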

Similar books and articles

The many ways to distribute distributed representations. A. Mike Burton - 2000 - Behavioral and Brain Sciences 23 (4):472-473.
Sticking to the manifesto. Mike Page - 2000 - Behavioral and Brain Sciences 23 (4):496-505.
Localist but distributed representations. Stephen Grossberg - 2000 - Behavioral and Brain Sciences 23 (4):478-479.
A competitive manifesto. R. Hans Phaf & Gezinus Wolters - 2000 - Behavioral and Brain Sciences 23 (4):487-488.
Neural models of development and learning. Stephen Grossberg - 1997 - Behavioral and Brain Sciences 20 (4):566-566.
Connectionist modelling in psychology: A localist manifesto. Mike Page - 2000 - Behavioral and Brain Sciences 23 (4):443-467.

Citations of this work

Learning non-local dependencies. Gustav Kuhn & Zoltán Dienes - 2008 - Cognition 106 (1):184-206.
