
CBSU bibliography search


Sequence detector networks and associative learning of grammatical categories
In: Wermter, S., Palm, G., Elshaw, M. (Eds.), Biomimetic Neural Learning for Intelligent Robots, pp. 31-53
A fundamental prerequisite for language is the ability to distinguish word sequences that are grammatically well-formed from ungrammatical word strings and to generalise rules of syntactic serial order to new strings of constituents. In this work, we extend a neural model of syntactic brain mechanisms that is based on syntactic sequence detectors (SDs). Elementary SDs are neural units that specifically respond to a sequence of constituent words AB, but not (or much less) to the reverse sequence BA. We discuss limitations of the original version of the SD model (Pulvermüller, Theory in Biosciences, 2003) and suggest optimal model variants taking advantage of optimised neuronal response functions, non-linear interaction between inputs, and leaky integration of neuronal input accumulating over time. A biologically more realistic model variant including a network of several SDs is used to demonstrate that associative Hebb-like synaptic plasticity leads to learning of word sequences, formation of neural representations of grammatical categories, and linking of sequence detectors into neuronal assemblies that may provide a biological basis of syntactic rule knowledge. We propose that these syntactic neuronal assemblies (SNAs) underlie generalisation of syntactic regularities from already encountered strings to new grammatical word strings.
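The order-sensitive response described above can be illustrated with a minimal sketch: a unit whose leaky memory trace is charged by word A and multiplied (non-linear input interaction) with the input from word B, so that AB yields a strong output while BA yields none. This is a hypothetical toy illustration of the general mechanism, not the paper's actual model; the class name, decay constant, and input coding are all assumptions.

```python
class SequenceDetector:
    """Toy sequence detector (SD): responds to the order A-then-B,
    but not (or much less) to B-then-A.

    Input from word A charges a leaky integrator (memory trace);
    input from word B is multiplied with the current trace, so the
    output is large only when A arrived before B."""

    def __init__(self, decay=0.5):
        self.decay = decay   # leak factor per time step (assumed value)
        self.trace = 0.0     # leaky integral of past A input

    def step(self, a_input, b_input):
        # Non-linear interaction: B input is gated by the stored A trace.
        out = b_input * self.trace
        # Leaky integration: trace decays, then accumulates new A input.
        self.trace = self.decay * self.trace + a_input
        return out

def respond(word_seq):
    """Total SD output over a sequence of word labels."""
    sd = SequenceDetector()
    return sum(
        sd.step(a_input=1.0 if w == "A" else 0.0,
                b_input=1.0 if w == "B" else 0.0)
        for w in word_seq
    )

# A-then-B drives the detector; the reverse order does not.
print(respond(["A", "B"]))  # trace holds A when B arrives -> 1.0
print(respond(["B", "A"]))  # trace is empty when B arrives -> 0.0
```

In this sketch the asymmetry comes entirely from the multiplicative gating of B's input by A's decaying trace, which is one simple way to realise the combination of non-linear input interaction and leaky temporal integration mentioned in the abstract.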