Course: Connectionist Language Processing

Lectures: Tues 14-16 in Psycholinguistics Meeting Room (C7.1 top floor)
Tutorials: Thurs 14-16, room TBA
Begin: Tues 16.04.2019
Exam: Tues 16.07.2019 @ 14:00 (sharp)

Topic: This course will examine neurocomputational (or connectionist) models of human language processing. We will start from biological neurons and show how their processing behavior can be modeled mathematically. The resulting artificial neurons will then be wired together to form artificial neural networks, and we will discuss how such networks can be applied to build neurocomputational models of language learning and language processing. It will be shown that such models effectively all share the same computational principles, and that any differences in their behavior are driven by differences in the representations that they process and construct. Near the end of the course, we will use the accumulated knowledge to construct a psychologically plausible neurocomputational model of incremental (word-by-word) language comprehension that constructs a rich utterance representation beyond a simple syntactic derivation or semantic formula.
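To give a concrete taste of what "modeled mathematically" means here, the following is a minimal sketch of a single artificial neuron: a weighted sum of its inputs plus a bias, passed through a logistic (sigmoid) activation. The weights and inputs are illustrative, not taken from any course material:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias,
    squashed through a logistic (sigmoid) activation."""
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-net))

# Illustrative values: two inputs with hand-picked weights.
activation = neuron([1.0, 0.0], weights=[2.0, -1.0], bias=-1.0)
print(round(activation, 3))  # net input is 1.0, so sigmoid(1.0) ≈ 0.731
```

Networks are then built by feeding the activations of one layer of such units as inputs to the next.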

The Basics:
  • Modeling neural information processing (Connectionism)
  • Two-layer neural networks and their properties (The Perceptron)
  • Multi-layer neural networks: Towards internal representations (Multi-layer Perceptrons)
  • Neural information encoding: Localist versus Distributed schemes (Representations)

Models of Language:
  • Modeling the acquisition of the English past-tense and reading aloud
  • Processing sequences: Simple Recurrent Networks (SRNs)
  • Modeling the acquisition of hierarchical syntactic knowledge
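The Simple Recurrent Networks listed above can be sketched in a few lines: the hidden layer receives both the current input and a copy of its own previous state (the "context" layer), which is what lets the network process sequences word by word. Layer sizes and weights below are illustrative, not drawn from the course materials:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 3, 4                 # illustrative layer sizes
W_ih = rng.normal(0, 0.5, (n_hid, n_in))     # input  -> hidden
W_hh = rng.normal(0, 0.5, (n_hid, n_hid))    # context -> hidden (recurrence)
W_ho = rng.normal(0, 0.5, (n_out, n_hid))    # hidden -> output

def srn_forward(sequence):
    """Process a sequence of input vectors one at a time,
    carrying the hidden state forward as the context layer."""
    h = np.zeros(n_hid)                      # context starts empty
    outputs = []
    for x in sequence:
        h = np.tanh(W_ih @ x + W_hh @ h)     # hidden sees input + context
        y = W_ho @ h                         # output (e.g. next-word scores)
        outputs.append(y)
    return outputs

# A toy three-"word" sequence of one-hot input vectors.
seq = [np.eye(n_in)[i] for i in (0, 2, 1)]
outs = srn_forward(seq)
print(len(outs), outs[0].shape)
```

In Elman's (1990) prediction task, the output at each step is trained to anticipate the next input, which is how such networks come to encode sequential (and, with suitable training, hierarchical) structure.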

Advanced topics:
  • Richer representations for sentence understanding (beyond syntactic derivations and semantic formulas)
  • Neurobiological plausibility of connectionism

On-line Course Materials:
  • Lecture 1: Intro to connectionism and the brain [pdf]
  • Lecture 2: A primer on linear algebra [pdf]
  • Lecture 3: Learning in single-layer networks [pdf]
  • Lecture 4: Learning in multi-layer networks [pdf]
  • Lecture 5: Acquisition of the English Past Tense [pdf]
  • Lecture 6: Pattern association networks, Scientific American article (for tutorial) [pdf]
  • Lecture 7: Simple Recurrent Networks 1 [pdf]
  • Lecture 8: Simple Recurrent Networks 2 [pdf]
  • Tutorial Talk: Modeling Language Production
  • Lecture 9: Modeling the Electrophysiology of Language I [pdf]
  • Lecture 10: Situation Modeling using Microworlds [pdf]
  • Lecture 11: Modeling Event-Driven Surprisal in Language Comprehension [pdf]
  • Lecture 12: Course summary, Projects

Requirements: While the course does not involve programming, students should be comfortable with basic concepts of linear algebra.
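The "basic concepts of linear algebra" needed here amount to vectors, dot products, and matrix-vector multiplication, since a layer of a neural network is just a matrix-vector product. A quick self-check with illustrative numbers:

```python
# A layer of a network is just a matrix-vector product:
# each row of the weight matrix holds one unit's incoming weights.
W = [[1.0, 2.0],
     [0.0, -1.0],
     [3.0, 1.0]]
x = [2.0, 1.0]

# Each output element is the dot product of one row of W with x.
y = [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W]
print(y)  # [4.0, -1.0, 7.0]
```

If the computation above is transparent to you, the mathematical prerequisites should pose no problem; Lecture 2 reviews this material.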

Readings:
P. McLeod, K. Plunkett and E. T. Rolls (1998). Introduction to Connectionist Modelling of Cognitive Processes. Oxford University Press. Chapters: 1-5, 7, 9.
K. Plunkett and J. Elman (1997). Exercises in rethinking innateness: A Handbook for Connectionist Simulations. MIT Press. Chapters: 1-8, 11, 12.
J. Elman (1990). Finding Structure in Time. Cognitive Science, 14: 179-211.
J. Elman (1991). Distributed representations, simple recurrent networks, and grammatical structure. Machine Learning, 7: 195-225.

Additional readings:

N. Chater and M. Christiansen (1999). Connectionism and natural language processing. Chapter 8 of Garrod and Pickering (eds.): Language Processing. Psychology Press.
M. Christiansen and N. Chater (1999). Connectionist Natural Language Processing: The State of the Art. Cognitive Science, 23(4): 417-437.
J. Elman et al. (1996). Chapter 2: Why Connectionism? In: Rethinking Innateness. MIT Press.
J. Elman (1993). Learning and development in neural networks: The importance of starting small. Cognition, 48: 71-99.
M. Seidenberg and M. MacDonald (1999). A Probabilistic Constraints Approach to Language Acquisition and Processing. Cognitive Science, 23(4): 569-588.
M. Steedman (1999). Connectionist Sentence Processing in Perspective. Cognitive Science, 23(4): 615-634.