Computational Psycholinguistics

Lecturer: Matthew Crocker
Format: Lectures with Tutorials (4 SWS, 6 LP)
Programme: MSc in LS&T, Diplom CL, BA in CL

Times: Mon 14-16 (Lecture), Wed 14-16 (Tutorial)
Location: Room TBA
Language of Instruction: English
Course begins: Monday, 24 April 2017

Course Contents

This course will discuss current computational models of human language processing. We will consider both how computational linguistics can inform the development of psycholinguistic theories, and how computational models can account for and explain experimentally observed human language processing behaviour. The course will begin with an introduction to psycholinguistic research, summarising the key observations about human language understanding and presenting the central theoretical debates, including issues such as modularity, incrementality, and the psychological status of linguistic principles and representations. We will then consider a number of computational models of lexical and sentence-level processing and of language acquisition. The models covered exploit symbolic, probabilistic, connectionist, and hybrid computational mechanisms.

Schedule (Monday: lecture, Wednesday: tutorial)

Week 1
  Monday:    Introduction: human performance, competence-performance, modeling. [Lecture1]
  Wednesday: Tutorial 0: Introduction to Prolog.

Week 2
  Monday:    Parsing and Psychological Reality: incrementality, memory load, and disambiguation; implementing top-down, shift-reduce, and left-corner models. [Lecture2] (A minimal Prolog parser sketch follows the schedule.)
  Wednesday: Tutorial 1: Parsing in Prolog.

Week 3
  Monday:    Syntactic Processing 2: grammatical models, long-distance dependencies. [Lecture3]
  Wednesday: Tutorial 2: Discussion of Tutorial 1.

Week 4
  Monday:    Syntactic Processing 3: reanalysis and monotonic parsing. [Lecture4]
  Wednesday: Tutorial 3: Incremental parsing in Prolog. [Trees PDF]

Week 5
  Monday:    Probabilistic Models 1: rational approaches to language processing, category disambiguation. [Lecture5]
  Wednesday: Tutorial 4: Statistical lexical category disambiguation.

Week 6
  Monday:    Probabilistic Models 2: probabilistic models of category disambiguation, continued. [Lecture6]
  Wednesday: Tutorial 5: Statistical lexical category disambiguation, continued.

Week 7
  Monday:    Probabilistic Parsing 1: Jurafsky, Brants and Crocker. [Lecture7]
  Wednesday: Tutorial 6: Statistical lexical category disambiguation, final.

Week 8
  Monday:    Probabilistic Parsing 2: lecture postponed.
  Wednesday: No tutorial this week.

Christmas Break

Week 9
  Monday:    Constraint-based Models 1: McRae et al. [Lecture8]
  Wednesday: Tutorial 7: McRae model.

Week 10
  Monday:    Constraint-based Models 2: Green & Mitchell. [Lecture9]
  Wednesday: Tutorial 8: Green & Mitchell.

Week 11
  Monday:    Probabilistic Parsing 2: Crocker and Brants, Informativity. [Lecture10]
  Wednesday: Tutorial 9: Probabilistic parsing with the Roark parser.

Week 12
  Monday:    Rational analysis: Surprisal and Prediction Theory. [Lecture11]
  Wednesday: Tutorial 10: Surprisal in the Roark parser.

Week 13
  Monday:    Course review. [Lecture12]
  Wednesday: Tutorial wrap-up.

Week 14
  Office hour: Tues, Feb 9 @ 15:00
  Exam:        Wed, Feb 10 @ 14:00, Seminar Room (not 2.11!)
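
The Week 2 lecture and Tutorial 1 cover implementing parsers in Prolog. To give a flavour of the style of exercise, here is a minimal sketch of an incremental top-down recogniser over a toy grammar. It is not the tutorial code: the grammar, lexicon, predicate names, and example sentence are invented for illustration.

  % Toy grammar: rule(LHS, RHS) and lexical entries word(Category, Word).
  rule(s,  [np, vp]).
  rule(np, [det, n]).
  rule(vp, [v, np]).

  word(det, the).
  word(n,   dog).
  word(n,   cat).
  word(v,   chased).

  % parse(+Stack, +Words): consume the input strictly left to right
  % while expanding predicted categories top-down.
  parse([], []).
  parse([Cat|Stack], [Word|Words]) :-   % scan: match a predicted preterminal
      word(Cat, Word),
      parse(Stack, Words).
  parse([Cat|Stack], Words) :-          % predict: expand a category by a rule
      rule(Cat, RHS),
      append(RHS, Stack, NewStack),
      parse(NewStack, Words).

  % Example query:
  % ?- parse([s], [the, dog, chased, the, cat]).
  % true.

Because the recogniser moves through the word list one word at a time, it can be probed for the incremental properties (prediction, memory load) discussed in the lecture; shift-reduce and left-corner variants differ chiefly in how the stack is manipulated.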


Tutorials
Files for the tutorials will appear here as the course progresses.

Software
The course will use several systems for experimenting with computational models of human language processing.
  • Prolog implementations of incremental parsers. You can get SWI-Prolog [here] and find online tutorials [here].
  • Probabilistic models of lexical and syntactic processing (see the surprisal sketch at the end of this section).
  • Tlearn, for simple connectionist models.

All systems are freely available for Mac OS, Linux, and Windows.
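
The probabilistic strand of the course (Tutorials 4-6 and 10) works with statistical disambiguation and with surprisal, the negative log probability of a word given its preceding context. The tutorials compute surprisal from the prefix probabilities of the Roark parser; purely as an illustration of the quantity itself, the sketch below computes word-by-word surprisal under an invented bigram model (the predicates and probabilities are hypothetical).

  % Hypothetical bigram model: p(Previous, Word, Prob).
  p(start, the,    0.6).
  p(the,   dog,    0.3).
  p(the,   cat,    0.2).
  p(dog,   barked, 0.4).
  p(cat,   barked, 0.1).

  % surprisal(+Prev, +Word, -S): S = -log2 P(Word | Prev), in bits.
  surprisal(Prev, Word, S) :-
      p(Prev, Word, Prob),
      S is -(log(Prob) / log(2)).

  % sentence_surprisals(+Words, -Ss): surprisal of each word, with the
  % first word conditioned on the dummy context 'start'.
  sentence_surprisals(Words, Ss) :-
      sentence_surprisals(start, Words, Ss).

  sentence_surprisals(_, [], []).
  sentence_surprisals(Prev, [W|Ws], [S|Ss]) :-
      surprisal(Prev, W, S),
      sentence_surprisals(W, Ws, Ss).

  % Example query:
  % ?- sentence_surprisals([the, dog, barked], Ss).
  % Ss = [0.737, 1.737, 1.322] (approximately).

Higher surprisal values correspond to words that are harder to predict from their context, which is the link to reading-time data exploited in the surprisal literature.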

Course Readings


Additional Literature (not relevant for 2015-16).