Vera Demberg - Teaching


Summer Semester 2011

Seminar on Incremental Processing

Incremental processing, in its strictest interpretation, means that information is fully and eagerly integrated with previously processed material as soon as it becomes available, i.e. hypotheses about syntactic structure and semantic roles are formed as soon as each word is encountered. At the syntax level, strict incrementality would, for example, mean that each word is integrated into a syntactic structure immediately, i.e. the processor would not use a stack to hold seen words for later decisions. The opposite of incremental processing would be a machine that first reads in the whole sentence, then decides on part-of-speech tags, then builds up a syntactic structure bottom-up, and only starts assigning semantic roles once the syntactic structure is complete.
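The contrast can be sketched in a toy Python example (an illustration of the idea only, not any system discussed in the seminar; the function names are made up for this sketch):

```python
def incremental_process(words):
    """Strictly incremental: a (toy) hypothesis is available after
    every word, since each word is integrated as soon as it is seen."""
    partial = []
    hypotheses = []
    for word in words:
        partial.append(word)               # integrate the word immediately
        hypotheses.append(tuple(partial))  # a hypothesis exists right now
    return hypotheses

def batch_process(words):
    """Non-incremental: nothing is produced until the whole
    sentence has been read; only one final analysis exists."""
    return [tuple(words)]

sentence = "the horse raced past the barn".split()
inc = incremental_process(sentence)  # six partial hypotheses, one per prefix
bat = batch_process(sentence)        # exactly one analysis, at the end
```

The point of the sketch is simply that the incremental processor exposes an interpretable state after every prefix of the input, whereas the batch processor is silent until the sentence is complete.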

This course will look at incremental processing from two different viewpoints: 1) Is there evidence (or counter-evidence) for strict incremental processing in human sentence processing? 2) Incrementality from an NLP perspective: how can incremental processing be realized, what are the advantages and disadvantages of processing incrementally, and which applications can profit from incremental processing?

The new time slot of the seminar is 12:30 (s.t.) to 14:00.

New: We're going to have an additional session on Saturday, May 28th, 2pm-6pm.

Also, I have confirmed that you can choose to do an oral exam for this course. Oral exams will take place during the last week of June / first week of July, or during the first two weeks of August.

Download a copy of the anonymous peer review form.

Introduction: The strict competence hypothesis


Psycholinguistic Experiments on Incrementality

Topic PSY1: Evidence for/against strict incrementality (meaning that each word in a sentence is fully integrated as soon as it is encountered; choose 2)
Topic PSY2: Incrementality and Prediction (choose 2)
Topic PSY3: Local Coherence Effects
Topic PSY4: Incremental Production?
Topic PSY5: Arguments for incrementality from a linguistic viewpoint (strong linguistics background required)

Incrementality in NLP

Topic NLP1: Incremental Parsing with a PCFG
Topic NLP2: Incremental Parsing with Dependency Grammars
Topic NLP3: Incremental Parsing with Tree-Adjoining Grammars (choose 3a or 3b)
Topic NLP4: Incremental Semantic Parsing
Topic NLP5: Incremental Parsing with Cascaded / Hierarchical HMMs

Incremental processing in NLP Applications

Topic APP1: Dialogue systems / interactions with agents
Topic APP2: Machine translation (familiarity with CCG advantageous for this topic)
Topic APP3: Trade-off between incrementality (speed) and accuracy
Topic APP4: Speech Recognition
Final Calendar:
Date Topic
14.04. Introduction to the course topic; Organization
21.04. Historic Background (Vera Demberg): Incremental Interpretation and the Strict Competence Hypothesis
28.04. APP1 (Fai Greeve)
05.05. PSY1 (Christian Meyer)
12.05. PSY2 (Melanie Reiplinger)
19.05. PSY3 (Fatemeh T. Asr)
26.05. NLP1 (Carolyn Ladda)
28.05. (Saturday, 2pm-6pm) PSY4 (Gerald Schoch), NLP3a (Miriam Käshammer), NLP3b (Auwn Muhammad)
02.06. NO MEETING (public holiday "Himmelfahrt")
09.06. NLP2 (Michael Fell)
16.06. NLP4 (Nikolina Koleva)
23.06. NO MEETING (public holiday "Fronleichnam")
30.06. APP3 (Benjamin Weitz) and How to write a seminar paper.
07.07. (slot moved to triple Saturday session earlier in the year.)
14.07. (slot moved to triple Saturday session earlier in the year.)