Main Parts of the Course

The most central fact about natural language is that it has meaning. We use sentences to say something about how the world is (or could be, or would be...), and we can do this because sentences have meaning. In semantics, we're concerned with the study of meaning. In formal semantics, we conduct this study in a formal manner. In computational semantics, we're additionally interested in using the results of our study when we implement programs that process natural language. We want to make such programs "meaning aware", that is, they should be able to deal with language and its meaning. The main part of this course is divided into two blocks, each concerned with one central question:
- Given a sentence, how do we get to its meaning? And how can we automate this process? That is, we're going to look at the task of semantic or ↗meaning construction.
- Given that we have the meaning of a sentence, what can we do with it? And again, how can we automate the process? This will lead us to the topic of ↗inference.
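To give a first taste of what meaning construction involves, here is a minimal toy sketch (our own illustration, not the formalism developed later in the course): word meanings are modelled as values and functions, and the meaning of a sentence is built compositionally by applying them to each other.

```python
# Toy model of the world: the set of individuals that walk.
WALKERS = {"john", "mary"}

# Lexical meanings (illustrative assumptions, not a fixed lexicon):
# a proper name denotes an individual, an intransitive verb a predicate.
john = "john"
walks = lambda x: x in WALKERS

# Meaning construction: apply the verb meaning to the subject meaning.
sentence_meaning = walks(john)
print(sentence_meaning)  # True
```

The point of the sketch is the compositional step: the sentence's truth value falls out of applying one word's meaning to another's, which is exactly the kind of process the course aims to automate.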
Having looked thoroughly at the phenomenon of sentence meaning, we widen our perspective: sentences don't occur in isolation. They're almost always combined into
↗discourses. Various interesting phenomena arise when one combines sentences in a discourse. The last chapter of this course gives an overview of
↗Discourse Representation Theory (DRT), a theory of meaning that allows us to deal with many of these new phenomena. As an example, we look at the phenomenon of anaphoric reference.
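To preview the idea, here is a toy sketch of how a Discourse Representation Structure (DRS) might grow as a discourse unfolds. The representation and the resolution strategy below are our own simplifications for illustration, not DRT's official notation or algorithm.

```python
# "A woman walks." introduces a discourse referent x with two conditions.
drs = {"referents": ["x"],
       "conditions": [("woman", "x"), ("walk", "x")]}

def resolve_pronoun(drs):
    # Toy resolution strategy (an assumption for this sketch):
    # pick the most recently introduced accessible referent.
    return drs["referents"][-1]

# "She smokes." -- the pronoun is anaphoric: it introduces no new
# referent but picks up one already present in the DRS.
antecedent = resolve_pronoun(drs)
drs["conditions"].append(("smoke", antecedent))

print(drs["conditions"])
# [('woman', 'x'), ('walk', 'x'), ('smoke', 'x')]
```

The key point the sketch illustrates is that the meaning of "She smokes" cannot be computed sentence by sentence in isolation: the pronoun's interpretation depends on the representation built up by the preceding discourse.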