Panelist: Andy Kehler

Title: Opportunities and Challenges for a Bayesian Approach to Language Processing

Computational approaches to language understanding are typically reactive: language input triggers a search for an interpretation. Human language interpretation, on the other hand, is proactive: comprehenders use context to create 'top-down' expectations about the ensuing message and integrate them with the 'bottom-up' evidence provided by the speaker's utterance. Bayes' Rule naturally captures these two contributors to interpretation via its prior and likelihood terms, respectively. I will discuss recent research (joint with Hannah Rohde) revealing not only that human pronoun interpretation behavior follows Bayesian principles, but also that much of the complexity affecting interpretation resides in contextual factors that condition the prior: that is, the part of the equation that is independent of the speaker's decision to use a pronoun. These results open up new possibilities for training systems, since large corpora of annotated pronouns should not be necessary to capture these complexities. At the same time, they also reveal the importance of overcoming significant challenges associated with predicting meaning that context implicitly conveys.
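The Bayesian decomposition described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' model: the posterior over candidate referents is computed as P(referent | pronoun) ∝ P(pronoun | referent) × P(referent), where the prior encodes top-down next-mention expectations from context and the likelihood encodes the speaker's choice to use a pronoun. The referent names and all probability values are made-up examples.

```python
def interpret_pronoun(prior, likelihood):
    """Posterior P(referent | pronoun) via Bayes' Rule.

    prior:      P(referent)            -- top-down expectation from context
    likelihood: P(pronoun | referent)  -- production bias (e.g., toward subjects)
    """
    unnormalized = {ref: prior[ref] * likelihood[ref] for ref in prior}
    total = sum(unnormalized.values())
    return {ref: p / total for ref, p in unnormalized.items()}

# Hypothetical context with two candidate referents; numbers are illustrative.
prior = {"Amanda": 0.7, "Brittany": 0.3}       # who is expected to be mentioned next
likelihood = {"Amanda": 0.9, "Brittany": 0.4}  # how likely a pronoun is for each referent

posterior = interpret_pronoun(prior, likelihood)
```

The point of the decomposition is visible in the sketch: the complexity the research locates in the prior lives entirely in the `prior` dictionary, which could in principle be estimated from contextual factors without pronoun-annotated corpora.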