Panelist: Yejin Choi
Title: Knowledge about the World

Knowledge is a necessary condition for intelligent communication. The NLP field has made significant progress in extracting encyclopedic knowledge from text (e.g., in what year and in which city Hillary Clinton was born), operationalized as information extraction over entities and relations. However, the type of knowledge that is crucial but lacking in today's AI systems is everyday functional knowledge (e.g., in order to put a trophy in a bag, the bag must be bigger than the trophy). It is this type of knowledge that seems essential for AI systems to navigate and reason about unstructured everyday situations. Yet relatively little research effort has been directed at this problem, perhaps because the challenges seem too hard and the problems ill-defined. In this talk, I will share some of our recent experiences in attempting to make progress on background knowledge acquisition in limited domains. Unlike early AI research efforts that aimed to attain formal and symbolic representations of knowledge, our approach is to let models learn knowledge directly from natural language text and images that are readily available at scale, and to design algorithms that can cope with noise as well as reporting bias.