Deep Semantics from Shallow Supervision

Date and Time
Wednesday, April 6, 2011 - 4:30pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
CS Department Colloquium Series
Speaker
Percy Liang (UC Berkeley)
Host
David Blei
What is the total population of the ten largest capitals in the US? Building a system to answer free-form questions such as this requires modeling the deep semantics of language. But to develop practical, scalable systems, we want to avoid the costly manual annotation of these deep semantic structures and instead learn from just surface-level supervision, e.g., question/answer pairs. To this end, we develop a new tree-based semantic representation which has favorable linguistic and computational properties, along with an algorithm that induces this hidden representation. Using our approach, we obtain significantly higher accuracy on the task of question answering compared to existing state-of-the-art methods, despite using less supervision.
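To make the idea of answering from a tree-structured semantic representation concrete, here is a minimal toy sketch in Python. It is not the speaker's actual formalism or data: the operator names, the tiny capitals table, and the population figures are all invented for illustration. It only shows how the example question from the abstract could be encoded as a small operator tree and evaluated against a database, which is the kind of hidden structure the talk proposes to induce from question/answer pairs alone.

    # Toy illustration only (hypothetical operators and made-up data),
    # not the representation or algorithm described in the talk.

    # Hypothetical mini-database mapping a few US state capitals to populations.
    CAPITALS = {
        "Phoenix": 1_445_632, "Austin": 790_390, "Indianapolis": 820_445,
        "Columbus": 787_033, "Boston": 617_594, "Nashville": 601_222,
        "Denver": 600_158, "Oklahoma City": 579_999, "Sacramento": 466_488,
        "Atlanta": 420_003, "Raleigh": 403_892,
    }

    def evaluate(tree):
        """Recursively evaluate an (operator, *children) semantic tree."""
        op, *args = tree
        if op == "capitals":
            return CAPITALS                      # base relation: city -> population
        if op == "largest":
            k, child = args
            table = evaluate(child)
            top = sorted(table, key=table.get, reverse=True)[:k]
            return {c: table[c] for c in top}    # keep the k most populous cities
        if op == "total_population":
            return sum(evaluate(args[0]).values())
        raise ValueError(f"unknown operator: {op}")

    # "What is the total population of the ten largest capitals in the US?"
    question_tree = ("total_population", ("largest", 10, ("capitals",)))
    print(evaluate(question_tree))

In this sketch the tree is fixed by hand; the point of the talk is learning to map a free-form question to such a structure when only the final answer, not the tree, is observed.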

Percy Liang obtained a B.S. (2004) and an M.S. (2005) from MIT and is now completing his Ph.D. at UC Berkeley with Michael Jordan and Dan Klein. The general theme of his research, which spans machine learning and natural language processing, is learning richly structured statistical models from limited supervision. He won a best student paper award at the International Conference on Machine Learning in 2008, has received NSF, GAANN, and NDSEG fellowships, and is a 2010 Siebel Scholar.
