Neural Grammar Induction
Grammar induction is the task of inducing hierarchical syntactic structure from observed sentences alone. It is a longstanding problem in AI/NLP, with scientific implications for understanding human language acquisition and engineering implications for improving machine learning systems. In this talk, I will discuss two recent works on unsupervised grammar induction with neural networks: (1) a method for learning a good generative model of language (i.e., a language model) while simultaneously inducing linguistically meaningful tree structures; and (2) an approach to learning non-context-free grammars by revisiting and extending the classical approach to grammar induction with probabilistic context-free grammars.
Yoon Kim is a fifth-year Ph.D. candidate in computer science at Harvard University, advised by Alexander Rush. He is supported by a Google Fellowship.