Learning-Based Program Synthesis: Learning for Program Synthesis and Program Synthesis for Learning
My research on learning-based program synthesis falls into two areas: (1) learning to synthesize programs from potentially ambiguous and complex specifications; and (2) neural-symbolic learning for language understanding. I will first discuss program synthesis applications, where my work demonstrates that learning-based program synthesizers are ready for production use. I will then present my work on neural-symbolic frameworks that integrate symbolic components into neural networks to achieve better reasoning and generalization. In closing, I will discuss challenges and opportunities in further improving the complexity and generalizability of learning-based program synthesis.
Bio: Xinyun Chen is a Ph.D. candidate at UC Berkeley, working with Prof. Dawn Song. Her research lies at the intersection of deep learning, programming languages, and security, with a recent focus on learning-based program synthesis and adversarial machine learning. She received the Facebook Fellowship in 2020 and was named a Rising Star in Machine Learning in 2021. Her work SpreadsheetCoder for spreadsheet formula prediction was integrated into Google Sheets, and she was part of the AlphaCode team during her internship at DeepMind.
This talk will be recorded and live-streamed at https://mediacentrallive.princeton.edu/