Deep Learning for Program Synthesis

[Research Statement] [Publications] [Members]

Research Statement

Synthesizing a program from a specification has been a long-standing challenge. Recent research has demonstrated that deep neural networks have the potential to learn programs satisfying various forms of specification, such as natural language descriptions and input-output examples. The ability to automatically synthesize code has numerous applications: helping end users (non-programmers) write programs, helping software developers synthesize mundane or optimized pieces of code, helping data scientists clean and explore data, and helping algorithm designers discover new algorithms.
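As a toy illustration of the input-output-example setting, the sketch below enumerates compositions of a few primitive functions until one is consistent with every example. All names here are hypothetical, and this is a minimal enumerative search, not the method of any specific paper listed below.

```python
# Toy programming-by-example: search over pipelines of primitives
# until one matches all given (input, output) pairs.
from itertools import product

# A tiny hypothetical DSL of integer-to-integer primitives.
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "double": lambda x: x * 2,
    "square": lambda x: x * x,
}

def synthesize(examples, max_depth=3):
    """Return the shortest pipeline of primitive names consistent
    with all examples, or None if no pipeline up to max_depth fits."""
    for depth in range(1, max_depth + 1):
        for names in product(PRIMITIVES, repeat=depth):
            def run(x, names=names):
                for name in names:
                    x = PRIMITIVES[name](x)
                return x
            if all(run(inp) == out for inp, out in examples):
                return names
    return None

# The examples implicitly specify f(x) = (x + 1) * 2:
print(synthesize([(1, 4), (2, 6), (5, 12)]))  # -> ('inc', 'double')
```

Even this brute-force search conveys the core difficulty: the space of programs grows exponentially with depth, which is why neural guidance of the search is attractive.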

This problem is extremely challenging, and the complexity of programs synthesized by existing approaches is still limited. In our research on neural program synthesis, we aim to generate programs of greater complexity and better generalizability while guaranteeing correctness. In the process, besides enabling real-world applications of program synthesis, we hope to contribute to core challenges in deep learning, including generalization, search, abstraction, and representation. We believe that solving the program synthesis problem is a good step toward artificial general intelligence (AGI).

Recent Publications

Compositional Generalization via Neural-Symbolic Stack Machines

Xinyun Chen, Chen Liang, Adams Wei Yu, Dawn Song, Denny Zhou.

Advances in Neural Information Processing Systems (NeurIPS). December, 2020.

 

Synthesize, Execute and Debug: Learning to Repair for Neural Program Synthesis

Kavi Gupta, Peter Ebert Christensen*, Xinyun Chen*, Dawn Song.

Advances in Neural Information Processing Systems (NeurIPS). December, 2020.

 

RAT-SQL: Relation-Aware Schema Encoding and Linking for Text-to-SQL Parsers

Bailin Wang*, Richard Shin*, Xiaodong Liu, Oleksandr Polozov, Matthew Richardson.

Annual Meeting of the Association for Computational Linguistics (ACL). July, 2020.

 

Neural Symbolic Reader: Scalable Integration of Distributed and Symbolic Representations for Reading Comprehension

Xinyun Chen, Chen Liang, Adams Wei Yu, Denny Zhou, Dawn Song, Quoc V. Le.

International Conference on Learning Representations (ICLR). May, 2020.

 

Deep Symbolic Superoptimization Without Human Knowledge

Hui Shi, Yang Zhang, Xinyun Chen, Yuandong Tian, Jishen Zhao.

International Conference on Learning Representations (ICLR). May, 2020.

 

Coda: An End-to-End Neural Program Decompiler

Cheng Fu, Huili Chen, Haolan Liu, Xinyun Chen, Yuandong Tian, Farinaz Koushanfar, Jishen Zhao.

Advances in Neural Information Processing Systems (NeurIPS). December, 2019.

 

Program Synthesis and Semantic Parsing with Learned Code Idioms

Richard Shin, Marc Brockschmidt, Miltiadis Allamanis, Oleksandr Polozov.

Advances in Neural Information Processing Systems (NeurIPS). December, 2019.

 

Execution-Guided Neural Program Synthesis

Xinyun Chen, Chang Liu, Dawn Song.

International Conference on Learning Representations (ICLR). May, 2019.

 

Synthetic Datasets for Neural Program Synthesis

Richard Shin, Neel Kant, Kavi Gupta, Chris Bender, Brandon Trabucco, Rishabh Singh, Dawn Song.

International Conference on Learning Representations (ICLR). May, 2019.

 

Improving Neural Program Synthesis with Inferred Execution Traces

Richard Shin, Illia Polosukhin, Dawn Song.

Advances in Neural Information Processing Systems (NeurIPS). December, 2018.

 

Tree-to-tree Neural Networks for Program Translation

Xinyun Chen, Chang Liu, Dawn Song.

Advances in Neural Information Processing Systems (NeurIPS). December, 2018.

 

Towards Synthesizing Complex Programs from Input-Output Examples

Xinyun Chen, Chang Liu, Dawn Song.

International Conference on Learning Representations (ICLR). May, 2018.

 

Parametrized Hierarchical Procedures for Neural Programming

Roy Fox, Richard Shin, Sanjay Krishnan, Ken Goldberg, Dawn Song, Ion Stoica.

International Conference on Learning Representations (ICLR). May, 2018.

 

SQLNet: Generating Structured Queries From Natural Language Without Reinforcement Learning

Xiaojun Xu, Chang Liu, Dawn Song.

arXiv preprint. November, 2017.

 

Making Neural Programming Architectures Generalize via Recursion

Jonathon Cai, Richard Shin, Dawn Song.

International Conference on Learning Representations (ICLR). April, 2017. Best Paper Award

 

Latent Attention For If-Then Program Synthesis

Xinyun Chen, Chang Liu, Richard Shin, Dawn Song, Mingcheng Chen.

Advances in Neural Information Processing Systems (NIPS). December, 2016.

 


Members