Deep Learning for Program Synthesis

[Research Statement] [Publications] [Members]

Research Statement

Synthesizing a program from a specification has been a long-standing challenge. Recent research has demonstrated that deep neural networks have the potential to learn a program satisfying various forms of specification, such as natural language descriptions and input-output examples. The ability to automatically synthesize code has numerous applications, ranging from helping end-users (non-technical users) write programs, helping software developers synthesize mundane or optimized pieces of code, and helping data scientists clean up and explore data, to helping algorithm designers discover new algorithms.

This problem is extremely challenging, and the complexity of the programs synthesized by existing approaches is still limited. In our research on neural program synthesis, we aim to generate programs of greater complexity and better generalizability, while guaranteeing correctness. In this process, besides enabling real-world applications of program synthesis, we hope to contribute toward addressing core challenges in deep learning, including generalization, search, abstraction, and representation. We believe that solving the program synthesis problem is an important step toward artificial general intelligence (AGI).
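To make the input-output-example setting concrete, here is a minimal sketch of programming by example: enumerative search over a tiny arithmetic DSL until a program consistent with all examples is found. The DSL, the `synthesize` function, and all names here are illustrative assumptions, not taken from any specific paper above; neural approaches replace or guide this brute-force search with a learned model.

```python
# Minimal programming-by-example sketch: enumerate compositions of a few
# unary primitives and return the first program matching every example.
# The DSL and helper names are hypothetical, for illustration only.
from itertools import product

# Candidate primitives, each a unary function with a readable name.
PRIMITIVES = {
    "x + 1": lambda x: x + 1,
    "x * 2": lambda x: x * 2,
    "x * x": lambda x: x * x,
    "x - 1": lambda x: x - 1,
}

def compose(f1, f2):
    """Run f1 first, then f2."""
    return lambda x: f2(f1(x))

def synthesize(examples, max_depth=2):
    """Return the name of a program consistent with (input, output) pairs,
    or None if no program up to max_depth compositions matches."""
    programs = dict(PRIMITIVES)  # depth-1 programs
    for _ in range(max_depth - 1):
        grown = {}
        for (n1, f1), (n2, f2) in product(list(programs.items()),
                                          PRIMITIVES.items()):
            # Substitute the inner program for x in the outer primitive's name.
            grown[n2.replace("x", f"({n1})")] = compose(f1, f2)
        programs.update(grown)
    for name, fn in programs.items():
        if all(fn(i) == o for i, o in examples):
            return name
    return None

# The examples below are consistent with squaring and then adding one.
print(synthesize([(2, 5), (3, 10)]))  # prints "(x * x) + 1"
```

Even this toy search grows exponentially with program depth, which is one reason the complexity of synthesized programs remains a central research challenge.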

Recent Publications

Tree-to-tree Neural Networks for Program Translation

Xinyun Chen, Chang Liu, Dawn Song.

Workshop track of the International Conference on Learning Representations (ICLR). May, 2018.


Towards Synthesizing Complex Programs from Input-Output Examples

Xinyun Chen, Chang Liu, Dawn Song.

International Conference on Learning Representations (ICLR). May, 2018.


Parametrized Hierarchical Procedures for Neural Programming

Roy Fox, Richard Shin, Sanjay Krishnan, Ken Goldberg, Dawn Song, Ion Stoica.

International Conference on Learning Representations (ICLR). May, 2018.


SQLNet: Generating Structured Queries From Natural Language Without Reinforcement Learning

Xiaojun Xu, Chang Liu, Dawn Song.

November, 2017.


Making Neural Programming Architectures Generalize via Recursion

Jonathon Cai, Richard Shin, Dawn Song.

International Conference on Learning Representations (ICLR). April, 2017. Best Paper Award


Latent Attention For If-Then Program Synthesis

Xinyun Chen, Chang Liu, Richard Shin, Dawn Song, Mingcheng Chen.

Advances in Neural Information Processing Systems (NIPS). December, 2016.