

Virtual presentation / top 25% paper

InCoder: A Generative Model for Code Infilling and Synthesis

Daniel Fried · Armen Aghajanyan · Jessy Lin · Sida Wang · Eric Wallace · Freda Shi · Ruiqi Zhong · Scott Yih · Luke Zettlemoyer · Mike Lewis

Keywords: [ program synthesis ] [ code generation ] [ language to code ] [ Applications ]


Abstract:

Code is seldom written in a single left-to-right pass; it is instead repeatedly edited and refined. We introduce InCoder, a unified generative model that can perform program synthesis (via left-to-right generation) as well as editing (via masking and infilling). InCoder is trained to generate code files from a large corpus of permissively licensed code in which regions of code have been randomly masked and moved to the end of each file, allowing code infilling with bidirectional context. Our model is the first large generative code model able to infill arbitrary regions of code, which we evaluate in a zero-shot setting on challenging tasks such as type inference, comment generation, and variable renaming. We find that the ability to condition on bidirectional context substantially improves performance on these tasks, while performing comparably to left-to-right-only models pretrained at a similar scale on standard program synthesis benchmarks. Our models and code will be publicly released.
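For concreteness, here is a minimal Python sketch of the causal-masking transformation the abstract describes: a random span is cut out of a file, replaced by a mask sentinel, and appended at the end, so that an ordinary left-to-right model learns to infill the span with both left and right context. The single-span simplification and the exact sentinel spellings are assumptions for illustration; the paper samples the number of spans per file and uses one numbered sentinel per span.

```python
import random

# Illustrative sentinel strings; the released models use special tokens of
# this general form, but treat the exact spellings here as assumptions.
MASK = "<|mask:0|>"
EOM = "<|endofmask|>"

def causal_mask_file(text: str, rng: random.Random) -> str:
    """Turn a code file into a causal-masking training example.

    A random span is removed, replaced by a mask sentinel, and appended
    at the end of the sequence, so a left-to-right model is trained to
    regenerate the span conditioned on bidirectional context.
    """
    if len(text) < 2:
        return text
    start = rng.randrange(len(text) - 1)
    end = rng.randrange(start + 1, len(text))
    span = text[start:end]
    prefix, suffix = text[:start], text[end:]
    # Training sequence: prefix, mask, suffix, then the mask sentinel
    # again followed by the masked-out span and an end-of-mask marker.
    return prefix + MASK + suffix + MASK + span + EOM

example = causal_mask_file("def add(a, b):\n    return a + b\n", random.Random(0))
print(example)
```

At inference time, infilling reuses the same format: construct the prompt as prefix + MASK + suffix + MASK and sample tokens until the end-of-mask marker, yielding the infilled span for the gap.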
