Deep Learning For Symbolic Mathematics

Guillaume Lample, François Charton

Keywords: transformer

Wednesday: Symbols and Discovery

Abstract: Neural networks have a reputation for being better at solving statistical or approximate problems than at performing calculations or working with symbolic data. In this paper, we show that they can be surprisingly good at more elaborate tasks in mathematics, such as symbolic integration and solving differential equations. We propose a syntax for representing these mathematical problems, and methods for generating large datasets that can be used to train sequence-to-sequence models. We achieve results that outperform commercial Computer Algebra Systems such as Matlab or Mathematica.
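The abstract's "syntax for representing these mathematical problems" refers to serializing expression trees into token sequences that a sequence-to-sequence model can consume. The sketch below illustrates one such scheme, prefix (Polish) notation; the specific operator names and tuple encoding are illustrative assumptions, not the paper's exact vocabulary.

```python
# Hedged sketch: flattening a symbolic expression tree into a
# prefix-notation token sequence for a seq2seq model.
# Expressions are nested tuples: (operator, operand, ...); leaves are
# numbers or variable names. The operator names are illustrative.

def to_prefix(expr):
    """Recursively flatten a nested (op, args...) tuple into prefix tokens."""
    if isinstance(expr, tuple):
        op, *args = expr
        tokens = [op]
        for arg in args:
            tokens.extend(to_prefix(arg))
        return tokens
    return [str(expr)]  # leaf: constant or variable

# Example: 3*x^2 + cos(2*x)
expr = ("add", ("mul", 3, ("pow", "x", 2)), ("cos", ("mul", 2, "x")))
print(to_prefix(expr))
# -> ['add', 'mul', '3', 'pow', 'x', '2', 'cos', 'mul', '2', 'x']
```

Because prefix notation makes operator arity unambiguous, the token sequence can be decoded back into a unique tree without parentheses, which keeps sequences short and the vocabulary small.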

Similar Papers

Deep Symbolic Superoptimization Without Human Knowledge
Hui Shi, Yang Zhang, Xinyun Chen, Yuandong Tian, Jishen Zhao
Depth-Adaptive Transformer
Maha Elbayad, Jiatao Gu, Edouard Grave, Michael Auli