

Poster in Workshop: Neurosymbolic Generative Models (NeSy-GeMs)

[Remote poster] Generating Temporal Logical Formulas with Transformer GANs

Jens U Kreber · Christopher Hahn


Abstract:

Training neural networks requires large amounts of training data, which is often not readily available in symbolic reasoning domains. In this extended abstract, we consider the scarcity of training data for temporal logics. We summarize a recent study on the capabilities of GANs and Wasserstein GANs equipped with Transformer encoders to generate sensible and challenging formulas in the prototypical temporal logic LTL. The approach produces novel and unique formula instances without the need for autoregression. The generated data can serve as a substitute for real training data when training a classifier, and training data can be generated from a dataset that is too small to train on directly.
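To make the non-autoregressive setup concrete, the following is a minimal PyTorch sketch of a Transformer-encoder generator paired with a Wasserstein critic over token sequences: the generator maps per-position noise vectors to token distributions in a single forward pass, and soft distributions keep the pipeline differentiable for the critic. All names and hyperparameters (VOCAB_SIZE, MAX_LEN, D_MODEL, NOISE_DIM) are illustrative assumptions, not the authors' configuration, and practical details such as the gradient penalty and the exact embedding scheme for discrete tokens are omitted.

import torch
import torch.nn as nn

# Hypothetical sizes for illustration only.
VOCAB_SIZE = 32   # e.g., LTL operators, atomic propositions, padding
MAX_LEN = 25      # maximum formula length in tokens
D_MODEL = 128
NOISE_DIM = 64

class Generator(nn.Module):
    """Maps per-position noise to token distributions in one
    non-autoregressive forward pass."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(NOISE_DIM, D_MODEL)
        self.pos = nn.Parameter(torch.randn(MAX_LEN, D_MODEL))
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=4, dim_feedforward=256,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.to_vocab = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, z):               # z: (batch, MAX_LEN, NOISE_DIM)
        h = self.proj(z) + self.pos     # add learned positions
        h = self.encoder(h)
        # Soft token distributions keep sampling differentiable.
        return torch.softmax(self.to_vocab(h), dim=-1)

class Critic(nn.Module):
    """Wasserstein critic scoring (soft) one-hot token sequences."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(VOCAB_SIZE, D_MODEL)  # accepts soft one-hots
        self.pos = nn.Parameter(torch.randn(MAX_LEN, D_MODEL))
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=4, dim_feedforward=256,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.score = nn.Linear(D_MODEL, 1)

    def forward(self, x):               # x: (batch, MAX_LEN, VOCAB_SIZE)
        h = self.encoder(self.embed(x) + self.pos)
        return self.score(h.mean(dim=1))  # pooled scalar score

# One WGAN-style update step (gradient penalty omitted for brevity).
G, C = Generator(), Critic()
real = torch.nn.functional.one_hot(
    torch.randint(0, VOCAB_SIZE, (8, MAX_LEN)), VOCAB_SIZE).float()
z = torch.randn(8, MAX_LEN, NOISE_DIM)
fake = G(z)
critic_loss = C(fake.detach()).mean() - C(real).mean()  # critic step
gen_loss = -C(fake).mean()                              # generator step

Generated sequences can then be decoded greedily (argmax over the vocabulary at each position), which is what makes the approach sampling-efficient compared to token-by-token autoregressive generation.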
