

Poster in Workshop: Deep Generative Models for Highly Structured Data

Oracle Guided Image Synthesis with Relative Queries

Alec Helbling · Christopher Rozell · Matthew O'Shaughnessy · Kion Fallah


Abstract: Isolating and controlling specific features in the outputs of generative models in a user-friendly way is a difficult and open-ended problem. We develop techniques that allow a user to generate an image they envision by answering a sequence of relative queries of the form "do you prefer image $a$ or image $b$?" Our framework consists of a Conditional VAE that uses the collected relative queries to partition the latent space into preference-relevant and non-preference-relevant features. We then use the user's responses to relative queries to determine the preference-relevant features that correspond to their envisioned output image. Additionally, we develop techniques for modeling the uncertainty in images' predicted preference-relevant features, allowing our framework to generalize to scenarios in which the relative query training set contains noise.
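To make the relative-query mechanism concrete, below is a minimal, self-contained sketch (not the authors' code) of the core idea: the user's envisioned image corresponds to an unknown point $z^*$ in the preference-relevant part of the latent space, each query "do you prefer image $a$ or image $b$?" is answered according to which candidate's preference-relevant features lie closer to $z^*$, and noisy answers are handled with a logistic (Bradley-Terry-style) response model. All names here (`z_star`, `noise_scale`, `estimate_target`, the grid search) are illustrative assumptions, not the paper's implementation, which uses a Conditional VAE to learn the preference-relevant features.

```python
# Minimal sketch of relative-query preference estimation (assumptions, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)


def response_prob(z_a, z_b, z_star, noise_scale=1.0):
    """P(user prefers a over b) under a logistic noise model."""
    # Preference margin: how much closer a is to the target than b.
    margin = np.sum((z_b - z_star) ** 2) - np.sum((z_a - z_star) ** 2)
    return 1.0 / (1.0 + np.exp(-margin / noise_scale))


def simulate_query(z_a, z_b, z_star, noise_scale=1.0):
    """Sample a (possibly noisy) answer to one relative query."""
    return rng.random() < response_prob(z_a, z_b, z_star, noise_scale)


def estimate_target(queries, answers, candidates, noise_scale=1.0):
    """Pick the candidate point that best explains the observed answers."""
    def log_lik(z):
        ll = 0.0
        for (z_a, z_b), prefers_a in zip(queries, answers):
            p = response_prob(z_a, z_b, z, noise_scale)
            ll += np.log(p if prefers_a else 1.0 - p)
        return ll
    return max(candidates, key=log_lik)


if __name__ == "__main__":
    dim = 2                          # preference-relevant latent dimensions
    z_star = rng.normal(size=dim)    # the user's envisioned image (unknown to the system)

    # Collect a handful of noisy relative queries between random candidate pairs.
    queries = [(rng.normal(size=dim), rng.normal(size=dim)) for _ in range(40)]
    answers = [simulate_query(a, b, z_star) for a, b in queries]

    # Estimate z_star from the answers over a set of candidate points.
    candidates = [rng.normal(size=dim) for _ in range(500)]
    z_hat = estimate_target(queries, answers, candidates)
    print("true target:", z_star, "estimate:", z_hat)
```

In this toy setting the estimate recovers the target only along the preference-relevant coordinates, which mirrors the paper's point that queries constrain preference-relevant features while the remaining latent dimensions stay free.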
