

Poster

Latent Bayesian Optimization via Autoregressive Normalizing Flows

Seunghun Lee · Jinyoung Park · Jaewon Chu · Minseo Yoon · Hyunwoo Kim

Hall 3 + Hall 2B #375
Fri 25 Apr midnight PDT — 2:30 a.m. PDT
 
Oral presentation: Oral Session 3B
Thu 24 Apr 7:30 p.m. PDT — 9 p.m. PDT

Abstract:

Bayesian Optimization (BO) has been recognized for its effectiveness in optimizing expensive and complex objective functions. Recent advances in Latent Bayesian Optimization (LBO) have shown promise by integrating generative models, such as variational autoencoders (VAEs), to manage the complexity of high-dimensional and structured data spaces. However, existing LBO approaches often suffer from the value discrepancy problem, which arises from the reconstruction gap between the input and latent spaces. This value discrepancy propagates errors throughout the optimization process, leading to suboptimal outcomes. To address this issue, we propose Normalizing Flow-based Bayesian Optimization (NF-BO), which utilizes a normalizing flow as its generative model to establish a one-to-one mapping between the input and latent spaces, eliminating the reconstruction gap. Specifically, we introduce SeqFlow, an autoregressive normalizing flow for sequence data. In addition, we develop a new candidate sampling strategy that dynamically adjusts the exploration probability of each token based on its importance. Through extensive experiments, NF-BO demonstrates superior performance in molecule generation tasks, significantly outperforming both traditional and recent LBO approaches.
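The following toy Python sketch (not the authors' code) illustrates the value discrepancy problem the abstract describes and why an invertible latent map removes it. It compares a bijective affine "flow" encoder/decoder, whose round trip reproduces the input exactly, against a lossy decoder that stands in for a VAE-style round trip; all names (objective, flow_encode, vae_encode, etc.) are illustrative assumptions, and the real method uses SeqFlow over token sequences rather than a 1-D affine map.

import numpy as np

def objective(x):
    # Toy 1-D stand-in for an expensive black-box objective.
    return -np.sin(3 * x) - x**2 + 0.7 * x

# Bijective "flow": decode(encode(x)) == x exactly, so the objective value
# associated with a latent point is the value of the input it decodes to.
def flow_encode(x): return (x - 0.5) / 2.0
def flow_decode(z): return 2.0 * z + 0.5

# Lossy stand-in for a VAE round trip: encoding quantizes, so the decoded
# input generally differs from the original (the reconstruction gap).
def vae_encode(x): return np.round((x - 0.5) / 2.0, 1)
def vae_decode(z): return 2.0 * z + 0.5

x = np.linspace(-2.0, 2.0, 9)
gap_flow = np.abs(objective(x) - objective(flow_decode(flow_encode(x))))
gap_vae = np.abs(objective(x) - objective(vae_decode(vae_encode(x))))
print("max value discrepancy, bijective flow:", gap_flow.max())   # 0.0
print("max value discrepancy, lossy decoder :", gap_vae.max())    # > 0

In a latent BO loop, the surrogate is fit on (latent point, observed objective) pairs; with a lossy round trip the observed value belongs to a different input than the one the latent point nominally represents, and that mismatch propagates into the surrogate and acquisition, which is the error the abstract attributes to existing LBO approaches.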
