Poster
Differentiable Integer Linear Programming
Zijie Geng · Jie Wang · Xijun Li · Fangzhou Zhu · Jianye HAO · Bin Li · Feng Wu
Hall 3 + Hall 2B #569
Wed 23 Apr 7 p.m. – 9:30 p.m. PDT
Abstract:
Machine learning (ML) techniques have shown great potential in generating high-quality solutions for integer linear programs (ILPs). However, existing methods typically rely on a *supervised learning* paradigm, leading to (1) *expensive training costs* due to repeated invocations of traditional solvers to generate training labels, and (2) *plausible yet infeasible solutions* due to the misalignment between the training objective (minimizing prediction loss) and the inference objective (generating high-quality solutions). To tackle these challenges, we propose **DiffILO** (**Diff**erentiable **I**nteger **L**inear Programming **O**ptimization), an *unsupervised learning paradigm for learning to solve ILPs*. Specifically, through a novel probabilistic modeling, DiffILO reformulates ILPs, which are discrete and constrained optimization problems, into continuous, unconstrained optimization problems that are differentiable almost everywhere. This reformulation enables DiffILO to simultaneously solve ILPs and train the model via straightforward gradient descent, providing two major advantages. First, it significantly reduces the training cost, as the training process does not require traditional solvers at all. Second, it facilitates the generation of feasible, high-quality solutions, as the model *learns to solve ILPs* in an end-to-end manner, thus aligning the training and inference objectives. Experiments on commonly used ILP datasets demonstrate that DiffILO not only achieves an average training speedup of 13.2 times over supervised methods, but also outperforms them by generating heuristic solutions with significantly higher feasibility ratios and markedly better solution quality.
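To make the core idea concrete, here is a minimal, illustrative sketch, not the authors' implementation, of relaxing a binary ILP into a continuous, unconstrained, differentiable (almost everywhere) objective and minimizing it by gradient descent. It assumes the ILP min c^T x s.t. Ax <= b with x in {0,1}^n, models each variable as a Bernoulli probability p = sigmoid(theta), and uses a mean-field relaxation relu(Ap - b) with a fixed penalty weight `lam` for constraint violations; the function name, the toy instance, and the specific penalty form are assumptions for illustration, and DiffILO's actual probabilistic modeling is more involved.

```python
# Minimal sketch (illustrative, not the authors' code): relax a binary ILP
#     min c^T x   s.t.   A x <= b,   x in {0, 1}^n
# by modeling each x_i as Bernoulli(p_i) with p = sigmoid(theta), then
# minimizing the relaxed objective c^T p plus a penalty on constraint
# violations via plain gradient descent. `lam` (the penalty weight) and the
# mean-field relaxation relu(A p - b) are simplifying assumptions here,
# not DiffILO's exact formulation.
import torch

def solve_relaxed_ilp(c, A, b, lam=10.0, steps=2000, lr=0.1):
    theta = torch.zeros(c.shape[0], requires_grad=True)  # logits of p
    opt = torch.optim.Adam([theta], lr=lr)
    for _ in range(steps):
        p = torch.sigmoid(theta)                  # p_i = P(x_i = 1)
        objective = c @ p                         # E[c^T x] under Bernoulli(p)
        violation = torch.relu(A @ p - b).sum()   # relaxed constraint slack
        loss = objective + lam * violation        # unconstrained, differentiable a.e.
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Round the learned probabilities to a candidate integer solution.
    return (torch.sigmoid(theta).detach() > 0.5).float()

# Toy instance: min x1 - x2  s.t.  x1 + x2 <= 1  ->  optimum x = (0, 1).
c = torch.tensor([1.0, -1.0])
A = torch.tensor([[1.0, 1.0]])
b = torch.tensor([1.0])
print(solve_relaxed_ilp(c, A, b))  # expected: tensor([0., 1.])
```

The sketch optimizes the probabilities for a single instance; per the abstract, DiffILO uses the same differentiable reformulation as a training signal, so a model can be trained to solve ILPs with gradient descent alone, without invoking a traditional solver for labels.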