

Poster

Rethinking Neural Multi-Objective Combinatorial Optimization via Neat Weight Embedding

Jinbiao Chen · Zhiguang Cao · Jiahai Wang · Yaoxin Wu · Hanzhang Qin · Zizhen Zhang · Yue-Jiao Gong

Hall 3 + Hall 2B #329
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

Recent decomposition-based neural multi-objective combinatorial optimization (MOCO) methods struggle to achieve desirable performance. Even equipped with complex learning techniques, they often suffer from significant optimality gaps in weight-specific subproblems. To address this challenge, we propose a neat weight embedding method to learn weight-specific representations, which captures the weight-instance interaction for the subproblems that is overlooked by most existing methods. We demonstrate the potential of our method in two instantiations. First, we introduce a succinct addition model to learn weight-specific node embeddings, which surpasses most existing neural methods. Second, we design an enhanced conditional attention model to simultaneously learn the weight embedding and node embeddings, which yields new state-of-the-art performance. Experimental results on classic MOCO problems verify the superiority of our method. Remarkably, our method also exhibits favorable generalization across problem sizes, even outperforming the neural method specialized for boosting size generalization.
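
The following is a minimal, hypothetical sketch of the "addition model" idea as described in the abstract: a preference weight vector is embedded and added to each node embedding so that the encoder produces weight-specific node representations. All module names, dimensions, and architectural details here are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: weight embedding added to node embeddings before encoding,
# so the resulting node representations depend on the preference weight.
import torch
import torch.nn as nn


class WeightEmbeddingAdditionEncoder(nn.Module):
    def __init__(self, node_dim: int = 2, num_objectives: int = 2, embed_dim: int = 128):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, embed_dim)          # embed raw node features
        self.weight_proj = nn.Linear(num_objectives, embed_dim)  # embed the preference weight
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=embed_dim, nhead=8, batch_first=True),
            num_layers=3,
        )

    def forward(self, nodes: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
        # nodes: (batch, n_nodes, node_dim); weight: (batch, num_objectives)
        h = self.node_proj(nodes)
        w = self.weight_proj(weight).unsqueeze(1)  # (batch, 1, embed_dim), broadcast over nodes
        return self.encoder(h + w)                 # weight-specific node embeddings


if __name__ == "__main__":
    enc = WeightEmbeddingAdditionEncoder()
    nodes = torch.rand(4, 20, 2)                         # e.g., 20 node coordinates per instance
    weight = torch.rand(4, 2)
    weight = weight / weight.sum(dim=-1, keepdim=True)   # normalized preference weights
    print(enc(nodes, weight).shape)                      # torch.Size([4, 20, 128])
```

The second instantiation mentioned in the abstract (the conditional attention model) would instead condition the attention computation on the weight embedding rather than simply adding it; its details are not given on this page.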
