
Affinity Workshop: Tiny Papers Oral Session 2

DFWLayer: Differentiable Frank-Wolfe Optimization Layer

Zixuan Liu · Liu Liu · Xueqian Wang · Peilin Zhao


Differentiable optimization has received significant attention due to its foundational role in neural-network-based machine learning. This paper proposes a differentiable layer, the Differentiable Frank-Wolfe Layer (DFWLayer), constructed by unrolling the Frank-Wolfe method, a well-known optimization algorithm that solves constrained problems without projections or Hessian computations. This yields an efficient way to handle large-scale convex optimization problems with norm constraints. Experimental results demonstrate that the DFWLayer not only attains competitive accuracy in solutions and gradients but also consistently satisfies the constraints.
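The projection-free property the abstract refers to comes from the Frank-Wolfe method's reliance on a linear minimization oracle (LMO) rather than a projection step. As a hedged illustration (not the paper's implementation), the sketch below runs classical Frank-Wolfe on a toy least-squares problem over an l1-norm ball, where the LMO has a closed form; the problem data, radius `tau`, and step-size schedule are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_l1(grad_f, x0, tau, n_iters=500):
    """Projection-free Frank-Wolfe over the l1 ball {x : ||x||_1 <= tau}.

    The LMO over the l1 ball has a closed form: the minimizing vertex
    places all mass -tau*sign(g_i) on the coordinate i with largest |g_i|.
    No projection or Hessian is ever computed.
    """
    x = x0.copy()
    for k in range(n_iters):
        g = grad_f(x)
        i = np.argmax(np.abs(g))           # LMO: best vertex of the l1 ball
        s = np.zeros_like(x)
        s[i] = -tau * np.sign(g[i])
        gamma = 2.0 / (k + 2)              # classic open-loop step size
        x = (1 - gamma) * x + gamma * s    # convex combination stays feasible
    return x

# Toy problem (illustrative): min ||Ax - b||^2  s.t.  ||x||_1 <= tau
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -0.5, 0.0, 0.0])
b = A @ x_true
grad = lambda x: 2 * A.T @ (A @ x - b)

x_hat = frank_wolfe_l1(grad, np.zeros(5), tau=1.5)
print(np.round(x_hat, 3))
```

Because each iterate is a convex combination of feasible points, the method never leaves the constraint set; this is the property that lets an unrolled layer like DFWLayer guarantee constraint satisfaction by construction.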
