

Poster

Improving Deep Regression with Tightness

Shihao Zhang · Yuguang Yan · Angela Yao

Hall 3 + Hall 2B #336
Wed 23 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract: For deep regression, preserving the ordinality of the targets with respect to the feature representation improves performance across various tasks. However, a theoretical explanation for the benefits of ordinality is still lacking. This work shows that preserving ordinality reduces the conditional entropy H(Z|Y) of the representation Z given the target Y. However, we find that typical regression losses do little to reduce H(Z|Y), even though reducing it is vital for generalization performance. With this motivation, we introduce an optimal-transport-based regularizer that preserves the similarity relationships of the targets in the feature space, thereby reducing H(Z|Y). Additionally, we introduce a simple yet efficient strategy of duplicating the regressor targets, likewise aimed at reducing H(Z|Y). Experiments on three real-world regression tasks verify the effectiveness of our strategies for improving deep regression. Code: https://github.com/needylove/Regression_tightness
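To make the idea of an optimal-transport-based tightness regularizer concrete, below is a minimal NumPy sketch of one plausible instantiation, not the authors' implementation (see the linked repository for that). It computes an entropic OT plan (Sinkhorn iterations) over pairwise feature distances within a batch, then charges that plan by pairwise target distances: the penalty is small only when feature-close samples are also target-close, which encourages the feature space to preserve the similarity structure of the targets. The function names, the uniform-marginal choice, and the entropic regularization strength `eps` are all illustrative assumptions.

```python
import numpy as np

def pairwise_dist(X):
    # Euclidean distance matrix for the rows of X, shape (n, n)
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def sinkhorn_plan(C, eps=0.5, iters=300):
    """Entropic OT plan with uniform marginals via Sinkhorn iterations.

    C    : (n, m) cost matrix
    eps  : entropic regularization strength (assumed value, tune per task)
    """
    n, m = C.shape
    a = np.full(n, 1.0 / n)   # uniform source marginal
    b = np.full(m, 1.0 / m)   # uniform target marginal
    K = np.exp(-C / eps)      # Gibbs kernel
    v = np.ones(m)
    for _ in range(iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

def ot_tightness_reg(Z, y, eps=0.5):
    """Illustrative regularizer: the transport plan follows feature-space
    geometry, and its cost is charged by target-space distances, so the
    penalty is low only when the feature layout mirrors the target layout."""
    D_z = pairwise_dist(Z)                     # feature-space distances
    D_y = np.abs(y[:, None] - y[None, :])      # target-space distances
    P = sinkhorn_plan(D_z, eps=eps)
    return float((P * D_y).sum())

# Toy batch: features laid out in the same order as the targets
Z = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
y_ordered  = np.array([0.0, 1.0, 2.0, 3.0])
y_shuffled = np.array([3.0, 0.0, 2.0, 1.0])

# Features that mirror the target ordering incur a smaller penalty
print(ot_tightness_reg(Z, y_ordered) < ot_tightness_reg(Z, y_shuffled))
```

In a training loop, a differentiable version of this penalty (e.g. in PyTorch) would be added to the usual regression loss with a weighting coefficient; the toy check above only illustrates that target-consistent feature geometry is scored as "tighter".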
