

Poster

On the Joint Interaction of Models, Data, and Features

Yiding Jiang · Christina Baek · J Kolter

Halle B #258
Fri 10 May 1:45 a.m. PDT — 3:45 a.m. PDT
 
Oral presentation: Oral 7C
Fri 10 May 1 a.m. PDT — 1:45 a.m. PDT

Abstract:

Learning features from data is one of the defining characteristics of deep learning, but our theoretical understanding of the role features play in deep learning is still in early development. To address this gap, we introduce a new tool, the interaction tensor, for empirically analyzing the interaction between data and model through features. With the interaction tensor, we make several key observations about how features are distributed in data and how models with different random seeds learn different features. Based on these observations, we propose a conceptual framework for feature learning. Under this framework, the expected accuracy for a single hypothesis and the agreement for a pair of hypotheses can both be derived in closed form. We demonstrate that the proposed framework can explain empirically observed phenomena, including the recently discovered Generalization Disagreement Equality (GDE), which allows for estimating the generalization error with only unlabeled data. Further, our theory also provides an explicit construction of natural data distributions that break the GDE. Thus, we believe this work provides valuable new insight into our understanding of feature learning.
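To make the GDE referenced in the abstract concrete, here is a minimal sketch (not the authors' code) of how the equality is typically used in practice: for two models trained from different random seeds, the expected disagreement rate on unlabeled data approximates the expected test error. The model architectures, sizes, and synthetic data below are illustrative placeholders only.

```python
# Illustrative sketch of GDE-based error estimation from unlabeled data.
# All names, model sizes, and data here are hypothetical placeholders.
import torch
import torch.nn as nn


def disagreement_rate(model_a, model_b, unlabeled_loader, device="cpu"):
    """Fraction of unlabeled inputs on which the two models' argmax
    predictions differ. Under the GDE, this estimates the test error."""
    model_a.eval()
    model_b.eval()
    disagree, total = 0, 0
    with torch.no_grad():
        for x in unlabeled_loader:
            x = x.to(device)
            pred_a = model_a(x).argmax(dim=-1)
            pred_b = model_b(x).argmax(dim=-1)
            disagree += (pred_a != pred_b).sum().item()
            total += x.shape[0]
    return disagree / total


if __name__ == "__main__":
    # Two toy classifiers standing in for models trained with different seeds.
    torch.manual_seed(0)
    model_a = nn.Linear(32, 10)
    torch.manual_seed(1)
    model_b = nn.Linear(32, 10)

    # Synthetic "unlabeled" batches in place of a real dataset.
    unlabeled_loader = [torch.randn(64, 32) for _ in range(4)]

    est_error = disagreement_rate(model_a, model_b, unlabeled_loader)
    print(f"Estimated generalization error (disagreement rate): {est_error:.3f}")
```

In a realistic setting the two models would be trained on the same data with different random seeds, and the loader would iterate over a held-out unlabeled set; the paper's contribution is a feature-level framework that explains when this disagreement-based estimate does and does not hold.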
