Poster
Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions
Yoshiaki Kitazawa
Hall 3 + Hall 2B #447
Thu 24 Apr 7 p.m. PDT — 9:30 p.m. PDT
Abstract:
Density ratio estimation (DRE) is a core technique in machine learning used to capture relationships between two probability distributions. $f$-divergence loss functions, which are derived from variational representations of $f$-divergence, have become a standard choice in DRE for achieving cutting-edge performance. This study provides novel theoretical insights into DRE by deriving upper and lower bounds on the $L_p$ errors through $f$-divergence loss functions. These bounds apply to any estimator belonging to a class of Lipschitz continuous estimators, irrespective of the specific $f$-divergence loss function employed. The derived bounds are expressed as a product involving the data dimensionality and the expected value of the density ratio raised to the $p$-th power. Notably, the lower bound includes an exponential term that depends on the Kullback–Leibler (KL) divergence, revealing that the $L_p$ error increases significantly as the KL divergence grows when $p > 1$. This increase becomes even more pronounced as the value of $p$ grows. The theoretical insights are validated through numerical experiments.
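A minimal sketch of the setting the abstract describes, assuming a KL-based $f$-divergence loss $L(r) = \mathbb{E}_q[r(x)] - \mathbb{E}_p[\log r(x)]$ (a standard variational loss whose population minimizer is $r = p/q$), fitted on a toy Gaussian pair where the true ratio is known in closed form. The distributions, the linear log-ratio model, and the empirical $L_p$ evaluation under $q$ are illustrative assumptions, not the paper's experimental setup.

```python
# Illustrative sketch (not the paper's code): density ratio estimation with a
# KL-based f-divergence loss, then empirical L_p error against the true ratio.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy setup (assumption): p = N(0, 1), q = N(1, 1), so the true log ratio is
# log p(x)/q(x) = ((x - 1)^2 - x^2) / 2 = 0.5 - x.
x_p = rng.normal(0.0, 1.0, size=5000)  # samples from p
x_q = rng.normal(1.0, 1.0, size=5000)  # samples from q

def true_log_ratio(x):
    return 0.5 - x

def model_log_ratio(theta, x):
    # Linear model for log r(x); it contains the true log ratio.
    return theta[0] + theta[1] * x

def kl_loss(theta):
    # Empirical KL-based f-divergence loss: E_q[r(x)] - E_p[log r(x)].
    return (np.exp(model_log_ratio(theta, x_q)).mean()
            - model_log_ratio(theta, x_p).mean())

theta_hat = minimize(kl_loss, x0=np.zeros(2), method="L-BFGS-B").x

# Empirical L_p error of the fitted ratio under q for a few values of p.
r_hat = np.exp(model_log_ratio(theta_hat, x_q))
r_true = np.exp(true_log_ratio(x_q))
for p in (1, 2, 3):
    lp_err = np.mean(np.abs(r_hat - r_true) ** p) ** (1.0 / p)
    print(f"p = {p}: empirical L_p error = {lp_err:.4f}")
```

In this toy case the fitted parameters approach the true values (0.5, -1); larger shifts between p and q (i.e., larger KL divergence) make the empirical $L_p$ error grow, most visibly for larger $p$, consistent with the behavior the abstract highlights.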