Machine Learning Research Journal
Density Estimation Using the Perceptron
We propose a new density estimation algorithm. Given
$n$ i.i.d. observations from a distribution belonging to a class
of densities on $\mathbb{R}^d$, our estimator outputs any density in the class whose “perceptron
discrepancy” with the empirical distribution is at most $O(\sqrt{d/n})$.
The perceptron discrepancy between two distributions is defined as the largest difference in mass that they place on any halfspace of $\mathbb{R}^d$.
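As a rough illustration of this quantity (not code from the paper), the empirical perceptron discrepancy between two samples can be lower-bounded by a random search over halfspaces; the function name and the probe count below are illustrative choices, and the search only approximates the supremum from below.

```python
import numpy as np

def perceptron_discrepancy(x, y, n_halfspaces=10_000, rng=None):
    """Monte Carlo lower bound on sup over halfspaces {z : w.z <= b}
    of |P_x(H) - P_y(H)|, for empirical samples x: (n, d), y: (m, d)."""
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    best = 0.0
    for _ in range(n_halfspaces):
        w = rng.standard_normal(d)                  # random halfspace normal
        px, py = x @ w, y @ w                       # project both samples onto w
        # the empirical discrepancy only changes at projected data points,
        # so it suffices to threshold at one of them
        b = rng.choice(np.concatenate([px, py]))
        gap = abs(np.mean(px <= b) - np.mean(py <= b))
        best = max(best, gap)
    return best
```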
It is shown that this estimator achieves expected total variation distance to the truth that is almost minimax optimal over the class of densities with bounded Sobolev norm and over Gaussian mixtures. This suggests that regularity of the prior distribution could explain the efficiency of the ubiquitous step in machine learning that replaces optimization over large function spaces with optimization over simpler parametric classes (such as the discriminators of GANs).
We also show that replacing the perceptron discrepancy with
the generalized energy distance of Székely and Rizzo (2013) further improves
total variation loss. The generalized energy distance between empirical
distributions is easily computable and
differentiable, which makes it especially useful for fitting generative models.
To the best of our knowledge, it is the first “simple” distance with such
properties that yields minimax optimal statistical guarantees.
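For concreteness, here is a minimal sketch of the empirical generalized energy distance; the V-statistic form and the exponent parameter `beta` are standard for the Székely–Rizzo family, but this is our illustration rather than the paper's code. Written with an autodiff library (e.g. PyTorch) in place of NumPy, the same expression becomes differentiable in the sample locations, which is the property highlighted above.

```python
import numpy as np

def energy_distance(x, y, beta=1.0):
    """Empirical generalized energy distance between samples x: (n, d), y: (m, d),
    2 E||X-Y||^beta - E||X-X'||^beta - E||Y-Y'||^beta, with beta in (0, 2)."""
    def mean_pow_dist(a, b):
        # mean of ||a_i - b_j||^beta over all pairs (V-statistic: diagonal included)
        dists = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return np.mean(dists ** beta)
    return 2.0 * mean_pow_dist(x, y) - mean_pow_dist(x, x) - mean_pow_dist(y, y)
```

The V-statistic form keeps the (zero) diagonal terms for simplicity; an unbiased U-statistic variant would exclude them.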
In addition, we shed light on the ubiquitous method of representing discrete data on the domain $[k]$ via embedding vectors on the unit ball in $\mathbb{R}^d$. We show that taking $d \asymp \log(k)$ allows one to use simple linear probing to evaluate and estimate the total variation distance, as well as to recover the minimax optimal sample complexity for the class of discrete distributions on $[k]$.
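A hedged sketch of that embedding step follows, assuming a random-sign construction (the paper's exact embedding may differ); the table and function names are hypothetical.

```python
import numpy as np

def make_embedding(k, rng=None):
    """Random-sign table mapping each symbol in [k] to a unit vector in R^d,
    with d on the order of log(k). Illustrative construction only; distinct
    symbols may collide with small probability, which is fine for a sketch."""
    rng = np.random.default_rng(rng)
    d = max(1, int(np.ceil(np.log2(k))))            # d on the order of log(k)
    return rng.choice([-1.0, 1.0], size=(k, d)) / np.sqrt(d)  # rows have unit norm

# Usage (hypothetical): embed both integer-valued samples with the SAME table,
# then compare the embedded samples with the linear probes from
# `perceptron_discrepancy` above, as a proxy for total variation:
#   table = make_embedding(k, rng=0)
#   gap = perceptron_discrepancy(table[p_samples], table[q_samples])
```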
Source: JMLR
Word count: 1486 words
Published on 2025-01-01 08:00