Supermix: sparse regularization for mixtures

Abstract

This paper investigates the statistical estimation of a discrete mixing measure μ0 involved in a kernel mixture model. Using recent advances in ℓ1-regularization over the space of measures, we introduce a “data fitting and regularization” convex program for estimating μ0 in a grid-less manner from a sample of the mixture law; this method is referred to as the Beurling-LASSO.
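For concreteness, here is a minimal sketch of the type of model and convex program the abstract refers to; the notation (kernel φ, observation operator Φ, tuning parameter κ) is chosen for illustration and is not necessarily the paper's exact formulation. The observations X_1, …, X_n are drawn i.i.d. from the mixture density
\[
f_{\mu_0}(x) = \int_{\Theta} \varphi(x, t)\, \mathrm{d}\mu_0(t),
\qquad
\mu_0 = \sum_{k=1}^{K} a_k\, \delta_{t_k}, \quad a_k > 0, \ \ \sum_{k} a_k = 1,
\]
and a Beurling-LASSO estimator is a solution of the convex program over the space of measures \(\mathcal{M}(\Theta)\),
\[
\hat{\mu} \in \operatorname*{arg\,min}_{\mu \in \mathcal{M}(\Theta)}
\; \frac{1}{2}\, \big\| \Phi \mu - \hat{y}_n \big\|^2 \;+\; \kappa\, \|\mu\|_{\mathrm{TV}},
\]
where \(\Phi\) maps a measure to a kernel-smoothed version of its mixture density, \(\hat{y}_n\) is the analogous quantity built from the empirical measure of the sample, \(\|\cdot\|_{\mathrm{TV}}\) is the total-variation norm (the continuous analogue of the \(\ell_1\) norm), and \(\kappa > 0\) is a regularization parameter.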
Our contribution is two-fold: we derive a lower bound on the bandwidth of our data fitting term, depending only on the support of μ0 and its so-called “minimum separation”, that ensures quantitative support localization error bounds; and, under a so-called “nondegenerate source condition”, we derive a nonasymptotic support stability property. The latter shows that for a sufficiently large sample size n, our estimator has exactly as many weighted Dirac masses as the target μ0, converging in amplitude and localization towards the true ones. Finally, we introduce tractable algorithms for solving this convex program based on “Sliding Frank–Wolfe” or “Conic Particle Gradient Descent”.
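As a rough illustration of the Frank–Wolfe-type strategy mentioned above, the following Python sketch runs a greedy spike-insertion loop on a smoothed one-dimensional Gaussian mixture; the objective, all parameter values, and the simplified joint refit used in place of the full “sliding” step are assumptions for illustration, not the paper's implementation.

```python
# A rough, self-contained sketch of a Frank-Wolfe-type greedy solver for a
# BLASSO-like objective on a 1D Gaussian mixture.  All names, parameter
# values and the simplified local refit ("sliding") step are illustrative
# assumptions, not the paper's algorithm.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic data: 0.5 N(-1, 0.3^2) + 0.5 N(+1, 0.3^2), so the minimum
# separation between the two spike locations is 2.
sigma = 0.3
true_locs = np.array([-1.0, 1.0])
true_w = np.array([0.5, 0.5])
n = 2000
comp = rng.choice(2, size=n, p=true_w)
X = rng.normal(true_locs[comp], sigma)

grid = np.linspace(-3.0, 3.0, 601)          # candidate spike locations
dx = grid[1] - grid[0]
bw = 0.25                                    # data-fitting bandwidth (assumed)
eff = np.hypot(sigma, bw)                    # component std after smoothing

# Smoothed empirical target: Gaussian kernel density estimate of the sample.
y = norm.pdf(grid[:, None], loc=X[None, :], scale=bw).mean(axis=1)

def model(locs, w):
    """Smoothed mixture density on the grid for spikes at `locs` with weights `w`."""
    if locs.size == 0:
        return np.zeros_like(grid)
    return norm.pdf(grid[:, None], loc=locs[None, :], scale=eff) @ w

kappa = 0.05                                 # regularization weight (assumed)
locs = np.empty(0)
w = np.empty(0)

for _ in range(5):
    resid = model(locs, w) - y
    # Correlate the residual with a candidate spike at every grid point.
    K = norm.pdf(grid[:, None], loc=grid[None, :], scale=eff)
    corr = K.T @ resid * dx
    j = int(np.argmax(np.abs(corr)))
    if np.abs(corr[j]) <= kappa:             # approximate optimality: stop
        break
    locs = np.append(locs, grid[j])          # insert a new Dirac mass
    w = np.append(w, 0.1)

    # Joint local refit of locations and weights (simplified "sliding" step).
    k = locs.size

    def objective(theta):
        t, a = theta[:k], theta[k:]
        r = model(t, a) - y
        return 0.5 * np.sum(r**2) * dx + kappa * np.sum(np.abs(a))

    res = minimize(objective, np.concatenate([locs, w]), method="L-BFGS-B")
    locs, w = res.x[:k], np.clip(res.x[k:], 0.0, None)   # keep weights nonnegative

print("estimated locations:", np.round(np.sort(locs), 2))
print("estimated weights  :", np.round(w / max(w.sum(), 1e-12), 2))
```

Each pass inserts one Dirac mass where the residual correlates most with the kernel and then locally re-optimizes all locations and weights; the actual Sliding Frank–Wolfe algorithm interleaves these steps more carefully, and Conic Particle Gradient Descent instead evolves a whole cloud of weighted particles by gradient flow.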
The statistical performance of this estimator is investigated by designing a so-called “dual certificate” appropriate to our setting. Classical situations, such as mixtures of super-smooth distributions (e.g., Gaussian distributions) or ordinary-smooth distributions (e.g., Laplace distributions), are discussed at the end of the paper.
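For context, in the Beurling-LASSO literature a dual certificate is typically a smooth function η lying in the range of the adjoint of the observation operator; the precise construction used in the paper may differ, but for positive spike weights a_k the standard requirements take the form
\[
\eta = \Phi^{*} p, \qquad \eta(t_k) = 1 \ \ \text{for } k = 1, \dots, K, \qquad |\eta(t)| < 1 \ \ \text{for } t \notin \{t_1, \dots, t_K\},
\]
and a nondegenerate source condition additionally asks that these inequalities be quantitatively strict, e.g. \(\eta''(t_k) < 0\) at each spike, which is what drives support stability results of the kind stated above.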

Publication
Annals of Statistics, 49(3): 1779–1809, 2021
Cathy MAUGIS-RABUSSEAU