Ph.D. Thesis

Title: Model selection in heteroscedastic regression

Defended on June 15, 2009, at the Université Nice - Sophia Antipolis.

Advisor: Yannick Baraud

Committee:

Yannick Baraud, Advisor
Lucien Birgé, Chair of the committee
Fabienne Comte, Main examiner
Sylvie Huet, Examiner
Jean-Michel Loubes, Main examiner
Patricia Reynaud-Bouret, Examiner

Abstract

This thesis is set within the theory of nonasymptotic statistics and model selection. Its goal is to provide data-driven procedures for estimating parameters in heteroscedastic regression, a framework that has attracted considerable interest in various areas of applied mathematics. Our procedures rely in particular on concentration inequalities, and their practical performance is assessed on simulated data.
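For concreteness, a heteroscedastic regression model is commonly written as follows; the notation (design points $x_i$, regression function $f$, variance function $\sigma^2$, noise terms $\varepsilon_i$) is illustrative and not taken from the abstract.

\[
Y_i = f(x_i) + \sigma(x_i)\,\varepsilon_i, \qquad i = 1, \dots, n,
\]

Here the noise level $\sigma(x_i)$ may vary with the design point $x_i$, in contrast with the homoscedastic case where $\sigma$ is constant.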

The first part is devoted to the simultaneous estimation of the mean and the variance of a Gaussian vector with independent coordinates. To this end, we introduce a model selection procedure based on a penalized likelihood criterion. We prove nonasymptotic results for this method, such as oracle-type inequalities and uniform convergence rates over Hölderian balls.
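As a sketch only (the precise penalty and model collections are those studied in the thesis and are not reproduced here), a penalized likelihood criterion of this kind selects a pair of models via

\[
(\hat m, \hat m') \in \operatorname*{arg\,min}_{(m, m') \in \mathcal{M} \times \mathcal{M}'}
\Big\{ -\ell_n\big(\hat\mu_m, \hat\sigma^2_{m'}\big) + \operatorname{pen}(m, m') \Big\},
\]

where $\ell_n$ is the Gaussian log-likelihood, $(\hat\mu_m, \hat\sigma^2_{m'})$ are the maximum likelihood estimators of the mean and variance within the models $m$ and $m'$, and $\operatorname{pen}$ is a penalty calibrated through concentration inequalities. An oracle-type inequality in this context typically bounds the risk of the selected estimator by the best bias-penalty trade-off over the collection, up to a multiplicative constant.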

We also consider the problem of estimating the regression function in a heteroscedastic regression framework with a known dependence structure. Model selection procedures are constructed for Gaussian errors and under moment conditions. Nonasymptotic oracle-type inequalities and adaptivity results are proved for the resulting estimators. In particular, we apply these procedures to estimate a component in an additive regression model.
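To illustrate the last point (the notation $s_1, \dots, s_k$ and $x_i^1, \dots, x_i^k$ is not from the abstract), an additive regression model decomposes the regression function into a sum of univariate components,

\[
Y_i = s_1(x_i^1) + \cdots + s_k(x_i^k) + \sigma(x_i)\,\varepsilon_i, \qquad i = 1, \dots, n,
\]

and the procedures of this part are used to estimate one component $s_j$ of this sum.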

Keywords

Nonasymptotic statistics, model selection, penalization, oracle inequality, nonparametric regression, heteroscedasticity, additive model, adaptivity, minimax rate, Kullback risk.