Institut de Mathématiques de Toulouse

Séminaire de Statistique

by Agnès Lagnoux

Organizers: Mélisande Albert, Dominique Bontemps, Pierre Neuvial

Usual day and place: Tuesdays at 11:00 in room 106 (building 1R1).

  • Tuesday 23 October, 11:00-12:00 - Alix Rigal - Centre National de Recherches Météorologiques

    Evolution of daily temperature due to global change

    Abstract: We consider the problem of estimating non-stationary temperature normals (the mean) at a daily timescale, accounting for the deformations of the annual cycle induced by global change.
    In a second step, we consider not only the mean but the whole distribution. We will present our methodological choices in a quantile regression framework to capture the evolution of percentiles throughout the 21st century.
    Although this technique provides a very convenient and powerful tool for studying changes in the distribution of a climate variable, it raises issues that need to be addressed, such as quantile crossing.

    Location: Room 106, building 1R1
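The quantile-crossing issue mentioned in the abstract is easy to reproduce. The sketch below (not the speaker's method) fits independent linear quantile regressions by minimizing the pinball loss on synthetic seasonal temperature data; the data-generating model, features, and optimizer are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic daily temperatures: annual cycle + linear trend + noise
# (an illustrative stand-in for the non-stationary normals of the talk).
t = np.arange(0, 5 * 365)
X = np.column_stack([
    np.ones_like(t, dtype=float),
    t / 365.0,                       # linear trend (proxy for global change)
    np.cos(2 * np.pi * t / 365.0),   # annual cycle
    np.sin(2 * np.pi * t / 365.0),
])
y = X @ np.array([12.0, 0.05, -8.0, 2.0]) + rng.normal(0, 3, size=t.size)

def pinball(beta, X, y, q):
    """Pinball (check) loss for quantile level q."""
    r = y - X @ beta
    return np.mean(np.maximum(q * r, (q - 1) * r))

def fit_quantile(X, y, q):
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS warm start
    res = minimize(pinball, beta0, args=(X, y, q), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-8})
    return res.x

quantiles = [0.1, 0.5, 0.9]
preds = {q: X @ fit_quantile(X, y, q) for q in quantiles}

# Quantile crossing: because each level is fitted independently, nothing
# prevents the fitted curves for different levels from intersecting.
crossings = int(np.sum(preds[0.9] < preds[0.1]))
print("days where the fitted 0.9-quantile falls below the 0.1-quantile:", crossings)
```

Constrained or rearrangement-based estimators are the usual remedies once such crossings appear.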

  • Tuesday 6 November, 11:00-12:00 - Peter D. Grünwald - CWI, Amsterdam

    A Tight Excess Risk Bound via a Unified PAC-Bayesian-Rademacher-Shtarkov-MDL Complexity

    Abstract: We present a novel notion of complexity that interpolates between and generalizes some classic existing complexity notions in learning theory: for estimators like empirical risk minimization (ERM) with arbitrary bounded losses, it is upper bounded in terms of data-independent Rademacher complexity; for generalized Bayesian estimators, it is upper bounded by the data-dependent information complexity KL(posterior ∥ prior), also known as stochastic or PAC-Bayesian complexity. For (penalized) ERM, the new complexity reduces to (generalized) normalized maximum likelihood (NML) complexity, i.e. a minimax log-loss individual-sequence regret. Our first main result bounds excess risk in terms of the new complexity. Our second main result links the new complexity via Rademacher complexity to L2(P) entropy, thereby generalizing earlier results of Opper, Haussler, Lugosi, and Cesa-Bianchi, who treated the log-loss case with L∞ entropy. Together, these results recover optimal bounds for VC classes and large (polynomial entropy) classes, replacing localized Rademacher complexity by a simpler analysis which almost completely separates the two aspects that determine the achievable rates: 'easiness' (Bernstein) conditions and model complexity.
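As a pointer to one of the classical notions being unified: for a model class M = {p_θ : θ ∈ Θ} over sequences of length n, the Shtarkov/NML complexity is the log-normalizer of the maximum-likelihood "distribution",

```latex
\mathrm{COMP}_n(\mathcal{M})
  = \log \sum_{x^n \in \mathcal{X}^n} \max_{\theta \in \Theta} p_\theta(x^n),
\qquad
p_{\mathrm{NML}}(x^n)
  = \frac{\max_{\theta \in \Theta} p_\theta(x^n)}
         {\sum_{y^n \in \mathcal{X}^n} \max_{\theta \in \Theta} p_\theta(y^n)} ,
```

and p_NML achieves minimax individual-sequence log-loss regret equal to COMP_n(M). The complexity of the talk generalizes this beyond the log-loss setting.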

  • Tuesday 20 November, 11:00-12:00 - Franziska Göbel - University of Potsdam

    On Graph Wavelets

    Abstract: In this talk I will present a multiscale approach to constructing a data-adapted, basis-like (Parseval frame) set of functions F which allows for a decomposition of every square-integrable function defined on the vertices of a finite undirected weighted graph. To this end, we follow the idea of Coulhon et al. (2012) for constructing localized frames on relatively general spaces.
    We look at the properties of F and at its application to denoising, which is based on the Parseval frame property. Given noisy values y_i = f(x_i) + epsilon_i of an unknown function f at a finite number of points x_i, we want to recover f at these points. Based on a neighborhood graph representation of the points x_i, we derive an estimate of f using the frame decomposition and applying a thresholding method to the coefficients.
    If time allows, we furthermore show that, under some assumptions on the underlying space and the sampling, the considered random neighborhood graphs satisfy with high probability a doubling volume condition as well as a local Poincaré inequality. These two properties are essential for the spatial localization of the frame elements in the setting of Coulhon et al. (2012).
    This talk is based on joint work with Gilles Blanchard and Ulrike von Luxburg.
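A toy illustration of frame-based graph denoising (a crude two-band stand-in for the multiscale construction of the talk, not the authors' frame): two spectral filters with g0² + g1² = 1 give a Parseval analysis operator, and thresholding the high-band coefficients denoises a smooth signal. The path graph, filters, noise level, and threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny path graph; in the talk the graph is a neighborhood graph on sampled points.
n = 60
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(1)) - A                      # combinatorial graph Laplacian
lam, U = np.linalg.eigh(L)
lmax = lam[-1]

# Two spectral filters with g0^2 + g1^2 = 1, so that T0^T T0 + T1^T T1 = I:
# the pair (T0, T1) is a Parseval frame analysis operator.
g0 = np.sqrt(np.clip(1 - lam / lmax, 0, 1))    # low-pass band
g1 = np.sqrt(np.clip(lam / lmax, 0, 1))        # high-pass band
T0 = U @ np.diag(g0) @ U.T
T1 = U @ np.diag(g1) @ U.T

# Noisy samples y_i = f(x_i) + epsilon_i of a smooth function.
f = np.sin(np.linspace(0, 2 * np.pi, n))
y = f + rng.normal(0, 0.3, n)

# Analysis, hard-threshold the high-band coefficients, synthesis via the adjoint.
c0, c1 = T0 @ y, T1 @ y
c1[np.abs(c1) < 0.3] = 0.0                     # noise lives mostly in the high band
f_hat = T0.T @ c0 + T1.T @ c1

print("noisy MSE:", np.mean((y - f) ** 2), "denoised MSE:", np.mean((f_hat - f) ** 2))
```

The Parseval property guarantees lossless reconstruction before thresholding; the multiscale frames of the talk refine this two-band split into many localized scales.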

  • Tuesday 27 November, 11:00-12:00 - Carin Ludeña - Universidad Central de Venezuela

    Modular decomposition and random graph models

    Abstract: Decompositions of finite structures have been studied in different branches of discrete mathematics for over 40 years. In 1967, T. Gallai gave a prime decomposition theorem for simple graphs which has given rise to a robust literature on the subject, notably related to the characterization of certain graph properties and to optimization algorithms over graphs. However, there has apparently not been much connection between graph decompositions and applications in probability and statistics. In this talk we will give a brief overview of the general theory, provide some interesting insight into what graph decomposition looks like for common random graph models, and discuss a graph model based on this notion.
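For readers unfamiliar with the notion underlying modular decomposition: a module of a graph is a vertex set whose members are indistinguishable from outside, i.e. every vertex not in the set is adjacent either to all of it or to none of it. A minimal check, assuming an adjacency-set representation (the example graph is illustrative):

```python
def is_module(adj, S):
    """Return True if S is a module of the graph given by adjacency sets adj:
    every vertex outside S sees either all of S or none of S."""
    S = set(S)
    for v in set(adj) - S:
        hits = len(adj[v] & S)
        if hits not in (0, len(S)):
            return False
    return True

# 4-cycle 1-3-2-4-1: vertices 1 and 2 have the same outside neighborhood {3, 4},
# so {1, 2} is a (non-trivial) module; {1, 3} is not, since 2 sees 3 but not 1.
adj = {1: {3, 4}, 2: {3, 4}, 3: {1, 2}, 4: {1, 2}}
print(is_module(adj, {1, 2}))  # True
print(is_module(adj, {1, 3}))  # False
```

Gallai's theorem organizes the maximal modules of a graph into a unique recursive decomposition tree.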

  • Tuesday 4 December, 11:00-12:00 - Mathieu Ribatet - Université de Montpellier

    On the structure of max-stable processes

    Abstract: Max-stable processes play a fundamental role in the spatial modeling of rare events, e.g., floods, heat waves... In this talk we will start from scratch with their spectral representations; a representation that is nothing more than a simple probabilistic construction of this class of processes. We will then look at the particular structure induced by this spectral representation, which will allow us to discuss the difficulty of (conditionally) simulating these processes, to introduce suitable (spatial) dependence measures, and also to touch on inference.
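The spectral representation referred to above can be sketched numerically. The toy simulation below (an illustrative choice of spectral functions, not the speaker's code) truncates de Haan's representation Z(x) = max_i W_i(x)/Γ_i, where the Γ_i are arrival times of a unit-rate Poisson process and the W_i are i.i.d. nonnegative profiles with E[W(x)] ≈ 1; Gaussian storm profiles in the spirit of the Smith model are used here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Grid of spatial locations and storm-profile bandwidth (illustrative values).
x = np.linspace(-5, 5, 101)
sigma = 1.0

def spectral_fn():
    """One i.i.d. spectral function W: a Gaussian 'storm' profile with a
    uniformly drawn centre, scaled so that E[W(x)] is close to 1 on the grid."""
    u = rng.uniform(-10, 10)                         # storm centre
    w = np.exp(-0.5 * ((x - u) / sigma) ** 2)
    return 20.0 * w / (sigma * np.sqrt(2 * np.pi))   # 20 = centre-interval length

# Truncated de Haan representation: Z(x) = max_i W_i(x) / Gamma_i.
Z = np.zeros_like(x)
gamma = 0.0
for _ in range(5000):
    gamma += rng.exponential(1.0)                    # Poisson arrival times
    Z = np.maximum(Z, spectral_fn() / gamma)

print("simulated max-stable field: min =", Z.min(), "max =", Z.max())
```

Since 1/Γ_i decays quickly, the truncated maximum is a good approximation of the full Poisson construction; exact simulation (and conditional simulation, a topic of the talk) requires more care.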