Department of Mathematics

The 23rd Finnish Summer School on Probability Theory

The 23rd Finnish Summer School on Probability Theory will be held at the Mukkula manor in Lahti from June 4th to June 8th, 2001. The course programme consists mainly of four lecture series of about six hours each. A few lectures by course participants may also be given. The school, arranged by the Finnish Graduate School in Stochastics, is financed by the Academy of Finland through the Rolf Nevanlinna Institute, and there is no participation fee.

The summer school begins on Monday June 4th at 1 p.m. Most people arrive a little earlier, since lunch is served at 12 noon in the restaurant at the Mukkula manor (Mukkulan kartanohotelli, Ritaniemenkatu 10, 15240 Lahti). The school ends on Friday June 8th around 12 noon.

Accommodation can be reserved in the nearby summer hotel (Mukkulan kesähotelli). Half board with accommodation in a double room costs 185 FIM per person per night; in a single room, 255 FIM per person per night. Payment is made on site in Lahti. Please mention at the hotel reception if you require a special diet or have other special wishes. Registrations are due by May 10th, 2001.

Welcome to Lahti!

Göran Högnäs

Addresses:

  • Åbo Akademi University, Department of Mathematics, 20500 Åbo
  • E-mail: adahlstr@abo.fi, ghognas@abo.fi
  • Fax: +358 2 215 4865
  • Phone: +358 2 215 4372 (Dahlström), +358 2 215 4224 (Högnäs)
  • Speakers: Robert J. Adler (Technion - Israel Institute of Technology), Stanislaw Kwapien (Warsaw), Ragnar Norberg (London School of Economics) and Christophe Croux (Brussels)

    Course description

    Robert J. Adler: RANDOM FIELDS AND THEIR GEOMETRY

    Random fields have found application in a wide variety of areas, from astrophysics on a galactic scale to brain imaging on a much smaller scale. While the applications may be very different, the underlying theory is the same, and it is the latter that I plan to describe in these lectures.

    1: THE BASICS OF GAUSSIAN RANDOM FIELDS: One of the advantages that Gaussian processes have over their Markov counterparts is that the ``general theory'' is almost independent of the structure of the parameter space. I shall show why this is the case, first by introducing this family of stochastic processes, and then by looking at the issue of sample path regularity from the viewpoint of ``metric entropy''. In doing so we shall be able to treat processes defined on the line, random fields defined in Euclidean space, and even set- and function-indexed processes, all in the same way.
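
    The key estimate behind the metric-entropy viewpoint is Dudley's bound, stated here only for orientation (normalisation conventions vary): E[\sup_{t\in T} X_t] \le K \int_0^{diam(T)} \sqrt{\log N(T,d,\epsilon)}\,d\epsilon, where d(s,t) = (E(X_s - X_t)^2)^{1/2} is the canonical metric on T and N(T,d,\epsilon) is the number of d-balls of radius \epsilon needed to cover T.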

    2-3: MAXIMA OF GAUSSIAN FIELDS: From the point of view of applications, one of the most important issues in the study of any class of real-valued stochastic processes is the precise determination of the exceedance probabilities P\{\sup_{t\in T} X_t > u\}, where T is a parameter set and u is some level. These lectures will concentrate on a number of ways of attacking this problem for Gaussian fields, as well as looking at the structure of Gaussian fields at high levels.
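
    In general these probabilities have no closed form, but they are easy to approximate numerically on a finite grid. A minimal Monte Carlo sketch in Python (the kernel, grid and level are arbitrary choices for illustration, not taken from the lectures):

```python
import numpy as np

# Stationary Gaussian process on a grid t_1, ..., t_n with covariance
# C(s, t) = exp(-|s - t|) (an Ornstein-Uhlenbeck-type kernel, chosen only for
# illustration); estimate P{ sup_{t in T} X_t > u } by direct simulation.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
C = np.exp(-np.abs(t[:, None] - t[None, :]))
L = np.linalg.cholesky(C + 1e-9 * np.eye(len(t)))   # tiny jitter for numerical safety

u = 2.5                                              # the level of interest
n_sim = 20_000
X = L @ rng.standard_normal((len(t), n_sim))         # each column is one sample path
p_hat = (X.max(axis=0) > u).mean()
print(f"Monte Carlo estimate of P(sup X_t > {u}): {p_hat:.4f}")
```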

    4-5: RANDOM GEOMETRY AND EULER CHARACTERISTICS: These lectures will centre around the ``excursion sets'' of Gaussian fields, viz. the sets \{t\in T: X_t > u\} for some level u. I shall discuss their geometric structure, and some of the ways this can be quantified, when T is a simple Euclidean set or an abstract manifold.
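
    For a rough numerical picture of what an excursion set looks like, one can simulate an approximately smooth Gaussian field on a planar grid by smoothing white noise, threshold it, and summarise the resulting set, e.g. by its area and number of connected components. A sketch (again an illustration only; the smoothing construction is not how the fields in the lectures are defined):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

# Approximate a smooth Gaussian field on a 2-D grid by smoothing white noise,
# then look at the excursion set {t : X_t > u}.
rng = np.random.default_rng(0)
noise = rng.standard_normal((200, 200))
X = gaussian_filter(noise, sigma=5.0)
X /= X.std()                       # roughly unit variance

u = 1.5
excursion = X > u                  # boolean mask of the excursion set
labels, n_components = label(excursion)
print(f"excursion set above u={u}: {excursion.mean():.3%} of the grid, "
      f"{n_components} connected components")
```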

    6: BACK TO MAXIMA: In the final lecture we shall see that probabilistic considerations not only make the ``Euler characteristic approach'' (which I shall describe in detail) the ``right'' way to quantify excursion sets, but also show why computations here go a long way towards solving the problems of Lectures 2 and 3.
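
    The heuristic identity at the heart of this approach, stated informally: for large levels u, P\{\sup_{t\in T} X_t > u\} \approx E[\chi(A_u)], where A_u = \{t\in T: X_t > u\} is the excursion set and \chi denotes its Euler characteristic; the right-hand side is computable in many cases where the left-hand side is not.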

    Much of what I will have to say can be found in my review paper in Ann. Appl. Probab. (2000), 1-74.

    Christophe Croux: Robust multivariate methods

    Robust methods are supposed to give reliable results under deviations from the model assumptions, such as the presence of outliers. While there exists a large literature on robust methods for location-scale models and regression models, much less is known about possible robustifications of multivariate statistical methods. In the following talks we will propose and review different approaches to robust multivariate analysis.

    * introduction to robustness concepts:
    Here we will speak about different measures of robustness: the influence function, gross-error sensitivity, breakdown point, and maxbias curve. We will illustrate these concepts for some scale and regression estimators; a small numerical illustration is also sketched in code after this list of talks.

    * robust estimation of the covariance matrix:
    The covariance matrix is a key tool in multivariate statistics. Several proposals have been made to robustify it; we will give an overview and discuss their properties. The focus is on the Minimum Covariance Determinant (MCD) estimator, S-estimators, and sign- and rank-based estimators. An application in multivariate regression analysis will be given (a code sketch of the MCD estimator appears after this list).

    * robust principal components:
    A robust principal component analysis (PCA) can easily be performed by computing the eigenvalues and eigenvectors of a robust estimator of the covariance or correlation matrix of the data. Many simulation studies have been carried out to find out which robust estimator should be used. Another approach to robustifying PCA, based on projection pursuit (PP), has been proposed by Li and Chen (1985); the PP index to be maximized is a given dispersion measure. In this talk the different approaches to robust PCA will be compared (the covariance-based route is sketched in code after this list).

    * robust canonical correlations:
    Canonical correlation analysis (CCA) studies the associations between two sets of variables. The aim is to identify and quantify the relations by maximizing the correlation between linear combinations of the variables in one set and linear combinations of the variables in the other set. Here we introduce, discuss and compare different ways of robustifying CCA: robust estimation of the involved covariance matrices, CCA using the spatial signs of the observations, CCA based on projection pursuit, and CCA by robust alternating regression. (A small plug-in sketch based on a robust scatter matrix follows after this list.)

    * robust discriminant analysis:
    In this talk we consider a robust linear discriminant function based on high-breakdown location and covariance matrix estimators. We derive influence functions for the estimators of the parameters of the discriminant function and for the associated classification error. Within the class of multivariate S-estimators we obtain the most B-robust estimator, i.e. the one minimizing the maximal influence an outlier can have on the classification error.

    * robust factor analysis:
    In this talk a non-standard approach to robust factor analysis is presented, based on an interlocking regression algorithm. The approach is highly robust and also works well when there are more variables than observations. The technique yields a robust biplot, depicting the interaction structure between individuals and variables. The approach is illustrated on real and artificial examples and compared with factor analysis based on robust covariance matrix estimators. The same estimation technique can also fit models with both additive and multiplicative effects to two-way tables, thereby extending the median polish technique.
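
    The following three sketches (Python with numpy, scipy and scikit-learn; all data synthetic and all library choices illustrative, not part of the course material) relate to the talks above. First, the effect of a single gross outlier on a classical versus a robust scale estimate, the phenomenon behind the breakdown point and the gross-error sensitivity:

```python
import numpy as np

# One gross outlier: the sample standard deviation changes drastically, while
# the MAD (median absolute deviation) barely moves.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
x_contaminated = np.append(x, 50.0)        # add a single gross outlier

def mad(v):
    # 1.4826 makes the MAD consistent for the standard deviation at the normal model
    return 1.4826 * np.median(np.abs(v - np.median(v)))

print("std  clean / contaminated:", x.std(ddof=1), x_contaminated.std(ddof=1))
print("MAD  clean / contaminated:", mad(x), mad(x_contaminated))
```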
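
    Second, the covariance-based route to robust PCA: fit a Minimum Covariance Determinant scatter estimate (here via scikit-learn's MinCovDet) and take the eigenvalues and eigenvectors of that matrix instead of the classical sample covariance:

```python
import numpy as np
from sklearn.covariance import MinCovDet

# Toy data: a correlated Gaussian cloud with a few planted outliers.
rng = np.random.default_rng(0)
clean = rng.multivariate_normal([0.0, 0.0, 0.0],
                                [[1.0, 0.8, 0.3],
                                 [0.8, 1.0, 0.2],
                                 [0.3, 0.2, 1.0]], size=500)
outliers = rng.normal(8.0, 0.5, size=(10, 3))
X = np.vstack([clean, outliers])

# Minimum Covariance Determinant estimate of location and scatter.
mcd = MinCovDet(random_state=0).fit(X)

# Robust PCA: eigen-decomposition of the robust covariance matrix
# (columns of eigvecs are the robust principal axes).
eigvals, eigvecs = np.linalg.eigh(mcd.covariance_)
print("robust eigenvalues   :", eigvals[::-1])
print("classical eigenvalues:", np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1])
```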
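
    Third, a plug-in version of robust CCA: estimate one joint scatter matrix robustly, partition it, and read off the canonical correlations as the square roots of the eigenvalues of Sxx^{-1} Sxy Syy^{-1} Syx. The robust scatter estimator (again the MCD) and the toy data are illustrative choices only:

```python
import numpy as np
from scipy.linalg import solve
from sklearn.covariance import MinCovDet

def canonical_correlations(S, p):
    # S: joint covariance of (x, y); p: number of x-variables.
    Sxx, Sxy = S[:p, :p], S[:p, p:]
    Syx, Syy = S[p:, :p], S[p:, p:]
    M = solve(Sxx, Sxy) @ solve(Syy, Syx)
    eig = np.sort(np.linalg.eigvals(M).real)[::-1]
    return np.sqrt(np.clip(eig, 0.0, 1.0))

# Two variable sets sharing one latent factor, plus a few gross outliers.
rng = np.random.default_rng(0)
z = rng.standard_normal((300, 1))
x = np.hstack([z + 0.5 * rng.standard_normal((300, 1)),
               rng.standard_normal((300, 1))])       # x-set: 2 variables
y = np.hstack([z + 0.5 * rng.standard_normal((300, 1)),
               rng.standard_normal((300, 1))])       # y-set: 2 variables
data = np.hstack([x, y])
data[:5] += 20.0                                     # contaminate a few rows

S_robust = MinCovDet(random_state=0).fit(data).covariance_
S_classic = np.cov(data, rowvar=False)
print("robust    canonical correlations:", canonical_correlations(S_robust, p=2))
print("classical canonical correlations:", canonical_correlations(S_classic, p=2))
```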

    Participants

    Registration form

    The 19th Finnish Summer School on Probability Theory, held from June 2 to June 6, 1997

    The 20th Finnish Summer School on Probability Theory, held from June 1 to June 5, 1998

    The 21st Finnish Summer School on Probability Theory, held from May 31 to June 4, 1999

    The 22nd Finnish Summer School on Probability Theory, held from June 5 to June 9, 2000

    Express Bus timetables

    Train (Helsinki-Kouvola-Kotka)

    Summer Hotel Mukkula

    Mukkula's Mansion

    Mukkula Map

    City of Lahti

    City Map
    Leisure Time Activities (in Finnish)
    Happenings (in Finnish)
    Updated 4.6.01 by mateweb@abo.fi