Department of Mathematics

The 25th Finnish Summer School on Probability Theory

The 25th Finnish Summer School on Probability Theory will be held at the Nagu Gammelgård conference center in Nagu/Nauvo from June 2nd to June 6th, 2003. The course program consists mainly of four lecture series of about six hours each. A few lectures by course participants may also be held. The school is arranged by the Finnish Graduate School in Stochastics.

The venue is Nagu Gammelgård.

The summer school begins on Monday, June 2nd, at 12 noon with lunch. The school ends on Friday, June 6th, around 1 p.m. More details about transportation will be given later.

The course fee (15 €), which includes course material, is to be paid on location to Margrét Halldórsdóttir. Accommodation in a single room from Monday to Friday costs 208 €; a double room costs 180 € per person (lunches included). If you do not need accommodation, i.e. meals only, contact the reception on arrival for the necessary arrangements. Payment is made on location in Nagu. Please mention at the hotel reception if you require a special diet or have other special wishes. Registrations are due by May 9th, 2003.

Welcome to Nagu!

Göran Högnäs


  • Åbo Akademi University, Department of Mathematics, 20500 Åbo
  • E-mail: mhalldor@abo.fi, ghognas@abo.fi
  • Fax: +358 2 215 4865
  • Phone: +358 2 215 4372 (Halldórsdóttir), +358 2 215 4224 (Högnäs)
  • Speakers:

    Course descriptions:

    Gerold Alsmeyer: The weighted branching process and its associated stochastic fixed point equation

    Slides from Alsmeyer's lectures are available here (pdf).

    Nizar Touzi: Monte Carlo Methods for Stochastic Differential Equations
    These lectures aim to present recent advances in Monte Carlo methods for stochastic processes based on the Malliavin calculus. We shall use the Malliavin integration by parts formula in order to
    - devise a method for the simultaneous computation of expectations and their sensitivities with respect to the parameters,
    - derive an alternative representation of conditional expectations, and analyze its asymptotic performance in the context of simulation of backward stochastic differential equations.
    Lecture 1. Basics of Monte Carlo methods
    Lecture 2. Simulation of solutions of forward stochastic differential equations, and connection with the Dirichlet problem
    Lecture 3. Introduction to Malliavin calculus
    Lecture 4. Computation of greeks
    Lecture 5. Computation of regression functions
    Lecture 6. Simulation of backward stochastic differential equations, and application to American options
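    As a minimal illustration of the kind of simulation treated in Lectures 1–2 (this sketch is not from the course materials, and all function names and parameter values are illustrative), here is a plain Monte Carlo estimate of E[f(X_T)] for a one-dimensional SDE, with paths generated by the Euler–Maruyama scheme:

    ```python
    import math
    import random

    def euler_maruyama_mc(mu, sigma, x0, T, n_steps, n_paths, f, seed=0):
        """Monte Carlo estimate of E[f(X_T)] for the SDE
        dX_t = mu(X_t) dt + sigma(X_t) dW_t, starting at x0,
        using the Euler-Maruyama discretization."""
        rng = random.Random(seed)
        dt = T / n_steps
        total = 0.0
        for _ in range(n_paths):
            x = x0
            for _ in range(n_steps):
                # one Euler-Maruyama step: drift plus Brownian increment
                x += mu(x) * dt + sigma(x) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            total += f(x)
        return total / n_paths

    # Sanity check on geometric Brownian motion, where E[X_T] = x0 * exp(r*T)
    est = euler_maruyama_mc(mu=lambda x: 0.05 * x,
                            sigma=lambda x: 0.2 * x,
                            x0=1.0, T=1.0, n_steps=100, n_paths=20000,
                            f=lambda x: x)
    ```

    The estimator's error has two sources that the lectures treat separately: the statistical error, of order 1/sqrt(n_paths), and the discretization bias of the Euler scheme, of order 1/n_steps.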

    Keith Worsley: (A) The geometry of random images in astrophysics and brain mapping
    The geometry in the title is not the geometry of lines and angles but the geometry of topology, shape and knots. For example, galaxies are not distributed randomly in the universe, but they tend to form clusters, or sometimes strings, or even sheets of high galaxy density. How can this be handled statistically? The Euler characteristic (EC) of the set of high density regions has been used to measure the topology of such shapes; it counts the number of connected components of the set, minus the number of 'holes', plus the number of 'hollows'. Despite its complex definition, the exact expectation of the EC can be found for some simple models, so that the observed EC can be compared with the expected EC to check the model. A similar problem arises in functional magnetic resonance imaging (fMRI), where the EC is used to detect local increases in brain activity due to an external stimulus. Recent work has extended these ideas to manifolds so that we can detect changes in brain shape via structure masking, surface extraction, and 3D deformation fields. Finally we look at some curious random fields whose excursion sets are strings, and we show using the Seifert representation that these strings can be knotted.
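    For a 2D binary image (where there are no 'hollows'), the Euler characteristic described above can be computed directly by building the cubical complex of the active pixels and counting vertices minus edges plus faces. This is an illustrative sketch, not code from the lectures; the function name and the 4-connectivity convention are my own choices:

    ```python
    def euler_characteristic(A):
        """Euler characteristic of a binary 2D image A (list of 0/1 rows),
        computed as V - E + F over the cubical complex of active pixels."""
        pixels = {(i, j) for i, row in enumerate(A)
                  for j, v in enumerate(row) if v}
        vertices, edges = set(), set()
        for (i, j) in pixels:
            # each pixel is a unit square: 4 corner vertices...
            for di in (0, 1):
                for dj in (0, 1):
                    vertices.add((i + di, j + dj))
            # ...and 4 boundary edges (shared edges are added only once)
            edges.add(('h', i, j)); edges.add(('h', i + 1, j))  # top, bottom
            edges.add(('v', i, j)); edges.add(('v', i, j + 1))  # left, right
        return len(vertices) - len(edges) + len(pixels)

    # One solid pixel: one component, no holes -> EC = 1
    # A 3x3 ring of pixels around an empty center: one component, one hole -> EC = 0
    ring = [[1, 1, 1],
            [1, 0, 1],
            [1, 1, 1]]
    ```

    On an excursion set (the pixels where a random field exceeds a threshold), this count is exactly the quantity whose expectation the lectures compare against the observed value.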

    (B) Detecting changes in brain shape, scale and connectivity via the geometry of random fields
    Three types of data are now available to test for changes in brain shape: 3D binary masks, 2D triangulated surfaces, and trivariate 3D vector displacement data from the non-linear deformations required to align the structure with an atlas standard. We use the Euler characteristic of the excursion set of a random field as a tool to test for localised shape changes. We extend these ideas to scale space, where the scale of the smoothing kernel is added as an extra dimension to the random field. Extending this further still, we look at fields of correlations between all pairs of voxels, which can be used to assess brain connectivity. Shape data is highly non-isotropic, that is, the effective smoothness is not constant across the image, so the usual random field theory does not apply. We propose a solution that warps the data to isotropy using local multidimensional scaling. We then show that the subsequent corrections to the random field theory can be done without actually doing the warping - a result guaranteed in part by the famous Nash Embedding Theorem. This has recently been formalized by Jonathan Taylor, who has extended Robert Adler's random field theory to arbitrary manifolds.

    (C) Recent advances in random field theory
    Since Robert Adler's 1981 book on the geometry of random fields, the many successful applications to astrophysics and brain mapping in the last 10 years have provoked a flurry of new theoretical work. We trace the history of this development over the last 20 years, touching on: Robert Adler's early work on the expected EC of excursion sets; David Siegmund's approach to finding the P-value of the maximum using Weyl's tube formula; Kuriki and Takemura's link between the two; Naiman and Wynn's improved Bonferroni inequalities; Robert Adler's proof that the expected EC really does approximate the P-value of the maximum; Jonathan Taylor's extensions to manifolds; the role of the Nash Embedding Theorem; and Jonathan Taylor's remarkable and unexpected Gaussian Kinematic Fundamental Formula for finding EC densities.

    Sergei Zuyev: Measures everywhere
    The importance of measures in probability and statistics is hard to overstate: probability is itself a measure! Any statistical estimate is, in fact, a solution to an optimisation problem involving an unknown underlying distribution from a given (parametric or not) class of measures. The course focuses on a new variational technique that allows various optimisation problems involving measures to be treated in a general setting and in applications. Emphasis will be placed on the Poisson process case, which admits a nice closed-form expression for the gradients in various constrained optimisation problems. Applications considered include approximation of functions, clustering, design of experiments, optimal placement of telecommunication stations, and estimation of mixture distributions.

    Slides from Zuyev's lectures are available here (pdf).


    Updated 12.05.03 by mateweb@abo.fi