Useful information

This page contains information that may be helpful in planning a visit to Helsinki for the EMS 2017 conference. Topics include lunch options, transportation, running, tennis courts, table tennis and gym, sauna and swimming, and child care.


Helsinki Tourist Information website and Helsinki Tourist Information Office, Pohjoisesplanadi 19.

Helsinki This Week.

Lunch options

There are many restaurants and cafes in the city center where you can get a decent lunch for about 10€. Here is a list of lunch places near the conference venue, to name a few:

Other options, also suitable for dinner

Transportation

Route planners: Public transport route suggestions between two given locations at a given time of day can be obtained from the Journey Planner or from Google Maps.

To get from the Töölö Towers university guest house to the EMS conference venue in 15 minutes, take tram number 2 from the Apollonkatu stop in the Olympiaterminaali direction and get off at Senaatintori after 7 stops.

Bus number 615 operates between the airport and the city center.

Tickets: Can be bought on buses and trams from the driver. They are valid for all means of public transportation (bus, tram, metro and regional trains). It is convenient to buy a day or week ticket from any R-kioski kiosk, or from the HSL (Helsinki Regional Transport Authority) service office at the Central Railway Station metro station. More information can be found at www.hsl.fi/en/tickets-and-fares.

Taxi: Can be ordered e.g. by calling Taksi Helsinki +358 100 0700 or Taksi Kovanen +358 200 6060.

Helsinki also has a system of yellow public city bikes, with 1400 bikes and 140 bike stations, where registered users can pick up and return a bike for a fee. There are bike stations near Töölö Towers and near the conference venue.

Running in Helsinki

If you are a dedicated runner, don't miss the Self-Transcendence 2 mile race in Munkkiniemi park (tram 4 from the conference venue) on Tuesday 25.7.2017 at 19.00, organized by the local Sri Chinmoy Marathon Team. Registration costs 4€.

Tennis courts

Taivallahti Tennis Center, Hiekkarannantie 2, 00100 Helsinki, phone +358 9 4770490.

Kaisaniemi Tennis Courts, in Kaisaniemi park, phone +358 9 3417130.

Table Tennis and Gym

Töölö Sports Hall, Paavo Nurmen kuja 1c, entrance fee 3,50€.

Child care

We would like to encourage parents to take part in the conference.

Finland is a very child-friendly country. Practically every space is accessible with a pram, and restaurants provide high chairs and often baby food or children's menus.

Public transportation: Children between 0 and 6 years travel for free on buses, trams, the metro and regional trains. Moreover, a parent travelling with a baby in a pram/stroller travels for free. For more information, visit www.hsl.fi/en/information/how-use-public-transport/board.

Taxis and passenger cars: By law, you do not need a car seat for your baby when travelling by taxi. However, when booking a taxi in advance, you can request a car seat, and one may be available. Children's accessories can be rented from Lastenturva.

Babysitting services: MLL, Väestöliitto, Stella, Kodinavux

Medical assistance: Mehiläinen, Pikkujätti, Diacor

For emergencies, note that the general emergency number in Finland is 112. You can also contact the Emergency Clinic.

There are many activities for children all year long:

Amusement parks: Murulandia, Snadistadi, Helsingin Leikkiluola, Linnanmäki, Serena waterpark

Activities: Kids fitness, Visit Helsinki - events for children, Visit Helsinki - activities for the family, Annantalo Art Centre for children, Libraries Network, Megazone laser game

Museums/tours: Ateneum Art Museum, Kiasma Modern Art Museum, Kaisaniemi Botanic Garden of Helsinki University, Kumpula Botanic Garden of Helsinki University, National Natural History Museum, Seurasaari open-air museum, Sampo puppet theatre, Korkeasaari Zoo, Tropicario, Fallkulla domestic animals farm, Haltia Nature Centre, Sea Life Aquarium.

If you need help or assistance, feel free to contact the organisers.


Plenary Lectures

  • Professor Mark Girolami, Imperial College London:

    Diffusions and dynamics on statistical manifolds for statistical inference.

    Abstract. The use of Differential Geometry in Statistical Science dates back to the early work of C. R. Rao in the 1940s, when he sought to assess the natural distance between population distributions. The Fisher-Rao metric tensor defined the Riemannian manifold structure of probability measures, and from this local manifold geodesic distances between probability measures could be properly defined. This early work was then taken up by many authors within the statistical sciences, with an emphasis on the study of the efficiency of statistical estimators. The area of Information Geometry has developed substantially and has had major impact in areas of applied statistics such as Machine Learning and Statistical Signal Processing. A different perspective on the Riemannian structure of statistical manifolds can be taken to make breakthroughs in contemporary statistical modelling problems. Langevin diffusions and Hamiltonian dynamics on the manifold of probability measures are defined to obtain Markov transition kernels for Monte Carlo based inference.
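
    For orientation, a minimal sketch of the objects alluded to above (notation is ours, not taken from the talk): the Fisher-Rao metric tensor of a parametric family, and a Langevin diffusion preconditioned by a metric G(θ),

      \[
        g_{ij}(\theta) = \mathbb{E}_{x \sim p(\cdot \mid \theta)}\!\left[ \frac{\partial \log p(x \mid \theta)}{\partial \theta_i}\, \frac{\partial \log p(x \mid \theta)}{\partial \theta_j} \right],
        \qquad
        d\theta_t = \tfrac{1}{2}\, G(\theta_t)^{-1} \nabla_\theta \log \pi(\theta_t)\, dt + G(\theta_t)^{-1/2}\, dB_t .
      \]

    The full manifold Langevin diffusion carries an additional drift term involving derivatives of G(θ), omitted here; discretizing such diffusions, with a Metropolis correction, yields the Markov transition kernels mentioned in the abstract.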

  • Professor Gerda Claeskens, KU Leuven:

    Effects of model selection and weight choice on inference.

    Abstract. Weights may be introduced in the estimation process in several ways: estimators may be weighted by zero/one weights in a model selection procedure such that only a ‘selected’ estimator is kept for further consideration; weighted estimators may employ more general weights, which can be optimised in some fashion; or weights can be introduced during the estimation stage, resulting in so-called composite estimators which minimise a weighted loss function. Several such estimation strategies are discussed and compared. In general, the randomness of the weights makes inference challenging. For some special cases, including random 0/1 weights from selection by Akaike’s information criterion, it is possible to construct asymptotic confidence regions which are uniformly valid and which incorporate the selection uncertainty.
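
    As a concrete illustration of the weights discussed above (a textbook example, not necessarily the estimators treated in the talk): with candidate models m = 1, ..., K and a focus parameter estimated by \hat\mu_m under model m, selection by AIC corresponds to random 0/1 weights, while smoothed "Akaike weights" are a common alternative,

      \[
        w_m = \mathbf{1}\{\mathrm{AIC}_m = \min_k \mathrm{AIC}_k\}
        \qquad\text{or}\qquad
        w_m = \frac{\exp(-\tfrac{1}{2}\mathrm{AIC}_m)}{\sum_{k=1}^{K} \exp(-\tfrac{1}{2}\mathrm{AIC}_k)},
        \qquad
        \hat\mu = \sum_{m=1}^{K} w_m \hat\mu_m .
      \]

    Because the weights depend on the same data as the estimators, naive confidence intervals for \hat\mu ignore exactly the selection uncertainty the abstract refers to.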

  • Professor Alexander Holevo, Steklov Mathematical Institute:

    Quantum Shannon Theory.

    Abstract. The notions of channel and information capacity are central to classical Shannon theory. Quantum Shannon theory is a mathematical discipline which uses operator and matrix analysis and various asymptotic techniques to study the laws of information processing in systems obeying the rules of quantum physics. From the mathematical point of view, quantum channels are normalized completely positive maps of operator algebras, the analog of Markov maps in noncommutative probability theory, playing the role of morphisms in the category of quantum systems. This talk presents basic coding theorems providing analytical expressions for the capacities of quantum channels in terms of various entropic quantities. The remarkable role of a specific kind of quantum correlation, entanglement, as a novel communication resource is stressed. We report on the solution of exciting mathematical problems, such as the “Gaussian optimizers” problem, concerning computation of these entropic quantities for the theoretically and practically important class of Bosonic Gaussian channels.
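
    For orientation, one example of the entropic capacity expressions referred to above (a standard result, stated here as background rather than as a summary of the talk): the Holevo-Schumacher-Westmoreland coding theorem expresses the classical capacity of a quantum channel Φ with product (unentangled) inputs as

      \[
        \chi(\Phi) = \max_{\{p_i, \rho_i\}} \left[ S\!\left( \Phi\!\left( \sum_i p_i \rho_i \right) \right) - \sum_i p_i\, S\!\left( \Phi(\rho_i) \right) \right],
        \qquad
        S(\rho) = -\operatorname{Tr}\, \rho \log \rho ,
      \]

    with the full classical capacity obtained by regularizing χ over many uses of the channel.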

  • Professor Yann LeCun, Facebook AI Research & New York University:

    Deep learning: A statistical puzzle.

    Abstract. Deep learning is at the root of revolutionary progress in visual and auditory perception by computers, and is pushing the state of the art in natural language understanding, dialog systems and language translation. Deep learning systems are deployed everywhere, from self-driving cars to social network content filtering, search engine ranking and medical image analysis. A deep learning system is typically an “almost” differentiable function, composed of multiple highly non-linear steps, parametrized by a numerical vector with 10^7 to 10^9 dimensions, and whose evaluation on a single sample requires 10^9 to 10^10 numerical operations. Training such a system consists in optimizing a highly non-convex objective, averaged over millions of training samples, using a stochastic gradient optimization procedure. How can that possibly work? The fact that it does work very well is one of the theoretical puzzles of deep learning.
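
    To make "a composition of non-linear steps trained by stochastic gradient descent" concrete, here is a deliberately tiny, purely illustrative NumPy sketch (a toy two-layer network on synthetic data, nothing like the 10^7 to 10^9 parameter systems described above):

      # Minimal, purely illustrative sketch (not from the talk): a two-layer
      # network trained by stochastic gradient descent on a non-convex
      # least-squares objective, using toy one-dimensional regression data.
      import numpy as np

      rng = np.random.default_rng(0)

      # Toy data: y = sin(x) + noise.
      X = rng.uniform(-3.0, 3.0, size=(1000, 1))
      y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

      # Parameters of the "almost differentiable" function: two affine maps
      # with a ReLU non-linearity in between.
      W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
      W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

      lr, batch = 0.05, 32
      for step in range(2000):
          idx = rng.integers(0, len(X), size=batch)   # draw a mini-batch
          x, t = X[idx], y[idx]
          h = np.maximum(x @ W1 + b1, 0.0)            # non-linear (ReLU) step
          pred = h @ W2 + b2
          err = pred - t                              # d(0.5*||err||^2)/d(pred)
          # Backpropagation: chain rule through both layers.
          gW2 = h.T @ err / batch; gb2 = err.mean(axis=0)
          dh = (err @ W2.T) * (h > 0)                 # subgradient at the ReLU kink
          gW1 = x.T @ dh / batch; gb1 = dh.mean(axis=0)
          # Stochastic gradient step on the non-convex objective.
          W1 -= lr * gW1; b1 -= lr * gb1
          W2 -= lr * gW2; b2 -= lr * gb2

      print("final mini-batch MSE:", float((err ** 2).mean()))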

  • Professor Martin Wainwright, University of California at Berkeley:

    Pairwise ranking and crowd-sourcing: Statistical models and computational challenges (with Nihar Shah, Sivaraman Balakrishnan and Aditya Guntuboyina).

    Abstract. Many modern data sets take the form of pairwise comparisons, in which binary judgements are made about pairs of items. Examples include the outcomes of matches between tennis players, ratings of the relevance of search queries, and the outputs of crowd-sourcing engines. We discuss some statistical models for data of this type, along with the computational challenges that arise in performing estimation and rank aggregation with such models.
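
    One standard parametric model for such data, given here for orientation (the talk may treat this model, more general nonparametric ones, or both), is the Bradley-Terry-Luce model: each item i carries a latent score w_i and

      \[
        \Pr(\text{item } i \text{ beats item } j) = \frac{e^{w_i}}{e^{w_i} + e^{w_j}} ,
      \]

    so that estimation and rank aggregation reduce to recovering the score vector w from noisy binary outcomes.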

  • Professor Alison Etheridge, University of Oxford:

    Modelling evolution in a spatial continuum

    Abstract. Since the pioneering work of Fisher, Haldane and Wright at the beginning of the 20th Century, mathematics has played a central role in theoretical population genetics. In turn, population genetics has provided the motivation both for important classes of probabilistic models, such as coalescent processes, and for deterministic models, such as the celebrated Fisher-KPP equation. Whereas coalescent models capture ‘relatedness’ between genes, the Fisher-KPP equation captures something of the interaction between natural selection and spatial structure. What has proved to be remarkably difficult is to combine the two, at least in the biologically relevant setting of a two-dimensional spatial continuum. In this talk we describe some of the challenges of modelling evolution in a spatial continuum, present a model that addresses those challenges, and, as time permits, describe some applications.
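
    For reference, one standard form of the Fisher-KPP equation mentioned above, written for the frequency u(x, t) of an advantageous allele (constants vary by convention), is

      \[
        \frac{\partial u}{\partial t} = m\, \Delta u + s\, u (1 - u),
      \]

    where m measures dispersal (diffusion) and s the strength of selection.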

  • Professor Hannu Oja, University of Turku:

    Scatter matrices and linear dimension reduction (with Klaus Nordhausen, David E. Tyler and Joni Virta)

    Abstract. Most linear dimension reduction methods proposed in the literature can be formulated using a relevant pair of scatter matrices, see e.g. Tyler et al. (2009), Bura and Yang (2011) and Liski et al. (2014). The eigenvalues and eigenvectors of one scatter matrix with respect to another one can be used to determine the dimension of the signal subspace as well as the projection to this subspace. In this talk, three classical dimension reduction methods, namely principal component analysis (PCA), fourth order blind identification (FOBI) and sliced inverse regression (SIR), are considered in detail. The first two moments of subsets of the eigenvalues are used to test for the dimension of the signal space. The limiting null distributions of the test statistics are given and bootstrap strategies are suggested for small sample sizes. The theory is illustrated with simulations and real data examples. The talk is in part based on Nordhausen et al. (2017).
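
    A hedged sketch of the two-scatter-matrix idea (illustrative only; the particular scatter functionals, tests and limiting distributions are those of the references cited above): with S1 the ordinary covariance matrix and S2 a fourth-moment based scatter, the eigenvectors of S1^{-1} S2 supply candidate projection directions, for example:

      # Illustrative sketch: linear dimension reduction from a pair of scatter
      # matrices.  S1 is the ordinary covariance; S2 is a fourth-moment based
      # scatter (FOBI-style).  The eigen-decomposition of S1^{-1} S2, computed
      # as the generalized eigenproblem S2 v = lambda S1 v, gives the directions.
      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(1)

      # Toy data: a 2-dimensional non-Gaussian "signal" mixed into 5 dimensions.
      n, p = 2000, 5
      signal = rng.exponential(size=(n, 2)) - 1.0
      noise = rng.normal(size=(n, 3))
      X = np.hstack([signal, noise]) @ rng.normal(size=(p, p)).T

      Xc = X - X.mean(axis=0)
      S1 = np.cov(Xc, rowvar=False)                   # first scatter: covariance

      # Second scatter: reweight observations by squared Mahalanobis distance
      # (up to scaling, the fourth-order scatter matrix used by FOBI).
      r2 = np.einsum('ij,jk,ik->i', Xc, np.linalg.inv(S1), Xc)
      S2 = (Xc * r2[:, None]).T @ Xc / (n * (p + 2))

      # Eigenvalues/eigenvectors of S2 with respect to S1.
      evals, evecs = eigh(S2, S1)
      print("eigenvalues (close to 1 for Gaussian directions):", np.round(evals, 3))

      # Keep the directions whose eigenvalues deviate most from the Gaussian value 1.
      order = np.argsort(np.abs(evals - 1.0))[::-1]
      Z = Xc @ evecs[:, order[:2]]                    # estimated signal subspace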

  • Professor John Aston, University of Cambridge:

    Functional object data analysis

    Abstract. Functional Data Analysis has grown into a mature area of statistics over the last 20 years or so, but it is still predominantly based on the notion that the data are one-dimensional i.i.d. curves belonging to some smooth Euclidean-like space. However, there have been many recent examples arising from the explosion of data being recorded in science and medicine that do not conform to these notions. Functional Object Data Analysis looks at the situation where the objects are functional-like, in that they are well represented in infinite dimensional spaces, but where there are other considerations such as geometry or higher dimensionality. We will examine cases where the data is multidimensional, where it no longer lives in a Euclidean space, and where the objects are related, such as in space or time. Including the data’s intrinsic constraints can profoundly enhance the analysis. Examples from Linguistics, Image Analysis and Forensics will help illustrate the ideas.