News and Updates

  • Five papers presented at NeurIPS 2021 (December 5, 2021)

    My group is presenting five papers at the main conference and one workshop paper at NeurIPS this year.

  • Keynote at the NeurIPS Meetup Copenhagen (December 5, 2021)

    After a two-year travel break (since NeurIPS 2019), I’m giving a live keynote at the official NeurIPS satellite event in Copenhagen.

  • Appointed as ELLIS Scholar (October 25, 2021)

    I have been appointed as an ELLIS Scholar in Theory, Algorithms and Computations of Modern Learning Systems under the European Laboratory for Learning and Intelligent Systems (ELLIS).

  • Academy of Finland Research Fellow (September 1, 2021)

    I have received five-year personal funding from the Academy of Finland to support my research.

  • Member of the Young Academy Finland (September 1, 2021)

    I have been appointed as a member of the Young Academy Finland for the term 2021–2025.

Research Highlights

Simo Särkkä, Arno Solin

Applied Stochastic Differential Equations
Cambridge University Press, 2019
Abstract: Stochastic differential equations are differential equations whose solutions are stochastic processes. They exhibit appealing mathematical properties that are useful in modeling uncertainties and noisy phenomena in many disciplines. This book is motivated by applications of stochastic differential equations in target tracking and medical technology and, in particular, their use in methodologies such as filtering, smoothing, parameter estimation, and machine learning. It builds an intuitive hands-on understanding of what stochastic differential equations are...
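
To give a hands-on flavour of the topic, the sketch below simulates an Ornstein–Uhlenbeck process (a classic example of an SDE) with the Euler–Maruyama scheme; the parameter values and variable names are illustrative, not taken from the book.

```python
import numpy as np

# Euler-Maruyama simulation of the Ornstein-Uhlenbeck SDE
#   dx = -lam * x * dt + q * d(beta),
# where beta is a Brownian motion. Parameters are illustrative.
rng = np.random.default_rng(0)
lam, q = 0.5, 1.0           # drift rate and diffusion coefficient (assumed)
dt, n_steps = 0.01, 1000    # time step and number of steps
x = np.zeros(n_steps + 1)
for k in range(n_steps):
    x[k + 1] = x[k] - lam * x[k] * dt + q * np.sqrt(dt) * rng.standard_normal()
```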

Arno Solin, Simo Särkkä

Hilbert space methods for reduced-rank Gaussian process regression
Statistics and Computing, 2020
Abstract: This paper proposes a novel scheme for reduced-rank Gaussian process regression. The method is based on an approximate series expansion of the covariance function in terms of an eigenfunction expansion of the Laplace operator in a compact subset of R^d. On this approximate eigenbasis the eigenvalues of the covariance function can be expressed as simple functions of the spectral density of the Gaussian process, which allows the GP inference to be solved...
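
A minimal 1D sketch of the idea, assuming a squared-exponential kernel and the domain [-L, L]: the kernel is approximated in the Laplacian eigenbasis, with the spectral density supplying the prior variances of the feature weights. Function names and parameter values here are illustrative.

```python
import numpy as np

# Reduced-rank GP regression sketch on [-L, L] (1D, squared-exponential
# kernel assumed). Eigenfunctions of the negative Laplacian with
# Dirichlet boundaries; the kernel eigenvalues come from the spectral
# density evaluated at the square-root Laplacian eigenvalues.
def phi(x, j, L):
    return np.sin(np.pi * j * (x + L) / (2 * L)) / np.sqrt(L)

def spectral_density_se(w, sigma2=1.0, ell=0.5):
    return sigma2 * np.sqrt(2 * np.pi) * ell * np.exp(-0.5 * (ell * w) ** 2)

rng = np.random.default_rng(0)
L, m, noise = 2.0, 32, 0.1                    # half-width, basis size, noise var.
x = np.linspace(-1.5, 1.5, 100)
y = np.sin(3 * x) + 0.3 * rng.standard_normal(100)

j = np.arange(1, m + 1)
sqrt_lam = np.pi * j / (2 * L)                # sqrt of Laplacian eigenvalues
Phi = phi(x[:, None], j[None, :], L)          # (n, m) feature matrix
Lam = np.diag(spectral_density_se(sqrt_lam))  # prior covariance of weights

# Posterior mean weights (Bayesian linear regression in the feature basis);
# predictions at new inputs x_new are phi(x_new, j, L) @ w_mean.
w_mean = np.linalg.solve(Phi.T @ Phi + noise * np.linalg.inv(Lam), Phi.T @ y)
```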

Severi Rissanen, Markus Heinonen, Arno Solin

Generative modelling with inverse heat dissipation
International Conference on Learning Representations (ICLR), 2023
Abstract: While diffusion models have shown great success in image generation, their noise-inverting generative process does not explicitly consider the structure of images, such as their inherent multi-scale nature. Inspired by diffusion models and the empirical success of coarse-to-fine modelling, we propose a new diffusion-like model that generates images through stochastically reversing the heat equation, a PDE that locally erases fine-scale information when run over the 2D plane of the image. We interpret...
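
The forward direction of the model is easy to illustrate: running the heat equation on an image for time t is equivalent to Gaussian blurring with standard deviation sqrt(2t), up to boundary handling. The sketch below shows only this forward dissipation with an off-the-shelf blur; the paper's actual model solves the PDE spectrally and learns the stochastic reverse process.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Forward heat dissipation only: evolving the heat equation for time t
# equals Gaussian blurring with sigma = sqrt(2 * t), up to boundary
# effects. The generative model (not shown) learns to reverse this.
def heat_dissipate(img, t):
    return gaussian_filter(img, sigma=np.sqrt(2.0 * t))

img = np.random.default_rng(0).random((64, 64))  # stand-in for an image
coarse = heat_dissipate(img, t=4.0)              # fine-scale detail erased
```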

Lassi Meronen, Christabella Irwanto, Arno Solin

Stationary activations for uncertainty calibration in deep learning
Advances in Neural Information Processing Systems 33 (NeurIPS), 2020
Abstract: We introduce a new family of non-linear neural network activation functions that mimic the properties induced by the widely-used Matérn family of kernels in Gaussian process (GP) models. This class spans a range of locally stationary models of various degrees of mean-square differentiability. We show an explicit link to the corresponding GP models in the case that the network consists of one infinitely wide hidden layer. In the limit of infinite smoothness...
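
For reference, here is a sketch of the Matérn kernel family that the proposed activations are designed to mimic, using the standard closed forms for the half-integer smoothness values nu = 1/2, 3/2, 5/2. Note that this sketches the kernels themselves, not the derived activation functions from the paper.

```python
import numpy as np

# Standard closed-form Matern kernels for half-integer smoothness; these
# are the GP covariances whose properties the activations mimic.
def matern(r, ell=1.0, nu=1.5):
    s = np.sqrt(2.0 * nu) * np.abs(r) / ell
    if nu == 0.5:
        return np.exp(-s)                          # exponential kernel
    if nu == 1.5:
        return (1.0 + s) * np.exp(-s)
    if nu == 2.5:
        return (1.0 + s + s**2 / 3.0) * np.exp(-s)
    raise ValueError("only nu in {0.5, 1.5, 2.5} sketched here")

r = np.linspace(-3, 3, 7)
k = matern(r, ell=1.0, nu=2.5)   # a smoother member of the family
```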

Arno Solin, Manon Kok

Know your boundaries: Constraining Gaussian processes by variational harmonic features
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019
Abstract: Gaussian processes (GPs) provide a powerful framework for extrapolation, interpolation, and noise removal in regression and classification. This paper considers constraining GPs to arbitrarily-shaped domains with boundary conditions. We solve a Fourier-like generalised harmonic feature representation of the GP prior in the domain of interest, which both constrains the GP and attains a low-rank representation that is used for speeding up inference. The method scales as O(nm^2) in prediction and O(m^3) in...
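
A 1D sketch of the boundary-constraint idea: in the Laplacian eigenbasis with Dirichlet boundary conditions, every feature vanishes on the boundary, so any function built from these features satisfies the zero-boundary constraint by construction. The function name and values are illustrative; the paper handles arbitrarily-shaped domains and full variational inference over the feature weights.

```python
import numpy as np

# Dirichlet (zero-boundary) harmonic features on [0, L]: each basis
# function is exactly zero at x = 0 and x = L, so any GP expressed in
# this basis is constrained to vanish on the boundary by construction.
def harmonic_features(x, m, L=1.0):
    j = np.arange(1, m + 1)
    return np.sqrt(2.0 / L) * np.sin(np.pi * j[None, :] * x[:, None] / L)

x = np.array([0.0, 0.25, 0.5, 1.0])
Phi = harmonic_features(x, m=8)
assert np.allclose(Phi[0], 0.0) and np.allclose(Phi[-1], 0.0)  # boundary rows
```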

Arno Solin, James Hensman, Richard E. Turner

Infinite-horizon Gaussian processes
Advances in Neural Information Processing Systems 31 (NeurIPS), 2018
Abstract: Gaussian processes provide a flexible framework for forecasting, removing noise, and interpreting long temporal datasets. State space modelling (Kalman filtering) enables these non-parametric models to be deployed on long datasets by reducing the complexity to linear in the number of data points. The complexity is still cubic in the state dimension m which is an impediment to practical application. In certain special cases (Gaussian likelihood, regular spacing) the GP posterior will reach...
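
As background for the state-space view, the sketch below runs a standard scalar Kalman filter for a Matérn-1/2 (Ornstein–Uhlenbeck) GP on an evenly spaced grid, which costs O(n) in the number of data points. Hyperparameter values are illustrative; the paper's contribution, the infinite-horizon approximation, additionally tames the cubic cost in the state dimension.

```python
import numpy as np

# Scalar Kalman filter for a Matern-1/2 (Ornstein-Uhlenbeck) GP with
# kernel k(t) = sigma2 * exp(-|t| / ell) on an evenly spaced grid.
# Cost is linear in the number of observations.
rng = np.random.default_rng(0)
sigma2, ell, R, dt = 1.0, 1.0, 0.1, 0.1   # illustrative hyperparameters
A = np.exp(-dt / ell)                     # exact discrete-time transition
Q = sigma2 * (1.0 - A**2)                 # process noise keeping variance sigma2

y = np.sin(0.5 * np.arange(200) * dt) + np.sqrt(R) * rng.standard_normal(200)
m, P = 0.0, sigma2                        # stationary prior on the state
filtered = []
for yk in y:
    m, P = A * m, A * A * P + Q             # predict
    K = P / (P + R)                         # Kalman gain
    m, P = m + K * (yk - m), (1.0 - K) * P  # update with observation yk
    filtered.append(m)
```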

See all publications...

Videos

In the News