Research Seminar

# Mathematics of Computation

## Upcoming talks

Thursday, 07.12.2017, 13:00 (s.t.), We6 6.020, Kristin Kirchner (Chalmers)

## Variational Methods for Moments of Solutions to Stochastic Differential Equations

We consider parabolic stochastic partial differential equations with multiplicative Wiener noise. For the second moment of the mild solution, a deterministic space-time variational problem is derived. Well-posedness is proven on projective and injective tensor product spaces as trial and test spaces. From these results, a deterministic equation for the covariance function is deduced. These deterministic equations in variational form are used to derive numerical methods for approximating the second moment of solutions to stochastic ordinary and partial differential equations. For the canonical example of a stochastic ordinary differential equation with multiplicative noise, the geometric Brownian motion, we introduce and analyze Petrov-Galerkin discretizations based on tensor product piecewise polynomials. For approximating the second moment of solutions to stochastic partial differential equations, we then propose conforming space-time Petrov-Galerkin discretizations. In both cases, we derive stability bounds in the natural tensor product spaces. Numerical experiments validate the theoretical results.
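For orientation, here is a short illustration (not taken from the abstract) of how a deterministic equation for the second moment arises in the canonical example mentioned above. For geometric Brownian motion

$$dX_t = \mu X_t\,dt + \sigma X_t\,dW_t, \qquad X_0 = x_0,$$

Itô's formula applied to $X_t^2$ and taking expectations yields the deterministic ordinary differential equation

$$\frac{d}{dt}\,\mathbb{E}[X_t^2] = (2\mu + \sigma^2)\,\mathbb{E}[X_t^2], \qquad \mathbb{E}[X_0^2] = x_0^2,$$

so that $\mathbb{E}[X_t^2] = x_0^2\, e^{(2\mu + \sigma^2)t}$. The space-time variational formulations in the talk generalize this idea to the second moment and covariance of mild solutions of parabolic SPDEs.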

Thursday, 14.12.2017, 10:15 (s.t.), We6 6.020, Peter Oswald (Bonn) (joint work with M. Griebel)

## Schwarz iterative methods and RKHS theory

Schwarz iterative methods in Hilbert spaces are a recurring theme in my collaboration with M. Griebel. Lately, we became interested in greedy and stochastic versions of this class of iterative methods, where a variational problem in an infinite-dimensional Hilbert space $H$ is decomposed into infinitely many auxiliary subproblems. Convergence of particular instances of the Schwarz iteration (called relaxed greedy and averaged stochastic descent, respectively) can be proved, and algebraic rates of convergence can be obtained for large subsets of $H$ (in contrast to finite decompositions, where rates are exponential in the number of iteration steps). When working on the stochastic versions, we noticed a connection to the theory of reproducing kernel Hilbert spaces and online learning algorithms. It turns out that the iterative solution of variational problems in a Hilbert space via subproblem solves based on space splittings can always be viewed as online learning of a variable Hilbert space valued function using kernel methods. Even though this connection does not help performing the algorithm, it sheds new light on the convergence proofs. The introduction of RKHS of variable Hilbert space valued functions may also have some value for statistical learning; this hope, however, is not yet substantiated.
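As a finite-dimensional toy illustration (a minimal sketch, not the algorithm analyzed in the talk), a stochastic subspace-correction iteration can be realized by drawing a one-dimensional subproblem at random in each step and solving it exactly; for a linear system this reduces to a randomized Kaczmarz-type projection. All names below are hypothetical.

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def stochastic_subspace_descent(A, b, num_steps=20000, seed=0):
    """Solve the linear system A x = b by one-dimensional subspace
    corrections chosen uniformly at random (a Kaczmarz-type iteration;
    an illustrative sketch of stochastic subproblem solves only)."""
    rng = random.Random(seed)
    x = [0.0] * len(A[0])
    norms = [dot(row, row) for row in A]        # ||a_i||^2 per row/subspace
    for _ in range(num_steps):
        i = rng.randrange(len(A))               # random subproblem index
        # exact solve of the 1-d subproblem: orthogonal projection of x
        # onto the hyperplane a_i . x = b_i
        r = (b[i] - dot(A[i], x)) / norms[i]
        x = [xj + r * aij for xj, aij in zip(x, A[i])]
    return x

# small consistent system with known solution x_true = (1, -2, 3)
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
x_true = [1.0, -2.0, 3.0]
b = [dot(row, x_true) for row in A]
x = stochastic_subspace_descent(A, b)
err = max(abs(xi - ti) for xi, ti in zip(x, x_true))
```

In this finite setting the iteration converges at a rate exponential in the step count; the point of the talk is the infinite splitting case, where only algebraic rates on subsets of $H$ can be expected.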