Optimisation algorithms in Statistics II, 3.5 credits

Building on the topics discussed in the first part of the course, we continue by deepening the theoretical basis for stochastic optimisation algorithms. Specifically, we discuss the theory behind Stochastic Gradient Ascent (including momentum and adaptive step sizes), Simulated Annealing, and Particle Swarm Optimisation, together with theoretical results on convergence and convergence speed.
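
As a small illustration of the first topic, the sketch below shows plain stochastic gradient ascent with a momentum term and a decreasing step size on a toy quadratic objective. The objective, noise level, step-size schedule, and momentum parameter are chosen purely for illustration and are not taken from the course material.

import numpy as np

rng = np.random.default_rng(0)

def noisy_gradient(theta):
    # Toy concave objective f(theta) = -||theta - 1||^2 / 2;
    # its exact gradient is (1 - theta), observed here with Gaussian noise.
    return (1.0 - theta) + rng.normal(scale=0.5, size=theta.shape)

theta = np.zeros(2)      # starting point (illustrative)
velocity = np.zeros(2)   # momentum (exponentially averaged gradient)
beta = 0.9               # momentum parameter (illustrative)

for t in range(1, 1001):
    alpha = 1.0 / t      # decreasing step size, as in Robbins-Monro-type schemes
    velocity = beta * velocity + (1 - beta) * noisy_gradient(theta)
    theta = theta + alpha * velocity

print(theta)             # should end up near the maximiser (1, 1)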

See http://gauss.stat.su.se/phd/oasi/optimisation2.html for more information.

Most welcome to the course!
Frank Miller, Department of Statistics, Stockholm University

Course Data
Type of schedule: Travel-friendly schedule
University: Stockholm
Level: PhD
Credits (ECTS): 3.5
Offered: 2021:1