Imperial College London

Dr Nikolas Kantas

Faculty of Natural Sciences, Department of Mathematics

Reader in Statistics

Contact

 

+44 (0)20 7594 2772
n.kantas

Location

 

538 Huxley Building
South Kensington Campus

Publications

Citation

BibTeX format

@inproceedings{Kantas:2019,
author = {Kantas, N and Parpas, P and Pavliotis, GA},
title = {The sharp, the flat and the shallow: Can weakly interacting agents learn to escape bad minima?},
url = {http://arxiv.org/abs/1905.04121v1},
year = {2019}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB  - An open problem in machine learning is whether flat minima generalize better and how to compute such minima efficiently. This is a very challenging problem. As a first step towards understanding this question we formalize it as an optimization problem with weakly interacting agents. We review appropriate background material from the theory of stochastic processes and provide insights that are relevant to practitioners. We propose an algorithmic framework for an extended stochastic gradient Langevin dynamics and illustrate its potential. The paper is written as a tutorial, and presents an alternative use of multi-agent learning. Our primary focus is on the design of algorithms for machine learning applications; however the underlying mathematical framework is suitable for the understanding of large scale systems of agent-based models that are popular in the social sciences, economics and finance.
AU  - Kantas, N
AU  - Parpas, P
AU  - Pavliotis, GA
PY  - 2019///
TI  - The sharp, the flat and the shallow: Can weakly interacting agents learn to escape bad minima?
UR  - http://arxiv.org/abs/1905.04121v1
UR  - http://hdl.handle.net/10044/1/75863
ER  -
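
The abstract describes an extended stochastic gradient Langevin dynamics in which many agents run noisy gradient descent while interacting weakly. As a minimal sketch of that general idea (not the paper's algorithm), the Python snippet below simulates an Euler-Maruyama discretisation of coupled Langevin diffusions on a toy one-dimensional loss, using a quadratic mean-attraction coupling; the loss function, the coupling choice, and all parameter names and values (eta, beta, lam) are illustrative assumptions.

import numpy as np

def grad_loss(theta):
    # Gradient of a toy non-convex loss V(t) = t^4 - 3t^2 + 0.5t with two
    # unequal wells; stands in for a stochastic mini-batch gradient.
    return 4 * theta**3 - 6 * theta + 0.5

def interacting_sgld(n_agents=20, n_steps=5000, eta=1e-3,
                     beta=10.0, lam=1.0, seed=0):
    # Euler-Maruyama discretisation of n_agents Langevin diffusions,
    # weakly coupled through attraction to the ensemble mean (the
    # gradient of a quadratic interaction potential). All defaults
    # here are illustrative assumptions, not the paper's settings.
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=n_agents)              # independent random initialisation
    for _ in range(n_steps):
        drift = -grad_loss(theta)                  # per-agent gradient descent term
        coupling = -lam * (theta - theta.mean())   # weak pull toward the ensemble mean
        noise = rng.normal(size=n_agents) * np.sqrt(2.0 * eta / beta)
        theta = theta + eta * (drift + coupling) + noise
    return theta

print(interacting_sgld())   # agents should cluster near one of the minima

With lam set to zero the agents reduce to independent SGLD runs; a moderate lam lets an agent stuck in a sharp, narrow minimum be pulled toward the ensemble consensus, which is roughly the escape mechanism the abstract's multi-agent framework formalizes.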