Prof. Gérard Biau

Chair: Silvia Salini

09:55 - 10:55, Aula 9


Deep residual networks and differential equations


Abstract

Deep learning has become a prominent approach for many applications, such as computer vision or natural language processing. However, the mathematical understanding of these methods is still incomplete. A recent line of work considers neural networks as discretized versions of differential equations. I will first give an overview of this emerging field and then discuss new results on residual neural networks, which are state-of-the-art deep learning models.
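
To make the connection mentioned in the abstract concrete, here is a minimal illustrative sketch (not material from the talk): a residual block x_{k+1} = x_k + h f(x_k) can be read as one explicit Euler step of the ODE dx/dt = f(x). The function names (f, resnet_forward) and the toy weights are illustrative assumptions, not part of the speaker's work.

import numpy as np

def f(x, W1, W2):
    # Simple two-layer residual mapping: f(x) = W2 tanh(W1 x)
    return W2 @ np.tanh(W1 @ x)

def resnet_forward(x, weights, h=1.0):
    # Stacking residual blocks with step size h amounts to explicit Euler
    # integration of dx/dt = f(x, W(t)) over len(weights) steps.
    for W1, W2 in weights:
        x = x + h * f(x, W1, W2)   # residual update = Euler step
    return x

# Toy example: 20 residual blocks with small random weights
rng = np.random.default_rng(0)
d = 4
weights = [(0.1 * rng.normal(size=(d, d)), 0.1 * rng.normal(size=(d, d)))
           for _ in range(20)]
x0 = rng.normal(size=d)
print(resnet_forward(x0, weights, h=0.1))

Shrinking the step size h while adding more blocks makes the forward pass approximate the continuous-time flow of the ODE, which is the viewpoint the talk builds on.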


Posted by Gianluca Sottile

(on behalf of the local organizing committee)