# CANCELLED - Static and dynamic problems on neural networks

Neural networks are being applied to solve many real-world problems, but the theoretical reasons behind their practical success are not well understood. Key open problems can often be divided into "static" and "dynamic" types: static problems involve the classification of the critical points of the loss function, while dynamic problems depend on the basins of attraction of the optimization algorithm used to train the network (usually gradient descent). In this talk, I will argue that for both types of questions it can be useful to consider the functional space separately from the parameterization provided by the network's weights. For example, in the case of linear neural networks, the functional space can be identified with the algebraic variety of matrices of bounded rank. Using this perspective, many geometric aspects of the optimization can be investigated concretely. On the other hand, even in this simple setting, a complete understanding of the gradient dynamics remains an interesting (and open) mathematical problem.
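The identification mentioned above can be illustrated numerically (a minimal sketch, not taken from the talk; all variable names and layer widths here are illustrative assumptions): a linear network computes the map `x ↦ W3 @ W2 @ W1 @ x`, which is a single matrix whose rank is bounded by the narrowest layer width, so the set of functions the architecture can express is a variety of bounded-rank matrices.

```python
import numpy as np

# Hypothetical three-layer linear network with a narrow middle layer
# ("bottleneck"); widths are chosen only for illustration.
rng = np.random.default_rng(0)
d_in, hidden, bottleneck, d_out = 10, 8, 3, 10
W1 = rng.standard_normal((hidden, d_in))
W2 = rng.standard_normal((bottleneck, hidden))
W3 = rng.standard_normal((d_out, bottleneck))

# The network's end-to-end function is a single 10 x 10 matrix...
end_to_end = W3 @ W2 @ W1

# ...whose rank cannot exceed the bottleneck width (here 3), no matter
# how the weights are chosen. Generic random weights attain that bound.
print(np.linalg.matrix_rank(end_to_end))  # at most 3
```

Varying the weights sweeps out exactly the matrices of rank at most 3, which is the algebraic variety the abstract refers to; the weights are a (redundant) parameterization of that functional space.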

*Matthew Trager is a Post-Doctoral Associate at the Center for Data Science at New York University, part of the Courant Institute of Mathematical Sciences. He has a Master's degree in mathematics from the University of Pisa and the Scuola Normale Superiore, and a "Master 2" degree in Mathematics, Machine Learning and Computer Vision from the École Normale Supérieure de Cachan. He completed his PhD in computer science at the École Normale Supérieure in Paris, under the supervision of Jean Ponce and Martial Hebert. During his PhD, he worked on theoretical aspects of 3D vision. He is now interested in mathematical aspects of deep learning.*
