The Calculus of Deep Learning

This survey investigates how differential equations can provide a foundational understanding of deep neural networks (DNNs). It explores how these equations both illuminate DNN architectures and enhance their performance, through analysis at the scale of the whole network (the model level) and of individual layers (the layer level), with a focus on identifying practical applications that benefit from this grounding in mathematical principles.

A new perspective is emerging that frames neural networks not as discrete computational graphs, but as continuous dynamical systems described by differential equations.
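The canonical instance of this framing is the residual network: a residual block h ← h + f(h) is exactly one forward-Euler step of the ODE dh/dt = f(h) with unit step size. The sketch below (an illustrative toy, assuming NumPy; the dynamics function `f` and its weight matrix are made up for demonstration) shows that stacking more, smaller residual steps approximates the same continuous trajectory:

```python
import numpy as np

def f(h):
    # Toy "layer" dynamics: a fixed rotation-like linear map with a
    # tanh nonlinearity (hypothetical weights, for illustration only).
    W = np.array([[0.0, -1.0],
                  [1.0,  0.0]])
    return np.tanh(W @ h)

def euler_trajectory(h0, depth, step):
    # Integrate dh/dt = f(h) with forward Euler: `depth` residual-style
    # blocks of size `step` cover total integration time depth * step.
    h = h0.copy()
    for _ in range(depth):
        h = h + step * f(h)  # reduces to a plain residual update when step == 1
    return h

h0 = np.array([1.0, 0.0])
coarse = euler_trajectory(h0, depth=10, step=0.1)    # a "10-layer" network
fine = euler_trajectory(h0, depth=100, step=0.01)    # a "100-layer" network
# Both cover total time 1.0, so they approximate the same continuous flow,
# and the gap between them shrinks as the step size shrinks.
```

Under this view, network depth becomes a discretization choice rather than an architectural constant, which is the observation that motivates continuous-depth models such as neural ODEs.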