David Duvenaud - Neural Ordinary Differential Equations

Date: Week 2, Tuesday 7 May, TT 2019
Time: 8:00pm


We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. We demonstrate these properties in continuous-depth residual networks and continuous-time latent variable models. We also construct continuous normalizing flows, a generative model that can be trained by maximum likelihood without partitioning or ordering the data dimensions.
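As a minimal sketch of the core idea (not the authors' implementation, which also uses the adjoint method for memory-efficient gradients), the forward pass of a neural ODE can be written as: a small neural network defines the derivative of the hidden state, and a black-box adaptive solver integrates it. Here SciPy's RK45 solver stands in for the black box, and the weights are random placeholders; the solver tolerances `rtol`/`atol` are where precision is traded for speed.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

# Hypothetical tiny one-hidden-layer "network" defining dz/dt = f(t, z; theta).
D, H = 4, 16  # state dimension, hidden width (illustrative choices)
W1 = rng.normal(scale=0.1, size=(H, D))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(D, H))
b2 = np.zeros(D)

def f(t, z):
    """Neural-network-parameterized derivative of the hidden state."""
    return W2 @ np.tanh(W1 @ z + b1) + b2

# "Forward pass": integrate the hidden state from t=0 to t=1 with an
# adaptive black-box solver. Loosening rtol/atol trades precision for speed.
z0 = rng.normal(size=D)  # input, treated as the initial hidden state
sol = solve_ivp(f, t_span=(0.0, 1.0), y0=z0, method="RK45",
                rtol=1e-3, atol=1e-6)
z1 = sol.y[:, -1]  # output hidden state at t=1
```

Training would then backpropagate a loss on `z1` into the weights, either through the solver's operations or via the adjoint ODE described in the paper.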

This talk will also discuss in-progress follow-up work on invertible density models, stochastic differential equations, and regularizing the dynamics to be cheap to evaluate.

Snacks and drinks provided as usual.

This event is free for members and £3 for non-members. A life membership can be purchased on the door for £15.