Abstract
A recent line of work regularises the dynamics of neural ordinary differential equations (neural ODEs) in order to reduce the number of function evaluations a numerical ODE solver needs during training. For instance, in the context of continuous normalising flows, the Frobenius norm of the Jacobian matrix is regularised, under the hypothesis that complex dynamics correspond to an ill-conditioned ODE and therefore demand more function evaluations from the solver. Regularising the Jacobian norm also relates to sensitivity analysis in the broader neural network literature, where regularised models are believed to be more robust to random and adversarial perturbations of their input. We investigate the conditioning of neural ODEs under different Jacobian regularisation strategies in a binary classification setting. Regularising the Jacobian norm does reduce the number of function evaluations required, but at a cost to generalisation. Moreover, naively regularising the Jacobian norm can make the ODE system more ill-conditioned, contrary to what is believed in the literature. As an alternative, we regularise the condition number of the Jacobian and observe a lower number of function evaluations without a significant decrease in generalisation performance. We also find that Jacobian regularisation does not guarantee adversarial robustness, though it can lead to larger-margin classifiers.
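To make the two regularisation strategies concrete, the sketch below is our own PyTorch illustration, not the authors' implementation: the names `ODEFunc`, `per_sample_jacobians`, `frobenius_penalty` and `condition_penalty` are assumed for exposition. It computes exact per-sample Jacobians of a toy vector field and forms both the Frobenius-norm penalty and a condition-number penalty, either of which could be added to the classification loss with a weighting coefficient.

```python
# Minimal sketch (assumed names, not the paper's code) of Jacobian
# Frobenius-norm and condition-number penalties for a neural ODE
# vector field dx/dt = f(t, x).

import torch
import torch.nn as nn
from torch.autograd.functional import jacobian


class ODEFunc(nn.Module):
    """Toy vector field for a low-dimensional binary classifier."""

    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, t, x):
        return self.net(x)


def per_sample_jacobians(f, t, x):
    # Exact (dim x dim) Jacobians df/dx, one per sample. Looping over
    # the batch is affordable because the state dimension is small;
    # create_graph=True keeps the penalties differentiable.
    return torch.stack([
        jacobian(lambda xi: f(t, xi), x_i, create_graph=True) for x_i in x
    ])


def frobenius_penalty(jacs):
    # Mean squared Frobenius norm of the Jacobian over the batch.
    return (jacs ** 2).sum(dim=(-2, -1)).mean()


def condition_penalty(jacs, eps=1e-8):
    # 2-norm condition number: ratio of the largest to the smallest
    # singular value (svdvals returns them in descending order).
    s = torch.linalg.svdvals(jacs)
    return (s[..., 0] / (s[..., -1] + eps)).mean()


f = ODEFunc()
x = torch.randn(64, 2)                       # a batch of 2-D inputs
jacs = per_sample_jacobians(f, torch.tensor(0.0), x)
penalty = frobenius_penalty(jacs)            # or condition_penalty(jacs)
```

In higher dimensions the exact Jacobian becomes expensive, and the Frobenius term is typically estimated with stochastic trace estimators instead; the exact construction above is only practical in the low-dimensional binary classification setting the paper studies.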
Keywords
- Neural ODEs
- Regularisation
- Sensitivity
This work is based on research supported by the National Research Foundation of South Africa (grant number 138341).
Cite this paper
Josias, S., Brink, W. (2022). Jacobian Norm Regularisation and Conditioning in Neural ODEs. In: Pillay, A., Jembere, E., Gerber, A. (eds) Artificial Intelligence Research. SACAIR 2022. Communications in Computer and Information Science, vol 1734. Springer, Cham. https://doi.org/10.1007/978-3-031-22321-1_3