Gradients should stay on path: better estimators of the reverse- and forward KL divergence for normalizing flows

Vaitl, Lorenz and Nicoli, Kim A and Nakajima, Shinichi and Kessel, Pan (2022) Gradients should stay on path: better estimators of the reverse- and forward KL divergence for normalizing flows. Machine Learning: Science and Technology, 3 (4). 045006. ISSN 2632-2153

Vaitl_2022_Mach._Learn.__Sci._Technol._3_045006.pdf - Published Version

Download (3MB)

Abstract

We show how to use the path-wise derivative estimator for both the forward and the reverse Kullback–Leibler divergence for any practically invertible normalizing flow. The resulting path-gradient estimators are straightforward to implement, have lower variance, and lead not only to faster convergence of training but also to better overall approximation results compared to standard total gradient estimators. We also demonstrate that path-gradient training is less susceptible to mode collapse. In light of our results, we expect that path-gradient estimators will become the new standard method to train normalizing flows for variational inference.
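To illustrate the idea behind the abstract, the following is a minimal sketch (not the authors' code) of a path-gradient estimator for the reverse KL divergence of a normalizing flow, written in PyTorch. It assumes a toy invertible affine flow and a hypothetical target `target_log_p`; the key step is that the sample keeps its differentiable dependence on the flow parameters (the "path"), while the flow density is evaluated with a detached parameter copy, so the zero-mean score-function term drops out of the gradient.

```python
# Hedged sketch of a path-gradient reverse-KL surrogate for a normalizing flow.
# The flow, target, and hyperparameters here are illustrative assumptions.
import copy
import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Toy flow x = mu + exp(log_scale) * z with an explicit inverse."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_scale = nn.Parameter(torch.zeros(dim))

    def forward(self, z):
        x = self.mu + torch.exp(self.log_scale) * z
        return x, self.log_scale.sum(-1)            # log|det df/dz|

    def inverse(self, x):
        z = (x - self.mu) * torch.exp(-self.log_scale)
        return z, -self.log_scale.sum(-1)            # log|det dz/dx|

def log_q(flow, x):
    """log q_theta(x) via the inverse pass, with a standard normal base."""
    z, log_det = flow.inverse(x)
    base = -0.5 * (z ** 2 + math.log(2 * math.pi)).sum(-1)
    return base + log_det

def path_gradient_reverse_kl(flow, log_p, n_samples=256):
    """Surrogate loss whose gradient is the path gradient of KL(q_theta || p).

    Sampling x = f_theta(z) stays differentiable in theta, but q is scored
    with a detached copy of the flow, removing the score-function term.
    """
    z = torch.randn(n_samples, flow.mu.shape[0])
    x, _ = flow(z)                                   # differentiable sampling path
    frozen = copy.deepcopy(flow)                     # theta' = stop_grad(theta)
    for p in frozen.parameters():
        p.requires_grad_(False)
    return (log_q(frozen, x) - log_p(x)).mean()

# Usage: fit the toy flow to a shifted Gaussian target (illustrative only).
target_log_p = lambda x: -0.5 * ((x - 2.0) ** 2).sum(-1)
flow = AffineFlow(dim=2)
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    path_gradient_reverse_kl(flow, target_log_p).backward()
    opt.step()
```

The explicit `inverse` pass in the scoring step is why the construction is stated for practically invertible flows; the forward-KL case follows the same pattern with samples drawn from the target instead of the flow.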

Item Type: Article
Subjects: STM Library > Multidisciplinary
Depositing User: Managing Editor
Date Deposited: 12 Jul 2023 03:40
Last Modified: 11 Oct 2023 05:01
URI: http://open.journal4submit.com/id/eprint/2459
