Memory-efficient invertible layers, networks and activation functions for machine learning.

InvertibleNetworks.jl documentation

This documentation is work in progress and is being actively populated.

About

InvertibleNetworks.jl is a package of invertible layers and networks for machine learning. Invertibility allows backpropagating through the layers and networks without storing the forward state: the state is recomputed on the fly by propagating through the inverse. This package is the first of its kind in Julia.
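As a minimal sketch of this memory-saving idea (illustrative only, not the package's API), consider an invertible affine map: because its inverse is exact, the backward pass can recompute the input from the output instead of storing it:

```julia
# Illustrative sketch: an invertible affine map y = s*x + b.
# Since the inverse is exact, x need not be stored for the backward
# pass; it can be recomputed from y on the fly.
forward(x; s=2.0, b=1.0) = s .* x .+ b
inverse(y; s=2.0, b=1.0) = (y .- b) ./ s

x = randn(4)
y = forward(x)
x_recomputed = inverse(y)   # matches x up to floating-point error
```

The layers in this package follow the same principle with learnable, more expressive invertible transforms.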

This package is developed and maintained by Felix J. Herrmann's SlimGroup at the Georgia Institute of Technology. The main contributors to this package are:

  • Philipp Witte, Microsoft Corporation (pwitte@microsoft.com)
  • Gabrio Rizzuti, Utrecht University (g.rizzuti@umcutrecht.nl)
  • Mathias Louboutin, Georgia Institute of Technology (mlouboutin3@gatech.edu)
  • Ali Siahkoohi, Georgia Institute of Technology (alisk@gatech.edu)

Installation

This package is registered in the Julia General registry and can be installed directly from the Julia REPL package manager (]):

] add InvertibleNetworks

or, to develop the package locally:

] dev InvertibleNetworks
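Equivalently, the package can be installed programmatically with Julia's standard Pkg API:

```julia
# Install InvertibleNetworks via the standard Pkg API
# (equivalent to `] add InvertibleNetworks` in the REPL).
using Pkg
Pkg.add("InvertibleNetworks")
```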

References

  • Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. ArXiv

  • Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017, ArXiv

  • Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. ArXiv

  • Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. ArXiv

  • Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. ArXiv

  • Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. ArXiv

The following publications use InvertibleNetworks.jl:

Acknowledgments

This package uses functions from NNlib.jl, Flux.jl and Wavelets.jl.