Sum-Product Networks and Normalizing Flows for Tractable Density Estimation


Keywords
deep-learning, density-estimation, probabilistic-models, normalizing-flows, sum-product-networks
License
MIT
Install
pip install spnflow==0.5.2

DeeProb-kit

Abstract

DeeProb-kit is a Python library implementing deep probabilistic models such as various kinds of Sum-Product Networks (SPNs), Normalizing Flows, and combinations of the two for tractable probabilistic inference. Some models are implemented in PyTorch for fast training and inference on GPUs.
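For readers unfamiliar with SPNs, the following self-contained sketch (plain NumPy, not the library's API; all parameters are illustrative) evaluates the density of a toy SPN, a sum node mixing two product nodes over independent Gaussian leaves, and shows why marginal inference is tractable: integrated-out leaves are simply replaced by 1.

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Univariate Gaussian density (an SPN leaf distribution)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def spn_density(x0, x1):
    """Toy SPN: a sum node (weights 0.3, 0.7) over two product nodes,
    each the product of independent Gaussian leaves on x0 and x1."""
    comp_a = gauss_pdf(x0, -1.0, 0.5) * gauss_pdf(x1, -1.0, 0.5)
    comp_b = gauss_pdf(x0, 1.0, 0.5) * gauss_pdf(x1, 1.0, 0.5)
    return 0.3 * comp_a + 0.7 * comp_b  # normalized weights -> valid density

def spn_marginal_x0(x0):
    """Tractable marginalization of x1: each integrated-out leaf
    contributes 1, so the network structure is evaluated unchanged."""
    return 0.3 * gauss_pdf(x0, -1.0, 0.5) + 0.7 * gauss_pdf(x0, 1.0, 0.5)
```

Both queries cost a single bottom-up pass over the network, which is what "tractable" means here: exact evaluation in time linear in the network size.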

Features

  • Inference algorithms for SPNs. [1] [2] [4]
  • Learning algorithms for SPNs structure. [1] [2] [3] [4]
  • Chow-Liu Trees (CLT) as SPN leaves. [10] [11]
  • Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves. [12] [13]
  • Optimization of the structure of SPNs. [4]
  • JSON I/O operations for SPNs. [4]
  • Implementation of RAT-SPN using PyTorch. [5]
  • Implementation of MAFs and Real-NVPs using PyTorch. [6] [7] [8]
  • Implementation of Deep Generalized Convolutional SPNs (DGC-SPNs). [9]
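As a conceptual illustration of the normalizing-flow side (again plain NumPy, not the library's MAF/RealNVP implementations; the linear conditioner and its parameters are made up for the example), here is a minimal RealNVP-style affine coupling step: one coordinate passes through unchanged, the other is scaled and shifted by functions of the first, and the exact log-density follows from the change-of-variables formula.

```python
import numpy as np

def gauss_logpdf(z):
    """Log-density of a standard Gaussian base distribution."""
    return (-0.5 * (z ** 2 + np.log(2.0 * np.pi))).sum(axis=-1)

def coupling_forward(x, w, b):
    """RealNVP-style affine coupling on a 2D input (x -> z direction):
    x1 passes through; x2 is scaled/shifted by functions of x1.
    The conditioner is a toy linear map with parameters w, b."""
    x1, x2 = x[..., 0], x[..., 1]
    log_s = np.tanh(w[0] * x1 + b[0])   # bounded log-scale for stability
    t = w[1] * x1 + b[1]                # shift
    z2 = x2 * np.exp(log_s) + t
    z = np.stack([x1, z2], axis=-1)
    return z, log_s                     # log_s is log|det Jacobian| (triangular)

def flow_logpdf(x, w, b):
    """Change of variables: log p(x) = log p_base(f(x)) + log|det Jf(x)|."""
    z, log_det = coupling_forward(x, w, b)
    return gauss_logpdf(z) + log_det
```

The Jacobian of a coupling layer is triangular, so its log-determinant is just the sum of the log-scales, which is what makes stacked coupling layers cheap to train by exact maximum likelihood.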

Documentation

The library documentation is hosted on GitHub Pages at deeprob-kit.

Experiments

The datasets required to run the experiments can be found on Google Drive. After downloading them, unzip the archive into experiments/datasets to be able to run the experiments.

Examples

Various code examples can be found in the examples directory.

Related Repositories

References

  1. On Theoretical Properties of Sum-Product Networks (Peharz et al.).
  2. Sum-Product Networks: A New Deep Architecture (Poon and Domingos).
  3. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains (Molina, Vergari et al.).
  4. SPFlow: An Easy and Extensible Library for Deep Probabilistic Learning using Sum-Product Networks (Molina, Vergari et al.).
  5. Probabilistic Deep Learning using Random Sum-Product Networks (Peharz et al.).
  6. Masked Autoregressive Flow for Density Estimation (Papamakarios et al.).
  7. Density Estimation using RealNVP (Dinh et al.).
  8. Normalizing Flows for Probabilistic Modeling and Inference (Papamakarios, Nalisnick et al.).
  9. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations (Van de Wolfshaar and Pronobis).
  10. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees (Rahman et al.).
  11. Random Probabilistic Circuits (Di Mauro, Gala et al.).
  12. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization (Desana and Schnörr).
  13. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits (Peharz et al.).