DeeProb-kit
Abstract
DeeProb-kit is a Python library implementing deep probabilistic models, such as several variants of Sum-Product Networks (SPNs), Normalizing Flows, and combinations thereof, for tractable probabilistic inference. Some models are implemented in PyTorch for fast training and inference on GPUs.
Features
- Inference algorithms for SPNs. [1] [2] [4]
- Learning algorithms for SPN structure. [1] [2] [3] [4]
- Chow-Liu Trees (CLT) as SPN leaves. [10] [11]
- Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves. [12] [13]
- Optimization of the structure of SPNs. [4]
- JSON I/O operations for SPNs. [4]
- Implementation of RAT-SPN using PyTorch. [5]
- Implementation of MAFs and Real-NVPs using PyTorch. [6] [7] [8]
- Implementation of Deep Generalized Convolutional SPNs (DGC-SPNs). [9]
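To give an idea of how the inference algorithms listed above operate, here is a minimal, library-independent sketch of bottom-up log-likelihood evaluation in an SPN. This is illustrative only and does not use the DeeProb-kit API: the tiny network, its weights, and its Gaussian leaves are made up for the example.

```python
import numpy as np

def gaussian_logpdf(x, mean, std):
    """Log-density of a univariate Gaussian leaf."""
    return -0.5 * (((x - mean) / std) ** 2 + np.log(2 * np.pi)) - np.log(std)

def spn_log_likelihood(x1, x2):
    """Evaluate a toy SPN bottom-up:
    p(x1, x2) = 0.6 * [p1(x1) * p2(x2)] + 0.4 * [p3(x1) * p4(x2)]."""
    # Leaf layer: univariate log-densities.
    l1 = gaussian_logpdf(x1, 0.0, 1.0)
    l2 = gaussian_logpdf(x2, 0.0, 1.0)
    l3 = gaussian_logpdf(x1, 3.0, 1.0)
    l4 = gaussian_logpdf(x2, 3.0, 1.0)
    # Product nodes: sum the children's log-likelihoods.
    prod1 = l1 + l2
    prod2 = l3 + l4
    # Sum node: log-sum-exp of weighted children (weights sum to 1).
    return np.logaddexp(np.log(0.6) + prod1, np.log(0.4) + prod2)

print(spn_log_likelihood(0.0, 0.0))
```

Because every node is evaluated exactly once in a single bottom-up pass, the cost of computing an exact likelihood is linear in the size of the network, which is what makes SPNs tractable.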
Documentation
The library documentation is hosted using GitHub Pages at deeprob-kit.
Experiments
The datasets required to run the experiments can be found on Google Drive. After downloading the archive, unzip it into experiments/datasets to be able to run the experiments.
Examples
Various code examples can be found in the examples directory.
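As a flavor of the flow models listed in the features, here is a minimal, library-independent sketch of a Real-NVP affine coupling layer [7]. This is not the DeeProb-kit implementation; the toy "networks" below are fixed functions chosen only so the snippet is self-contained.

```python
import numpy as np

def coupling_forward(x, scale_net, shift_net):
    """Real-NVP affine coupling: the first half of x passes through
    unchanged and parameterizes an affine transform of the second half."""
    d = x.shape[-1] // 2
    x_a, x_b = x[..., :d], x[..., d:]
    s = scale_net(x_a)          # log-scales
    t = shift_net(x_a)          # shifts
    y_b = x_b * np.exp(s) + t
    log_det = s.sum(axis=-1)    # log |det Jacobian| is just the sum of log-scales
    return np.concatenate([x_a, y_b], axis=-1), log_det

def coupling_inverse(y, scale_net, shift_net):
    """Exact inverse: recompute s, t from the unchanged half and undo the affine map."""
    d = y.shape[-1] // 2
    y_a, y_b = y[..., :d], y[..., d:]
    s = scale_net(y_a)
    t = shift_net(y_a)
    x_b = (y_b - t) * np.exp(-s)
    return np.concatenate([y_a, x_b], axis=-1)

# Toy conditioners standing in for the neural networks of a real flow.
scale_net = lambda h: 0.1 * h
shift_net = lambda h: h + 1.0

x = np.array([[0.5, -0.5, 1.0, 2.0]])
y, log_det = coupling_forward(x, scale_net, shift_net)
x_rec = coupling_inverse(y, scale_net, shift_net)
assert np.allclose(x, x_rec)  # the layer is exactly invertible
```

The triangular structure of the coupling transform is what makes both the inverse and the Jacobian log-determinant cheap, which in turn makes exact density evaluation and sampling tractable.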
Related Repositories
References
1. On Theoretical Properties of Sum-Product Networks (Peharz et al.).
2. Sum-Product Networks: A New Deep Architecture (Poon and Domingos).
3. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains (Molina, Vergari et al.).
4. SPFlow: An Easy and Extensible Library for Deep Probabilistic Learning using Sum-Product Networks (Molina, Vergari et al.).
5. Probabilistic Deep Learning using Random Sum-Product Networks (Peharz et al.).
6. Masked Autoregressive Flow for Density Estimation (Papamakarios et al.).
7. Density Estimation using RealNVP (Dinh et al.).
8. Normalizing Flows for Probabilistic Modeling and Inference (Papamakarios, Nalisnick et al.).
9. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations (Van de Wolfshaar and Pronobis).
10. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees (Rahman et al.).
11. Random Probabilistic Circuits (Di Mauro, Gala et al.).
12. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization (Desana and Schnörr).
13. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits (Peharz et al.).