diffdist

Provides differentiable communication primitives such as Send/Recv/Gather for PyTorch. Autograd can backpropagate through them, making distributed model-parallel models easy to implement.

Keywords
torch, distributed, autograd, model, parallel, differentiable, backpropagate
License
GPL-3.0
Install
pip install diffdist==0.1

Documentation

diffdist

diffdist is a Python library for PyTorch. It extends the default functionality of torch.autograd and adds support for differentiable communication between processes. This enables backpropagation to work in distributed settings and makes distributed model parallelism super easy to use!
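As a rough sketch of what this enables, the snippet below all-gathers activations from every rank in a way autograd can differentiate, so a loss computed on the gathered tensor backpropagates to the rank that produced each shard. It assumes a torch.distributed process group has already been initialised, and the diffdist.functional.all_gather module path and call signature follow common usage of the library; treat them as an assumption to check against the installed package.

import torch
import torch.distributed as dist
import diffdist.functional as distops  # assumed module path; verify against the installed package

def gather_with_grad(z):
    # Mirrors torch.distributed.all_gather, but routed through diffdist
    # so the returned tensors participate in autograd.
    gather_list = [torch.zeros_like(z) for _ in range(dist.get_world_size())]
    gather_list = distops.all_gather(gather_list, z)
    return torch.cat(gather_list, dim=0)

# On every rank (illustrative):
# z = model(x)                 # local activations, requires_grad=True
# z_all = gather_with_grad(z)  # differentiable all-gather across ranks
# loss = criterion(z_all)
# loss.backward()              # gradients flow back across process boundaries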

Installation

After installing PyTorch, install diffdist with:

$ pip install diffdist

License

GNU GPLv3