@epfml/disco-server

This project contains the helper server providing the APIs used by the decentralized and federated learning schemes available in `@epfml/discojs` and `@epfml/discojs-node`.


Keywords
browser, collaborative-learning, decentralized-learning, federated-learning, machine-learning, mobile, privacy-preserving
License
ISC
Install
npm install @epfml/disco-server@2.1.1

Documentation

DISCO - DIStributed COllaborative Machine Learning

DISCO leverages federated 🌟 and decentralized ✨ learning to allow several data owners to collaboratively build machine learning models without sharing any original data.

The latest version is always running at the following link, directly in your browser, for web and mobile:

🕺 https://discolab.ai/ 🕺


🪄 DEVELOPERS: DISCO is written fully in JavaScript/TypeScript. Have a look at our developer guide.


โ“ WHY DISCO?

  • To build deep learning models across private datasets without compromising data privacy, ownership, sovereignty, or model performance
  • To create an easy-to-use platform that allows non-specialists to participate in collaborative learning

โš™๏ธ HOW DISCO WORKS

  • DISCO has a public model, private data approach
  • Private and secure model updates, not data, are communicated to either:
    • a central server: federated learning (🌟)
    • directly between users: decentralized learning (✨), i.e. no central coordination
  • Model updates are then securely aggregated into a trained model
  • See more HERE
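The aggregation step above can be sketched as weighted averaging of client model updates, in the spirit of the classic FedAvg algorithm. The snippet below is an illustrative stand-alone sketch, not DISCO's actual aggregation code: the flat-array weight representation and the names `ClientUpdate` and `aggregate` are assumptions made for clarity.

```typescript
// Illustrative FedAvg-style aggregation sketch (not DISCO's actual code).
// Model weights are simplified to flat number arrays.

interface ClientUpdate {
  weights: number[];   // the client's locally trained model weights
  numSamples: number;  // size of the client's local dataset
}

// The aggregator averages client weights, weighting each client by its
// number of local samples. It only ever sees updates, never raw data.
function aggregate(updates: ClientUpdate[]): number[] {
  const total = updates.reduce((sum, u) => sum + u.numSamples, 0);
  const dim = updates[0].weights.length;
  const averaged = new Array<number>(dim).fill(0);
  for (const { weights, numSamples } of updates) {
    const coeff = numSamples / total;
    for (let i = 0; i < dim; i++) averaged[i] += coeff * weights[i];
  }
  return averaged;
}

// Two clients: one with 100 local samples, one with 300.
const round = aggregate([
  { weights: [0.0, 2.0], numSamples: 100 },
  { weights: [4.0, 2.0], numSamples: 300 },
]);
// round[0] = 0.25 * 0 + 0.75 * 4 = 3, round[1] = 2
```

In the federated scheme the central server plays the aggregator; in the decentralized scheme peers perform equivalent averaging among themselves without central coordination.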

โ“ DISCO TECHNOLOGY

  • DISCO runs arbitrary deep learning tasks and model architectures in your browser, via TF.js
  • Decentralized learning ✨ relies on peer-to-peer communication
  • Have a look at how DISCO ensures privacy and confidentiality HERE
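One standard building block behind such privacy guarantees is secure aggregation with additive masks: peers blind their updates with shared random masks that cancel exactly when contributions are summed, so the aggregator learns only the sum. The two-party sketch below illustrates the generic technique under simplifying assumptions; it is not DISCO's exact protocol, and all names in it are made up.

```typescript
// Generic sketch of additive masking for secure aggregation.
// Not DISCO's exact protocol; in practice the shared mask would be
// derived from a key exchange, not a hard-coded array.

// Peer A adds the shared mask, peer B subtracts it: each masked update
// alone looks random, but the masks cancel in the sum.
function mask(update: number[], m: number[], sign: 1 | -1): number[] {
  return update.map((v, i) => v + sign * m[i]);
}

const sharedMask = [0.7, -1.3];      // stand-in for shared randomness
const updateA = [1.0, 2.0];
const updateB = [3.0, 4.0];

const maskedA = mask(updateA, sharedMask, 1);
const maskedB = mask(updateB, sharedMask, -1);

// The aggregator sums the masked updates; the masks cancel exactly,
// leaving updateA + updateB = [4, 6] without revealing either update.
const sum = maskedA.map((v, i) => v + maskedB[i]);
```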

🧪 RESEARCH-BASED DESIGN

DISCO leverages the latest research advances, enabling open-access and easy-to-use distributed training that is

  • 🔒 privacy-preserving (R1)
  • 🛠️ dynamic and asynchronous over time (R2, R7)
  • 🥷 robust to malicious actors (R3, partially)

And more on the roadmap:

  • ๐ŸŒช๏ธ efficient (R4, R5)
  • ๐Ÿ”’ privacy-preserving while Byzantine robust (R6)
  • ๐Ÿฅท resistant to data poisoning (R8)
  • ๐ŸŽ ๐ŸŒ interpretable in imperfectly interoperable data distributions (R9)
  • 🪞 personalizable (R10)
  • 🥕 fairly incentivizing participation

๐Ÿ HOW TO USE DISCO

  • Start by exploring our example tasks on the DISCOllaboratives page.
  • The example DISCOllaboratives are based on popular ML tasks such as GPT-2, Titanic, MNIST, or CIFAR-10
  • It is also possible to create your own DISCOllaboratives without coding, on the custom training page:
    • Upload the initial model
    • Choose between federated and decentralized for your DISCO training scheme, connect your data, and... done! 📊
    • For more details on ML tasks and custom training, have a look at this guide