@epfml/disco-server

This project contains the helper server providing the APIs used by the decentralized and federated learning schemes available in `@epfml/discojs` and `@epfml/discojs-node`.
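The server can also be embedded and started programmatically. The snippet below is only a minimal sketch of that idea, assuming the package exposes a `Server` class with a `serve(port)` method; these names are hypothetical, so check the package's actual exports and the developer guide before relying on them.

```ts
// Hypothetical usage sketch: start a DISCO helper server on port 8080.
// The class and method names below are assumptions, not the confirmed API.
import { Server } from "@epfml/disco-server";

const server = new Server();
await server.serve(8080); // expose the federated/decentralized training APIs
console.log("DISCO server listening on http://localhost:8080");
```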


Keywords: browser, collaborative-learning, decentralized-learning, federated-learning, machine-learning, mobile, privacy-preserving
License: ISC
Install: npm install @epfml/disco-server@2.1.1

Documentation

DISCO - DIStributed COllaborative Machine Learning

DISCO leverages federated 🌟 and decentralized ✨ learning to allow several data owners to collaboratively build machine learning models without sharing any original data.

The latest version is always available at the following link, running directly in your browser, on web and mobile:

🕺 https://discolab.ai/ 🕺


🪄 DEVELOPERS: DISCO is written fully in JavaScript/TypeScript. Have a look at our developer guide.


WHY DISCO?

  • To build deep learning models across private datasets without compromising data privacy, ownership, sovereignty, or model performance
  • To create an easy-to-use platform that allows non-specialists to participate in collaborative learning

⚙️ HOW DISCO WORKS

  • DISCO follows a public model – private data approach
  • Private and secure model updates – not data – are communicated to either:
    • a central server: federated learning ( 🌟 )
    • directly between users: decentralized learning ( ✨ ), i.e. no central coordination
  • Model updates are then securely aggregated into a trained model (see the averaging sketch after this list)
  • See more HERE
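To make the aggregation step concrete, here is a conceptual sketch of federated averaging in TypeScript. It is illustrative only and is not DISCO's actual aggregation code, which additionally applies the privacy and security mechanisms mentioned above.

```ts
// Conceptual sketch of federated averaging (not DISCO's implementation):
// each participant sends a model update (a weight vector); the coordinator
// averages them element-wise into a single global update.
type WeightUpdate = number[];

function aggregate(updates: WeightUpdate[]): WeightUpdate {
  const n = updates.length;
  const dim = updates[0].length;
  const averaged = new Array<number>(dim).fill(0);
  for (const update of updates) {
    for (let i = 0; i < dim; i++) {
      averaged[i] += update[i] / n;
    }
  }
  return averaged;
}

// Three participants contribute local updates; no raw data leaves their devices.
const globalUpdate = aggregate([
  [0.1, 0.2, 0.3],
  [0.0, 0.4, 0.2],
  [0.2, 0.3, 0.1],
]);
console.log(globalUpdate); // [0.1, 0.3, 0.2]
```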

DISCO TECHNOLOGY

  • DISCO runs arbitrary deep learning tasks and model architectures in your browser, via TF.js (see the example after this list)
  • Decentralized learning ✨ relies on peer-to-peer communication
  • Have a look at how DISCO ensures privacy and confidentiality HERE
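As an illustration of "model architectures in your browser", the following defines a small classifier with TF.js. It is a generic example of the kind of model that can be trained client-side, not a task definition taken from the DISCO codebase.

```ts
import * as tf from "@tensorflow/tfjs";

// A small dense classifier defined with TF.js; models like this can be
// trained directly in the browser, keeping the training data on the device.
const model = tf.sequential();
model.add(tf.layers.dense({ inputShape: [784], units: 64, activation: "relu" }));
model.add(tf.layers.dense({ units: 10, activation: "softmax" }));
model.compile({
  optimizer: "adam",
  loss: "categoricalCrossentropy",
  metrics: ["accuracy"],
});
```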

🧪 RESEARCH-BASED DESIGN

DISCO leverages the latest research advances, enabling open-access and easy-to-use distributed training which is

  • 🔒 privacy-preserving (R1)
  • 🛠️ dynamic and asynchronous over time (R2, R7)
  • 🥷 robust to malicious actors (R3, partially)

And more is on the roadmap:

  • 🌪️ efficient (R4, R5)
  • 🔒 privacy-preserving while Byzantine robust (R6)
  • 🥷 resistant to data poisoning (R8)
  • 🍎 🍌 interpretable in imperfectly interoperable data distributions (R9)
  • 🪞 personalizable (R10)
  • 🥕 fairly incentivizing participation

🏁 HOW TO USE DISCO

  • Start by exploring our example tasks on the DISCOllaboratives page.
  • The example DISCOllaboratives are based on popular ML tasks such as GPT-2, Titanic, MNIST or CIFAR-10
  • It is also possible to create your own DISCOllaboratives, without any coding, on the custom training page:
    • Upload the initial model (a sketch of preparing one is shown after this list)
    • Choose between federated and decentralized for your DISCO training scheme... connect your data and... done! 📊
    • For more details on ML tasks and custom training, have a look at this guide
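For reference, here is one way to create and export an initial model with TF.js in the browser. The exact model format expected by the custom training page may differ, so treat this as a hedged example and consult the guide above.

```ts
import * as tf from "@tensorflow/tfjs";

// Define a small starting model and export it; in the browser, saving to
// "downloads://" triggers a download of model.json plus a weights file,
// which can then be uploaded as the initial model.
const model = tf.sequential();
model.add(tf.layers.dense({ inputShape: [4], units: 8, activation: "relu" }));
model.add(tf.layers.dense({ units: 3, activation: "softmax" }));
model.compile({ optimizer: "sgd", loss: "categoricalCrossentropy" });

await model.save("downloads://my-initial-model");
```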