Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Solve a variety of tasks with pre-trained models, or finetune them in one line for your own use case.
Out of the box tasks you can solve with Backprop:
- Conversational question answering in English
- Text Classification in 100+ languages
- Image Classification
- Text Vectorisation in 50+ languages
- Image Vectorisation
- Summarisation in English
- Emotion detection in English
- Text Generation
For more specific use cases, you can adapt a task with little data and a single line of code via finetuning.
| ⚡ Getting started | Installation, few minute introduction |
|---|---|
| 💡 Examples | Finetuning and usage examples |
| 📙 Docs | In-depth documentation about task inference and finetuning |
| ⚙️ Models | Overview of available models |
Install Backprop via PyPI:

```bash
pip install backprop
```
Tasks act as interfaces that let you easily use a variety of supported models.
```python
import backprop

context = "Take a look at the examples folder to see use cases!"

qa = backprop.QA()

# Start building!
answer = qa("Where can I see what to build?", context)

print(answer)
# Prints: "the examples folder"
```
You can run all tasks and models on your own machine, or in production with our inference API, simply by specifying your `api_key`.
See how to use all available tasks.
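Conceptually, a task is just a callable that wraps a model behind a uniform interface. The sketch below illustrates that pattern in plain Python; it is not Backprop's actual implementation, and every name in it (`ToyQA`, the stand-in model) is illustrative only:

```python
# Conceptual sketch of the task-as-interface pattern. This is NOT
# Backprop's internals; all names here are illustrative.
class ToyQA:
    """A task wraps a model behind a simple callable interface."""

    def __init__(self, model=None, api_key=None):
        # With an api_key set, a real task would route calls to a hosted
        # inference API instead of running the model locally.
        self.api_key = api_key
        # Trivial stand-in "model": just returns the context unchanged.
        self.model = model or (lambda question, context: context)

    def __call__(self, question, context):
        return self.model(question, context)


qa = ToyQA()
print(qa("Where can I see what to build?", "the examples folder"))
# Prints: the examples folder
```

The point of this design is that swapping the underlying model (or moving from local to remote inference) never changes the calling code.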
Each task implements finetuning that lets you adapt a model for your specific use case in a single line of code.
A finetuned model is easy to upload to production, letting you focus on building great applications.
```python
import backprop

tg = backprop.TextGeneration("t5-small")

# Any text works as training data
inp = ["I really liked the service I received!", "Meh, it was not impressive."]
out = ["positive", "negative"]

# Finetune with a single line of code
tg.finetune({"input_text": inp, "output_text": out})

# Use your trained model
prediction = tg("I enjoyed it!")

print(prediction)
# Prints: "positive"

# Upload to Backprop for production ready inference
# Describe your model
name = "t5-sentiment"
description = "Predicts positive and negative sentiment"
tg.upload(name=name, description=description, api_key="abc")
```
See finetuning for other tasks.
- No experience needed
  - Entrance to practical AI should be simple
  - Get state-of-the-art performance in your task without being an expert
- Data is a bottleneck
  - Solve real-world tasks without any data
  - With transfer learning, even a small amount of data can adapt a task to your niche requirements
- There is an overwhelming number of models
  - We offer a curated selection of the best open-source models and make them simple to use
  - A few general models can accomplish more with less optimisation
- Deploying models cost-effectively is hard work
  - If our models suit your use case, no deployment is needed: just call our API
  - Adapt and deploy your own model with just a few lines of code
  - Our API scales, is always available, and you only pay for usage
- Solve any text based task with Finetuning (Github, Colab)
- Search for images using text (Github)
- Finding answers from text (Github)
- More finetuning and task examples
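Examples like text-based image search rest on vectorisation: texts and images are embedded as vectors, and the closest vector to the query wins. Below is a self-contained sketch of that idea using hand-made toy vectors in place of a real vectorisation model (the vectors and captions are made up for illustration):

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Pretend these embeddings came from a vectorisation model.
docs = {
    "a photo of a dog": [0.9, 0.1, 0.0],
    "a photo of a cat": [0.1, 0.9, 0.0],
}

# Toy embedding of the query "dog playing outside".
query = [0.8, 0.2, 0.0]

# Retrieve the document whose vector is most similar to the query.
best = max(docs, key=lambda d: cosine(query, docs[d]))
print(best)
# Prints: a photo of a dog
```

In practice the vectors come from a trained model (e.g. a multilingual text vectoriser or an image vectoriser), but the retrieval step is exactly this nearest-neighbour comparison.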
Check out our docs for in-depth task inference and finetuning.
Curated list of state-of-the-art models.
Zero-shot image classification with CLIP.
Backprop relies on many great libraries to work, most notably:
Found a bug or have ideas for new tasks and models? Open an issue.