Happy Transformer

Documentation and news: happytransformer.com

Join our Discord server: Support Server

Happy Transformer makes it easy to fine-tune NLP Transformer models and use them for inference.

3.0.0

  1. DeepSpeed for training
  2. Apple's MPS for training and inference
  3. Weights & Biases (WandB) to track training runs
  4. Automatic splitting of training data into training and evaluation portions
  5. Pushing models directly to Hugging Face's Model Hub

Read about the full 3.0.0 update, including breaking changes, at happytransformer.com.
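A rough sketch of how the training-related 3.0.0 features fit together is shown below. GENTrainArgs, num_train_epochs, and the train call are from the documented API; eval_ratio (the automatic train/eval split) and push (uploading to the Model Hub) follow the 3.0.0 release notes as I understand them, so treat those exact names as assumptions and check the documentation.

from happytransformer import HappyGeneration, GENTrainArgs
#--------------------------------------#
happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125m")

# eval_ratio: portion of the supplied data held out for evaluation
# (the automatic train/eval split added in 3.0.0) -- assumed parameter name
args = GENTrainArgs(num_train_epochs=1, eval_ratio=0.1)
happy_gen.train("train.txt", args=args)

# Push the fine-tuned model to Hugging Face's Model Hub (3.0.0 feature);
# the method name and repo id below are placeholders -- check the docs
happy_gen.push("your-username/gpt-neo-fine-tuned")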

Tasks

Task | Inference | Training
Text Generation | ✔ | ✔
Text Classification | ✔ | ✔
Word Prediction | ✔ | ✔
Question Answering | ✔ | ✔
Text-to-Text | ✔ | ✔
Next Sentence Prediction | ✔ |
Token Classification | ✔ |
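
Each task above is handled by its own class. Assuming the documented class names, the full set can be imported as follows:

from happytransformer import (
    HappyGeneration,           # Text Generation
    HappyTextClassification,   # Text Classification
    HappyWordPrediction,       # Word Prediction
    HappyQuestionAnswering,    # Question Answering
    HappyTextToText,           # Text-to-Text
    HappyNextSentence,         # Next Sentence Prediction
    HappyTokenClassification,  # Token Classification
)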

Quick Start

pip install happytransformer
from happytransformer import HappyWordPrediction
#--------------------------------------#
happy_wp = HappyWordPrediction()  # default uses distilbert-base-uncased
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result)  # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token)  # am
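
The same pattern applies to the other tasks. For example, a minimal question-answering sketch using the documented answer_question call; the default model noted in the comment and the sample output are illustrative:

from happytransformer import HappyQuestionAnswering
#--------------------------------------#
happy_qa = HappyQuestionAnswering()  # default uses distilbert-base-cased-distilled-squad
result = happy_qa.answer_question(
    "Happy Transformer is a Python package built on top of Hugging Face's transformers library.",
    "What is Happy Transformer built on?",
)
print(result[0].answer)  # e.g. Hugging Face's transformers library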

Maintainers

Eric Fillion (Lead Maintainer)
Ted Brownlow (Maintainer)

Tutorials

Text generation with training (GPT-Neo)

Text classification (training)

Text classification (hate speech detection)

Text classification (sentiment analysis)

Word prediction with training (DistilBERT, RoBERTa)

Top T5 Models

Grammar Correction

Fine-tune a Grammar Correction Model
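
For reference, a minimal grammar-correction sketch using the documented HappyTextToText API with the community model vennify/t5-base-grammar-correction; the sample output in the comment is illustrative:

from happytransformer import HappyTextToText, TTSettings
#--------------------------------------#
happy_tt = HappyTextToText("T5", "vennify/t5-base-grammar-correction")
args = TTSettings(num_beams=5, min_length=1)

# The model expects the "grammar: " prefix before the input text
result = happy_tt.generate_text("grammar: This sentences has has bads grammar.", args=args)
print(result.text)  # e.g. This sentence has bad grammar.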