# RATransformers 🐭

**RATransformers**, short for Relation-Aware Transformers, is a package built on top of 🤗 `transformers` that lets you model explicit relations between parts of the input.
## Example: encoding a table in TableQA (Question Answering on Tabular Data)

In this example we can see that passing the table to the model as plain text, with no additional information, is a poor representation: the model has no way of knowing which cells belong to the same row or column.

With RATransformers 🐭 you can instead encode the table's structure by defining relations between parts of the input, such as "this cell is a value of this column" or "these cells come from the same row".
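To make the idea concrete, here is a minimal, dependency-free sketch (the helper name and table contents are hypothetical, not part of the package's API): a table is flattened into a token sequence while a separate structure records pairwise relations between token positions.

```python
# Minimal sketch (plain Python, not ratransformers' actual API):
# flatten a tiny table into tokens and record pairwise relations
# analogous to 'is_value_of_column' / 'is_from_same_row'.

def table_relations(header, rows):
    """Return flat tokens plus (i, j, kind) relation triples."""
    tokens = list(header)
    relations = []
    for row in rows:
        start = len(tokens)
        tokens.extend(row)
        for offset in range(len(row)):
            cell_idx = start + offset
            # each cell value relates to its column header token
            relations.append((cell_idx, offset, "is_value_of_column"))
            # cells in the same row relate to each other
            for other in range(start, start + len(row)):
                if other != cell_idx:
                    relations.append((cell_idx, other, "is_from_same_row"))
    return tokens, relations

tokens, relations = table_relations(
    header=["City", "Country"],
    rows=[["Lisbon", "Portugal"], ["Paris", "France"]],
)
```

A relation-aware model can then consume these pairwise labels alongside the flat text, instead of discarding the structure.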
Check out more examples [here].
## Installation

Install directly from PyPI:

```shell
pip install ratransformers
```
## Usage

```python
from ratransformers import RATransformer
from transformers import AutoModelForSequenceClassification


ratransformer = RATransformer(
    "nielsr/tapex-large-finetuned-tabfact",  # define the 🤗 model you want to load
    relation_kinds=['is_value_of_column', 'is_from_same_row'],  # define the relations that you want to model in the input
    model_cls=AutoModelForSequenceClassification,  # define the model class
    pretrained_tokenizer_name_or_path='facebook/bart-large'  # define the tokenizer you want to load (in case it is not the same as the model)
)
model = ratransformer.model
tokenizer = ratransformer.tokenizer
```
With only these steps, your RATransformer 🐭 is initialized and ready to be trained or fine-tuned like any other 🤗 model.

More implementation details can be found in the examples here.
## How does it work?

We modify the self-attention layers of the transformer model as explained in Section 3 of the RAT-SQL paper.
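As a rough illustration of that idea (a NumPy sketch of single-head relation-aware attention in the spirit of RAT-SQL, not the package's actual implementation), each attention score and each weighted value gets an extra term from a learned embedding of the relation between the two positions:

```python
import numpy as np

def relation_aware_attention(x, rel_ids, Wq, Wk, Wv, rk_emb, rv_emb):
    """Single-head relation-aware self-attention (sketch).

    x:       (n, d) token representations
    rel_ids: (n, n) integer relation kind between every pair of positions
    rk_emb:  (num_kinds, d) relation embeddings added on the key side
    rv_emb:  (num_kinds, d) relation embeddings added on the value side
    """
    n, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    rk, rv = rk_emb[rel_ids], rv_emb[rel_ids]  # both (n, n, d)
    # attention scores get an extra query . relation term
    scores = (q @ k.T + np.einsum('id,ijd->ij', q, rk)) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each value is shifted by its relation embedding before averaging
    return np.einsum('ij,ijd->id', weights, v + rv)
```

When all relation embeddings are zero, this reduces to standard scaled dot-product self-attention, which is why the modification can be grafted onto pretrained attention layers.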
## Supported Models
Currently we support a limited number of transformer models:
Want another model? Feel free to open an Issue or create a Pull Request, and let's get started!