torch-multi-head-attention

Multi-head attention implemented in PyTorch


Keywords: attention, multi-head, pytorch
License: MIT
Install: pip install torch-multi-head-attention==0.15.1

Documentation

PyTorch Multi-Head Attention


Install

pip install torch-multi-head-attention

Usage

from torch_multi_head_attention import MultiHeadAttention

# Create an attention layer with 768 input features split across 12 heads.
attention = MultiHeadAttention(in_features=768, head_num=12)
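The snippet above only constructs the layer; it does not show a forward pass. Below is a minimal sketch of using it for self-attention, assuming the module is called with query, key, and value tensors of shape (batch, seq_len, in_features). The tensor sizes and the call signature shown are illustrative assumptions, not taken from the package documentation.

import torch
from torch_multi_head_attention import MultiHeadAttention

# Illustrative sizes: a batch of 2 sequences, 16 tokens each, 768 features per token.
batch_size, seq_len, in_features = 2, 16, 768

attention = MultiHeadAttention(in_features=in_features, head_num=12)

# Self-attention: the same tensor serves as query, key, and value.
x = torch.randn(batch_size, seq_len, in_features)
out = attention(x, x, x)  # assumed call signature: (query, key, value)

print(out.shape)  # expected to match the input shape: torch.Size([2, 16, 768])

With in_features=768 and head_num=12, each head attends over 768 / 12 = 64 features, and the per-head outputs are concatenated back to the original feature size.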