fast-sentence-tokenize

Fast and Efficient Sentence Tokenization


Keywords
nlp, nlu, text, classify, classification
License
Other
Install
pip install fast-sentence-tokenize==0.1.15
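Usage
A minimal usage sketch. The import path and function name below (tokenize_text) are assumptions based on the package name and are not confirmed against the project's documented API; consult the project documentation for the exact interface.

    # Assumed entry point; verify the real name in the project docs
    from fast_sentence_tokenize import tokenize_text

    text = "Dr. Smith arrived at 9 a.m. She began the lecture immediately."

    # Split the input text into a list of sentence strings
    sentences = tokenize_text(text)

    for sentence in sentences:
        print(sentence)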