zh-sentence
A lightweight sentence tokenizer for the Chinese language.
Installation
pip install zh-sentence
Sample
from zh_sentence.tokenizer import tokenize

# Split the paragraph into a list of sentences.
paragraph_str = "你好吗?你快乐吗?"
sentence_list = tokenize(paragraph_str)

# Print each sentence on its own line.
for sentence in sentence_list:
    print(sentence)
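Assuming the tokenizer splits on the sentence-final question marks and keeps the trailing punctuation, the loop above would print one sentence per line:

你好吗?
你快乐吗?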
Other languages
JavaScript -> https://github.com/Rairye/js-sentence-tokenizers