Spacy Syllables


A spacy 2+ pipeline component for adding multilingual syllable annotation to tokens.

  • Uses the well-established pyphen library for syllable splitting.
  • Supports a wide range of languages (everything pyphen ships a hyphenation dictionary for).
  • Easy to use thanks to spacy's pipeline framework.


$ pip install spacy_syllables

which also installs the following dependencies:

  • spacy = "^2.2.3"
  • pyphen = "^0.9.5"


The SpacySyllables class autodetects the language from the given spacy nlp instance, but you can override the detected language by passing the lang parameter during instantiation.

Normal use case

import spacy
from spacy_syllables import SpacySyllables

nlp = spacy.load("en_core_web_sm")

syllables = SpacySyllables(nlp)

nlp.add_pipe(syllables, after="tagger")

assert nlp.pipe_names == ["tagger", "syllables", "parser", "ner"]

doc = nlp("terribly long")

data = [(token.text, token._.syllables, token._.syllables_count) for token in doc]

assert data == [("terribly", ["ter", "ri", "bly"], 3), ("long", ["long"], 1)]

More examples can be found in the tests.

Dev setup / testing

We are using:

  • poetry for packaging and dependency management
  • nox for running the tests
  • pyenv for installing the python versions that the nox tests run against


Then install the dev dependencies and the pyenv python versions:

$ poetry install
$ poetry run nox --session install_pyenv_versions

Run the tests:

$ poetry run nox