allennlp-semparse

A framework for building semantic parsers (including neural module networks) with AllenNLP, built by the authors of AllenNLP


Keywords
allennlp, NLP, deep, learning, machine, reading, semantic, parsing, parsers, semantic-parsers
License
Apache-2.0
Install
pip install allennlp-semparse==0.0.4

Documentation


Installing

allennlp-semparse is available on PyPI. You can install it with pip:

pip install allennlp-semparse

Supported datasets

  • ATIS
  • Text2SQL
  • NLVR
  • WikiTableQuestions

Supported models

  • Grammar-based decoding models, following the parser originally introduced in Neural Semantic Parsing with Type Constraints for Semi-Structured Tables. The models that are currently checked in are all based on this parser, applied to various datasets.
  • Neural module networks. We don't have models checked in for this yet, but DomainLanguage supports defining them (a minimal sketch of a DomainLanguage follows this list), and we will add some models to the repo once papers go through peer review. The code is slow (batching is hard), but it works.
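
For a sense of what defining a language looks like, here is a minimal, hedged sketch loosely modeled on the Arithmetic language in the DomainLanguage tests (it is not the repo's test code). The import path, the constructor arguments (start_types, allowed_constants), and the method names are assumptions based on the DomainLanguage docstring and may differ slightly across versions.

from allennlp_semparse import DomainLanguage, predicate


class SimpleArithmetic(DomainLanguage):
    """A toy executor: the grammar is inferred from the type annotations on the predicates."""

    def __init__(self) -> None:
        # start_types: the types a complete logical form may evaluate to.
        # allowed_constants: terminal symbols the grammar can produce, mapped to their values.
        super().__init__(start_types={int},
                         allowed_constants={str(n): n for n in range(10)})

    @predicate
    def add(self, num1: int, num2: int) -> int:
        return num1 + num2

    @predicate
    def halve(self, num: int) -> int:
        return num // 2


language = SimpleArithmetic()
print(language.execute("(add 2 3)"))                          # -> 5
print(language.logical_form_to_action_sequence("(halve 8)"))  # production rules a parser would predict

Because add and halve are annotated with int argument and return types, DomainLanguage can infer the grammar productions a parser needs, which is what the grammar-based decoding models above consume.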

Tutorials

Coming sometime in the future... You can look at this old tutorial, but the part about using NLTK to define a grammar is outdated. Now you can use DomainLanguage to define a python executor, and we analyze the type annotations on the functions in that executor to automatically infer a grammar for you. It is much easier to use than it used to be.

Until we get around to writing a better tutorial, the best way to get started is to look at some examples. The simplest is the Arithmetic language in the DomainLanguage test (there is also a bit of description in the DomainLanguage docstring). After looking at those, you can look at more complex (real) examples in the domain_languages module.

Note that the executor you define can have learned parameters, making it a neural module network. The best place to see an example of that is currently this unfinished implementation of N2NMNs on the CLEVR dataset. We'll have more examples of doing this in the not-too-distant future.
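
To make the learned-parameters point concrete, below is a toy, hedged sketch (not the repo's N2NMN code) of an executor whose predicates call a torch module held on the language, so gradients flow through execute(). The class, the predicate names, and the trick of exposing the image features as a constant are illustrative assumptions, not an API the library prescribes.

import torch

from allennlp_semparse import DomainLanguage, predicate


class ToyVisualLanguage(DomainLanguage):
    """A toy 'neural module network': predicates call learned torch modules during execution."""

    def __init__(self, image_features: torch.Tensor, scorer: torch.nn.Linear) -> None:
        # Expose the image features as a constant named "image" so logical forms can refer to them.
        super().__init__(start_types={torch.Tensor},
                         allowed_constants={"image": image_features})
        self.scorer = scorer  # the learned parameters live here

    @predicate
    def attend(self, features: torch.Tensor) -> torch.Tensor:
        # Learned per-region scores in [0, 1], shape (num_regions,).
        return self.scorer(features).squeeze(-1).sigmoid()

    @predicate
    def count(self, attention: torch.Tensor) -> torch.Tensor:
        # A differentiable "count" of attended regions.
        return attention.sum()


scorer = torch.nn.Linear(16, 1)
language = ToyVisualLanguage(torch.randn(10, 16), scorer)
soft_count = language.execute("(count (attend image))")
soft_count.backward()  # gradients reach scorer's parameters through the executor

In a real model you would presumably construct a language like this inside the model's forward pass, with the torch modules registered on the model so they are trained end to end.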