self-attentive-parser

Aug. 22, 2022
A high-accuracy NLP parser with pre-trained models for 11 languages, implemented in Python. Based on Constituency Parsing with a Self-Attentive Encoder (ACL 2018), with additional changes described in Multilingual Constituency Parsing with Self-Attention and Pre-Training.

New (February 2021): Version 0.2.0 of the Berkeley Neural Parser is now out, with higher-quality pre-trained models for all languages. Inference now uses PyTorch instead of TensorFlow.