
Deep Learning for Natural Language Parsing


Abstract

Natural language processing problems (such as speech recognition, text-based data mining, and text or speech generation) are becoming increasingly important. Before many of these problems can be approached effectively, the syntactic structure of the sentences involved must be analysed.

Syntactic parsing is the task of constructing a parse tree over a sentence that describes the sentence's structure. Parse trees are used as a component of many language processing applications.
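As an illustration, a dependency parse is commonly stored as one head index and one relation label per token (as in the CoNLL-U format). The following minimal Python sketch uses a hand-made example sentence and analysis, not output of the parser described in this paper.

```python
# Illustrative only: a dependency parse as a head index per token
# (0 denotes the artificial root) plus a relation label.
sentence = ["She", "read", "the", "paper"]

# heads[i] is the 1-based index of the syntactic head of token i+1.
heads = [2, 0, 4, 2]                    # "read" is the root; "She" and "paper" attach to it
labels = ["nsubj", "root", "det", "obj"]

for idx, (word, head, label) in enumerate(zip(sentence, heads, labels), start=1):
    head_word = "ROOT" if head == 0 else sentence[head - 1]
    print(f"{idx}\t{word}\t-> {head_word}\t({label})")
```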

In this paper, we present a multilingual dependency parser. Built on advanced deep learning techniques, our parser architecture tackles common parsing issues such as long-distance head attachment, and uses "architecture engineering" to adapt to each target language, reducing the feature engineering often required for parsing tasks.
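The abstract does not specify the architecture in detail. For orientation, one common family of neural dependency parsers (graph-based parsers) scores every head–dependent pair with a recurrent encoder and a bilinear scorer; the sketch below is a generic illustration of that idea only, not the architecture proposed in this paper.

```python
import torch
import torch.nn as nn

class ToyGraphParser(nn.Module):
    """Generic graph-based parser sketch (not this paper's model):
    a BiLSTM encoder followed by a bilinear scorer over head-dependent pairs."""

    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Bilinear term scoring each (dependent, candidate head) pair.
        self.arc_scorer = nn.Bilinear(2 * hidden_dim, 2 * hidden_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word ids
        states, _ = self.encoder(self.embed(token_ids))   # (batch, seq, 2*hidden)
        batch, seq, dim = states.shape
        # Broadcast every dependent against every candidate head.
        dep = states.unsqueeze(2).expand(batch, seq, seq, dim)
        head = states.unsqueeze(1).expand(batch, seq, seq, dim)
        # scores[b, i, j] = score of token j being the head of token i.
        scores = self.arc_scorer(dep.reshape(-1, dim),
                                 head.reshape(-1, dim)).view(batch, seq, seq)
        return scores

# Naive decoding: arg-max head per token (a real parser would decode a valid
# tree, e.g. with a maximum spanning tree algorithm).
model = ToyGraphParser(vocab_size=10_000)
ids = torch.randint(0, 10_000, (1, 6))     # a fake 6-token sentence
predicted_heads = model(ids).argmax(dim=-1)  # (1, 6) predicted head indices
```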

We implement a parser based on this architecture and apply transfer learning techniques to address issues associated with low-resource languages. Our parser exceeds the accuracy of state-of-the-art parsers on languages with limited training resources by a considerable margin.
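The exact transfer learning recipe is not spelled out in the abstract. One standard pattern, sketched below purely under that assumption, is to train an encoder on a high-resource treebank and reuse its weights to initialise and fine-tune a parser for the low-resource language.

```python
import torch
import torch.nn as nn

# Hypothetical cross-lingual transfer sketch; names and shapes are illustrative,
# not this paper's actual setup.

def make_encoder(emb_dim=100, hidden_dim=200):
    return nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)

# 1) Source-language encoder trained on plentiful data (training loop omitted).
source_encoder = make_encoder()
# ... train source_encoder on the high-resource treebank ...
torch.save(source_encoder.state_dict(), "source_encoder.pt")

# 2) Target-language encoder starts from the source encoder's weights.
target_encoder = make_encoder()
target_encoder.load_state_dict(torch.load("source_encoder.pt"))

# 3) Fine-tune on the small target treebank, typically with a reduced learning
#    rate so the pretrained weights are not overwritten too quickly.
optimizer = torch.optim.Adam(target_encoder.parameters(), lr=1e-4)
```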

We present promising results for solving core problems in natural language parsing, while also achieving state-of-the-art accuracy on general parsing tasks.