

Poster

Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks

Yikang Shen · Shawn Tan · Alessandro Sordoni · Aaron Courville

Great Hall BC #13

Keywords: [ deep learning ] [ natural language processing ] [ recurrent neural networks ] [ language modeling ]


Abstract:

Natural language is hierarchically structured: smaller units (e.g., phrases) are nested within larger units (e.g., clauses). When a larger constituent ends, all of the smaller constituents that are nested within it must also be closed. While the standard LSTM architecture allows different neurons to track information at different time scales, it does not have an explicit bias towards modeling a hierarchy of constituents. This paper proposes to add such a constraint to the system by "ordering" the neurons; a vector of "master" input and forget gates ensures that when a given neuron is updated, all of the neurons that follow it in the ordering are also updated. Our novel RNN unit, ON-LSTM, achieves good performance on four different tasks: language modeling, unsupervised parsing, targeted syntactic evaluation, and logical inference.
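The sketch below illustrates the master-gate idea from the abstract: it assumes a cumulative-softmax ("cumax") activation so that the master forget gate is monotonically increasing and the master input gate is monotonically decreasing across the neuron ordering, and all variable names are illustrative placeholders rather than the authors' code.

```python
import torch
import torch.nn.functional as F

def cumax(logits, dim=-1):
    """Cumulative softmax: a monotonically increasing vector in [0, 1].

    Once the gate rises toward 1 at some position, it stays high for
    every later position in the ordering.
    """
    return torch.cumsum(F.softmax(logits, dim=dim), dim=dim)

# Toy single-step update showing how the master gates couple neurons
# in the ordering (hypothetical names, not the paper's implementation).
hidden = 8
cell      = torch.randn(1, hidden)   # previous cell state
candidate = torch.randn(1, hidden)   # candidate update
f, i      = torch.sigmoid(torch.randn(2, 1, hidden))  # standard LSTM gates

master_forget = cumax(torch.randn(1, hidden))        # increasing: high-order neurons keep old memory
master_input  = 1.0 - cumax(torch.randn(1, hidden))  # decreasing: low-order neurons take new input
overlap = master_forget * master_input               # region where old and new information mix

# Inside the overlap the standard gates decide; outside it, updating a
# neuron forces every lower-ordered neuron to be rewritten as well,
# which is the hierarchy bias described in the abstract.
eff_forget = f * overlap + (master_forget - overlap)
eff_input  = i * overlap + (master_input - overlap)
new_cell   = eff_forget * cell + eff_input * candidate
print(new_cell.shape)  # torch.Size([1, 8])
```

Running this with random inputs simply demonstrates the gating arithmetic; in the full model the gate logits are produced from the input and previous hidden state at every time step.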
