
Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum

Omer Levy, Kenton Lee, Nicholas FitzGerald, Luke Zettlemoyer


Abstract
LSTMs were introduced to combat vanishing gradients in simple RNNs by augmenting them with gated additive recurrent connections. We present an alternative view to explain the success of LSTMs: the gates themselves are versatile recurrent models that provide more representational power than previously appreciated. We do this by decoupling the LSTM’s gates from the embedded simple RNN, producing a new class of RNNs where the recurrence computes an element-wise weighted sum of context-independent functions of the input. Ablations on a range of problems demonstrate that the gating mechanism alone performs as well as an LSTM in most settings, strongly suggesting that the gates are doing much more in practice than just alleviating vanishing gradients.
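To make the abstract's central object concrete, here is a minimal sketch (not the authors' implementation) of the decoupled recurrence it describes: the content vector x̃_t = tanh(W̃ x_t) is a context-independent function of the input, and the memory cell is updated as c_t = f_t ∘ c_{t-1} + i_t ∘ x̃_t. The weight matrices W_tilde, W_i, and W_f are illustrative placeholders; biases are omitted, and for simplicity the gates here condition on the current input alone, whereas the paper also considers variants whose gates see the previous hidden state.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_weighted_sum(X, W_tilde, W_i, W_f):
    """Run the decoupled recurrence over a sequence X of input vectors.

    Content vectors depend only on the current input (no embedded
    simple RNN), while the gates dynamically weight a running
    element-wise sum of those content vectors.
    """
    d = W_tilde.shape[0]
    c = np.zeros(d)
    states = []
    for x in X:
        x_tilde = np.tanh(W_tilde @ x)     # context-independent content
        i_gate = sigmoid(W_i @ x)          # input gate (input-only, simplified)
        f_gate = sigmoid(W_f @ x)          # forget gate (input-only, simplified)
        c = f_gate * c + i_gate * x_tilde  # gated element-wise weighted sum
        states.append(c)
    return np.stack(states)

# Toy usage with random parameters: 5 steps of 3-dim inputs, 4-dim memory.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
W_tilde, W_i, W_f = (rng.standard_normal((4, 3)) for _ in range(3))
print(gated_weighted_sum(X, W_tilde, W_i, W_f).shape)  # (5, 4)
```

Unrolling the update makes the title's claim visible: c_t = Σ_{j=1}^{t} (i_j ∘ ∏_{k=j+1}^{t} f_k) ∘ x̃_j, so each cell state is an element-wise weighted sum of context-independent content vectors, with the weights computed dynamically by the gates.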
Anthology ID:
P18-2116
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
732–739
URL:
https://aclanthology.org/P18-2116
DOI:
10.18653/v1/P18-2116
Cite (ACL):
Omer Levy, Kenton Lee, Nicholas FitzGerald, and Luke Zettlemoyer. 2018. Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 732–739, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum (Levy et al., ACL 2018)
PDF:
https://aclanthology.org/P18-2116.pdf
Video:
https://aclanthology.org/P18-2116.mp4