CKY-based Convolutional Attention for Neural Machine Translation

Taiki Watanabe, Akihiro Tamura, Takashi Ninomiya


Abstract
This paper proposes a new attention mechanism for neural machine translation (NMT) based on convolutional neural networks (CNNs), inspired by the CKY algorithm. The proposed attention represents every possible combination of source words (e.g., phrases and structures) through CNNs, imitating the CKY table used in the algorithm. An NMT model incorporating the proposed attention decodes a target sentence on the basis of attention scores over the CNN hidden states. The proposed attention enables NMT to capture alignments to underlying structures of a source sentence without sentence parsing. Evaluations on the Asian Scientific Paper Excerpt Corpus (ASPEC) English-Japanese translation task show that the proposed attention gains 0.66 BLEU points.
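The sketch below illustrates the general idea described in the abstract, not the authors' implementation: encoder states form the bottom row of a CKY-like table, a convolution over adjacent cells fills each successively shorter row (spans one word longer), and the decoder attends over all cells. It assumes PyTorch; the class name `CKYConvAttention`, the single shared width-2 convolution, and all dimensions are illustrative simplifications.

```python
# Minimal sketch of CKY-style convolutional attention (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class CKYConvAttention(nn.Module):
    """Builds a CKY-like table of span representations with a width-2 convolution,
    then attends over every cell when computing a context vector."""

    def __init__(self, hidden_size):
        super().__init__()
        # Simplification: one convolution shared across table rows. It combines two
        # adjacent cells of the previous row, mirroring how a CKY cell covers a
        # longer span built from two shorter ones.
        self.combine = nn.Conv1d(hidden_size, hidden_size, kernel_size=2)
        self.score = nn.Linear(hidden_size, hidden_size, bias=False)

    def build_table(self, enc_states):
        # enc_states: (batch, src_len, hidden) -- the bottom row of the table.
        rows = [enc_states]
        current = enc_states
        while current.size(1) > 1:
            # Conv over adjacent cells yields the next, shorter row:
            # representations of source spans one word longer.
            current = torch.tanh(
                self.combine(current.transpose(1, 2)).transpose(1, 2)
            )
            rows.append(current)
        # Flatten all cells: every contiguous span of the source sentence.
        return torch.cat(rows, dim=1)  # (batch, n_cells, hidden)

    def forward(self, dec_state, enc_states):
        # dec_state: (batch, hidden). Attention is computed over all table cells.
        cells = self.build_table(enc_states)
        scores = torch.bmm(self.score(cells), dec_state.unsqueeze(2))  # (batch, n_cells, 1)
        weights = F.softmax(scores.squeeze(2), dim=1)
        context = torch.bmm(weights.unsqueeze(1), cells).squeeze(1)    # (batch, hidden)
        return context, weights


if __name__ == "__main__":
    batch, src_len, hidden = 2, 5, 8
    attn = CKYConvAttention(hidden)
    ctx, w = attn(torch.randn(batch, hidden), torch.randn(batch, src_len, hidden))
    print(ctx.shape, w.shape)  # torch.Size([2, 8]) torch.Size([2, 15])  (15 = 5+4+3+2+1 cells)
```

For a source sentence of length n, the table holds n(n+1)/2 cells, so the attention distribution covers every contiguous span rather than only individual words.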
Anthology ID:
I17-2001
Volume:
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2017
Address:
Taipei, Taiwan
Editors:
Greg Kondrak, Taro Watanabe
Venue:
IJCNLP
Publisher:
Asian Federation of Natural Language Processing
Pages:
1–6
URL:
https://aclanthology.org/I17-2001
Cite (ACL):
Taiki Watanabe, Akihiro Tamura, and Takashi Ninomiya. 2017. CKY-based Convolutional Attention for Neural Machine Translation. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 1–6, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
CKY-based Convolutional Attention for Neural Machine Translation (Watanabe et al., IJCNLP 2017)
PDF:
https://aclanthology.org/I17-2001.pdf
Data
ASPEC