{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,8,7]],"date-time":"2024-08-07T13:10:33Z","timestamp":1723036233602},"publisher-location":"California","reference-count":0,"publisher":"International Joint Conferences on Artificial Intelligence Organization","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2021,8]]},"abstract":"Stylized neural machine translation (NMT) aims to translate sentences of one style into sentences of another style, which is essential for the application of machine translation in a real-world scenario. However, a major challenge in this task is the scarcity of high-quality parallel data which is stylized paired. To address this problem, we propose an iterative dual knowledge transfer framework that utilizes informal training data of machine translation and formality style transfer data to create large-scale stylized paired data, for the training of stylized machine translation model. Specifically, we perform bidirectional knowledge transfer between translation model and text style transfer model iteratively through knowledge distillation. Then, we further propose a data-refinement module to process the noisy synthetic parallel data generated during knowledge transfer. Experiment results demonstrate the effectiveness of our method, achieving an improvement over the existing best model by 5 BLEU points on MTFC dataset. Meanwhile, extensive analyses illustrate our method can also improve the accuracy of formality style transfer.<\/jats:p>","DOI":"10.24963\/ijcai.2021\/547","type":"proceedings-article","created":{"date-parts":[[2021,8,11]],"date-time":"2021-08-11T11:00:49Z","timestamp":1628679649000},"page":"3971-3977","source":"Crossref","is-referenced-by-count":1,"title":["Improving Stylized Neural Machine Translation with Iterative Dual Knowledge Transfer"],"prefix":"10.24963","author":[{"given":"Xuanxuan","family":"Wu","sequence":"first","affiliation":[{"name":"Beijing Jiaotong University"}]},{"given":"Jian","family":"Liu","sequence":"additional","affiliation":[{"name":"Beijing Jiaotong University"}]},{"given":"Xinjie","family":"Li","sequence":"additional","affiliation":[{"name":"Global Tone Communication Technology Co., Ltd."}]},{"given":"Jinan","family":"Xu","sequence":"additional","affiliation":[{"name":"Beijing Jiaotong University"}]},{"given":"Yufeng","family":"Chen","sequence":"additional","affiliation":[{"name":"Beijing Jiaotong University"}]},{"given":"Yujie","family":"Zhang","sequence":"additional","affiliation":[{"name":"Beijing Jiaotong University"}]},{"given":"Hui","family":"Huang","sequence":"additional","affiliation":[{"name":"Beijing Jiaotong University"}]}],"member":"10584","event":{"number":"30","sponsor":["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],"acronym":"IJCAI-2021","name":"Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}","start":{"date-parts":[[2021,8,19]]},"theme":"Artificial Intelligence","location":"Montreal, Canada","end":{"date-parts":[[2021,8,27]]}},"container-title":["Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence"],"original-title":[],"deposited":{"date-parts":[[2021,8,11]],"date-time":"2021-08-11T11:03:59Z","timestamp":1628679839000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.ijcai.org\/proceedings\/2021\/547"}},"subtitle":[],"proceedings-subject":"Artificial Intelligence Research 
Articles","short-title":[],"issued":{"date-parts":[[2021,8]]},"references-count":0,"URL":"https:\/\/doi.org\/10.24963\/ijcai.2021\/547","relation":{},"subject":[],"published":{"date-parts":[[2021,8]]}}}