In this work, we study semi-supervised semantic parsing under a multi-task learning framework to alleviate the limited performance caused by scarce annotated data. We propose two novel strategies to leverage unlabeled natural language utterances. The first takes entity predicate sequences as training targets to enhance representation learning. The second extends Mean Teacher to the seq2seq model and generates more target-side data to improve the generalizability of the decoder network. Unlike the original Mean Teacher, our strategy produces hard targets for the student decoder and updates only the decoder weights instead of the whole model. Experiments demonstrate that our proposed methods significantly outperform the supervised baseline and achieve larger improvements than previous methods.
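The decoder-only Mean Teacher variant described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parameter names, the `ema_update_decoder` helper, and the decay value are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of a decoder-only Mean Teacher update.
# Assumes model weights are stored in flat dicts keyed by parameter name,
# with decoder parameters prefixed by "decoder." (illustrative convention).

def ema_update_decoder(teacher, student, decay=0.99):
    """Update ONLY the teacher's decoder weights as an exponential moving
    average of the student's decoder weights; all other weights (e.g. the
    encoder) are left untouched, unlike the original whole-model update."""
    for name, w in student.items():
        if name.startswith("decoder."):
            teacher[name] = decay * teacher[name] + (1.0 - decay) * w
    return teacher

def hard_targets(logit_steps):
    """Convert teacher output scores into hard (argmax) token targets for
    the student decoder, instead of the soft targets used by the original
    Mean Teacher."""
    return [max(range(len(step)), key=step.__getitem__)
            for step in logit_steps]

# Toy example: one encoder weight and one decoder weight per model.
teacher = {"encoder.w": 1.0, "decoder.w": 0.0}
student = {"encoder.w": 2.0, "decoder.w": 1.0}
teacher = ema_update_decoder(teacher, student, decay=0.9)

# Teacher scores for a two-token output sequence over a 3-token vocabulary.
targets = hard_targets([[0.1, 0.7, 0.2], [0.5, 0.3, 0.2]])
```

The key point the sketch makes concrete is that the encoder weight of the teacher stays fixed while only its decoder weight moves toward the student's, and that the pseudo-targets handed to the student are discrete token indices rather than probability distributions.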