Sentence-Level Agreement For Neural Machine Translation


Multi-Head Attention with Disagreement Regularization. Jian Li, Zhaopeng Tu, Baosong Yang, Michael R. Lyu, Tong Zhang. EMNLP 2018. Paper (the output-disagreement idea is sketched after this list)
Machine translation is a very interesting research area! I hope you find something here that is useful to you. If you are interested in machine translation and have any questions, feel free to contact me!
Asynchronous Bidirectional Decoding for Neural Machine Translation. Xiangwen Zhang, Jinsong Su, Yue Qin, Yang Liu, Rongrong Ji, Hongji Wang. AAAI 2018. Paper
Imitation Learning for Non-Autoregressive Neural Machine Translation. Bingzhen Wei, Mingxuan Wang, Hao Zhou, Junyang Lin, Xu Sun. ACL 2019. Paper
Modeling Coherence for Neural Machine Translation with Dynamic and Topic Caches. Shaohui Kuang, Deyi Xiong, Weihua Luo, Guodong Zhou. COLING 2018. Paper
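
The disagreement-regularization entry above adds an auxiliary term that pushes attention heads apart during training. Below is a minimal PyTorch sketch of the output-disagreement flavour only, assuming the per-head outputs are available as one tensor; the `output_disagreement_penalty` helper and the weighting of the term are illustrative assumptions, not the paper's exact formulation (which also covers subspace and attended-position disagreement).

```python
import torch
import torch.nn.functional as F

def output_disagreement_penalty(head_outputs: torch.Tensor) -> torch.Tensor:
    """Mean pairwise cosine similarity between attention heads.

    head_outputs: [batch, heads, seq_len, dim], one output per head.
    Adding this (times a small weight) to the NLL loss discourages heads
    from producing near-identical representations, which is the gist of
    output-level disagreement regularization.
    """
    b, h, t, d = head_outputs.shape
    x = F.normalize(head_outputs.reshape(b, h, t * d), dim=-1)  # one unit vector per head
    sim = torch.matmul(x, x.transpose(1, 2))                    # [batch, heads, heads]
    off_diag = sim.sum(dim=(1, 2)) - sim.diagonal(dim1=1, dim2=2).sum(-1)
    return (off_diag / (h * (h - 1))).mean()

# loss = nll_loss + disagreement_weight * output_disagreement_penalty(head_outputs)
```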

Leveraging Local and Global Patterns for Self-Attention Networks. Mingzhou Xu, Derek F. Wong, Baosong Yang, Yue Zhang, Lidia S. Chao. ACL 2019. Paper Code
Parameter Sharing Methods for Multilingual Self-Attentional Translation Models. Devendra Singh Sachan, Graham Neubig. WMT 2018. Paper
Weighted Transformer Network for Machine Translation. Karim Ahmed, Nitish Shirish Keskar, Richard Socher. arXiv 2017. Paper (the weighted head combination is sketched below)
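
The Weighted Transformer entry above replaces the usual concatenate-and-project combination of attention heads with learned branch weights. A rough PyTorch sketch of that weighted combination follows; the paper's full branched architecture (separate kappa and alpha weights around per-branch feed-forward layers) is not reproduced here, and `WeightedHeadCombine` is a hypothetical name used only for illustration.

```python
import torch
import torch.nn as nn

class WeightedHeadCombine(nn.Module):
    """Combine per-head outputs with learned, softmax-normalized weights
    instead of the usual concatenate-and-project step. Only the weighted
    combination idea is shown; the branched feed-forward layers are omitted.
    """
    def __init__(self, num_heads: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_heads))

    def forward(self, head_outputs: torch.Tensor) -> torch.Tensor:
        # head_outputs: [batch, heads, seq_len, dim]
        alpha = torch.softmax(self.logits, dim=0)            # weights sum to 1
        return torch.einsum("h,bhtd->btd", alpha, head_outputs)
```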

You May Not Need Attention. Ofir Press, Noah A. Smith. arXiv 2018. Paper
Multi-Source Neural Translation. Barret Zoph, Kevin Knight. NAACL 2016. Paper (the basic two-encoder state combination is sketched below)
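
For the multi-source entry above, the gist is to run one encoder per source language and merge their final states before decoding. The sketch below shows a basic merge, assuming LSTM encoders: concatenate the hidden states, project with a tanh, and sum the cell states. `MultiSourceCombiner` and the shapes are assumptions made here for illustration.

```python
import torch
import torch.nn as nn

class MultiSourceCombiner(nn.Module):
    """Merge the final states of two source-language LSTM encoders.

    Basic combination: concatenate hidden states, project with tanh,
    and sum the cell states; the result initializes the decoder LSTM.
    """
    def __init__(self, hidden_size: int):
        super().__init__()
        self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, h1, c1, h2, c2):
        h = torch.tanh(self.proj(torch.cat([h1, h2], dim=-1)))  # combined hidden state
        c = c1 + c2                                             # combined cell state
        return h, c
```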

Sequence to Sequence Learning with Neural Networks. Ilya Sutskever, Oriol Vinyals, Quoc V. Le. NIPS 2014. Paper
Exploiting Sentential Context for Neural Machine Translation. Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi. ACL 2019. Paper
Transformers without Tears: Improving the Normalization of Self-Attention. Toan Q. Nguyen, Julian Salazar. IWSLT 2019. Paper (ScaleNorm is sketched below)
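
The "Transformers without Tears" entry above proposes ScaleNorm, which swaps LayerNorm for a single learned scale applied to the l2-normalized activations. A small PyTorch sketch follows, using the sqrt(d) initialization described in the paper; the `eps` clamp is an assumption added here for numerical stability.

```python
import torch
import torch.nn as nn

class ScaleNorm(nn.Module):
    """ScaleNorm: a single learned scale g applied to the l2-normalized
    activation, used in place of LayerNorm. g starts at sqrt(d_model);
    eps guards against division by zero.
    """
    def __init__(self, d_model: int, eps: float = 1e-5):
        super().__init__()
        self.g = nn.Parameter(torch.tensor(float(d_model) ** 0.5))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        norm = x.norm(dim=-1, keepdim=True).clamp(min=self.eps)
        return self.g * x / norm
```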

Convolutional Sequence to Sequence Learning. Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, Yann N. Dauphin. ICML 2017. Paper Code (a GLU convolution block is sketched below)
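
The convolutional sequence-to-sequence entry builds its encoder and decoder from convolutional blocks gated by gated linear units (GLU) with residual connections. Below is a simplified PyTorch sketch of one such block; the decoder-side causal padding and the attention over encoder states are omitted, so treat it as an illustration rather than the full architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvGLUBlock(nn.Module):
    """One encoder-style block in the convolutional seq2seq spirit:
    a 1-D convolution producing 2*d channels, a gated linear unit
    (GLU), and a residual connection.
    """
    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(d_model, 2 * d_model, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, seq_len, d_model]
        y = self.conv(x.transpose(1, 2))      # [batch, 2*d_model, seq_len]
        y = F.glu(y, dim=1)                   # split channels, gate with sigmoid
        return x + y.transpose(1, 2)          # residual connection
```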