Deep Learning in Machine Translation
Neural machine translation (NMT), which aims to translate natural languages using neural networks, has attracted intense attention in recent years. Unlike traditional statistical methods that rely on manual feature engineering, NMT learns representations directly from data and captures long-distance dependencies via gating and attention mechanisms. This talk will introduce recent advances in NMT and our recent work on knowledge-guided, interpretable NMT for low-resource languages. The talk closes with a discussion of the challenges and future directions of NMT.
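For readers unfamiliar with the attention mechanism mentioned above, the following is a minimal illustrative sketch (not the speaker's specific model): for each target position, a query vector scores every source position, and a softmax over those scores produces a weighted context vector, which is how the model attends to distant source words directly. All names here are hypothetical.

```python
# Illustrative sketch of (scaled dot-product) attention. A query attends
# over a sequence of key/value vectors, so distant source positions can
# influence the output directly rather than through a long recurrent chain.
import numpy as np

def attention(query, keys, values):
    """Return the attention-weighted sum of values for one query.

    query:  (d,)    -- e.g. a decoder hidden state
    keys:   (n, d)  -- e.g. encoder hidden states
    values: (n, d)  -- often the same vectors as the keys
    """
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)       # (n,) similarity scores
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values, weights         # context vector, alignment

# Toy example: with one-hot keys, a query equal to the 3rd key gets the
# largest alignment weight, pulling the context toward the 3rd value row.
keys = np.eye(4)                        # 4 source positions, d = 4
values = np.arange(16.0).reshape(4, 4)  # arbitrary value vectors
context, weights = attention(keys[2], keys, values)
```

The alignment weights are what make attention-based NMT comparatively interpretable: inspecting them shows which source words the model used for each target word.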
About Dr. Yang LIU
Yang Liu is an Associate Professor in the Department of Computer Science and Technology, Tsinghua University. He received his PhD degree from the Institute of Computing Technology, Chinese Academy of Sciences, in 2007. His research focuses on natural language processing and machine translation. He has published over 40 papers in leading NLP/AI journals and conferences such as Computational Linguistics, ACL, AAAI, EMNLP, and COLING. He won the ACL 2017 Outstanding Paper Award and the COLING/ACL 2006 Meritorious Asian NLP Paper Award. He has served as Associate Editor of ACM TALLIP, ACL 2014 Tutorial Co-Chair, ACL 2015 Local Arrangement Co-Chair, ACL 2016 SRW Faculty Advisor, Area Chair for EMNLP 2016, ACL 2017, and ACL 2018, and SIGHAN Information Officer.