Xiaoyu Shen

Assistant Professor

xyshen@eitech.edu.cn

Background Information: 

Xiaoyu Shen obtained his bachelor's degree from Nanjing University in 2015. He received his PhD in 2021 from the Max Planck Institute for Informatics and Saarland University, under the supervision of Professors Gerhard Weikum and Dietrich Klakow. In September 2020, he joined Amazon Alexa AI as a Machine Learning Scientist, leading the product question-answering project for Alexa's intelligent customer service.


To date, he has authored more than 40 papers at top natural language processing conferences, which have been cited over 2,100 times, with an h-index of 20 and an i10-index of 32. He has received several awards, including Outstanding Graduate of Nanjing University, an Outstanding Bachelor's Thesis award, and the Outstanding Self-Financed Overseas Student award. The annotation framework he designed for low-resource text generation received the Best Demo Paper Award at COLING 2020, and his paper examining the effectiveness of weakly supervised learning received the Best Theme Paper Award at ACL 2023.


He also serves as a committee member for several top conferences and journals, including ACL, EMNLP, NAACL, AAAI, and TOIS. He is an Area Chair for the Question Answering track at ACL and a Topic Editor for "High-Performance Computing for AI in the Big Model Era."


Research Field:

The recent rapid development of LLMs has shown that model performance improves steadily as model and data sizes grow. Moving forward, my research interests focus on the following three areas, exploring how to build usable and reliable large models:

(1) Interpretability: Investigating the path from large models as probabilistic answer generators to logical reasoners.

(2) Cross-lingual Generalization: Enabling large models to transition from being experts centered around English to multilingual experts.

(3) Domain Specialization: Guiding large models to shift from the general domain to specific domains and exploring systematic methods for these models to rapidly acquire domain-specific knowledge.


Educational Background:

2015-2021: PhD (NLP), Department of Computer Science, Saarland University / Max Planck Institute for Informatics

2011-2015: Bachelor (Software Engineering), Software Institute, Nanjing University


Work Experience:

2020-2023: Machine Learning Scientist at Amazon Alexa AI


Academic Experience:

2018/5-2018/9: Visiting Scholar, RIKEN AIP

2016/9-2017/1: Visiting Scholar, University of Tokyo / NII


Academic Service (Selected):

2022/11-present: Topic Editor, "High-Performance Computing for AI in the Big Model Era"

2023/1-2023/7: Area Chair, Question Answering track, ACL 2023


Awards and Honors:

  • ACL 2023 Special Theme Paper Award

  • Outstanding Self-Financed Overseas Student Award (2020)

  • COLING 2020 Best Demo Paper Award

  • PhD Fellowship from Max Planck Society

  • Outstanding Bachelor's Thesis, Nanjing University

  • Outstanding Graduate of Nanjing University

  • National Scholarship


Representative Works:

General Information

More than 40 papers in top AI conferences


Works Information and Citation Data

Google Scholar:

http://scholar.google.com/citations?hl=en&user=BWfPrE4AAAAJ


10 Representative Works (* denotes the corresponding author)

  1. Shen, Xiaoyu*, Akari Asai, Bill Byrne, and Adria De Gispert. "xPQA: Cross-Lingual Product Question Answering in 12 Languages." In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track), pp. 103-115. 2023.

  2. Zhu, Dawei, Xiaoyu Shen*, Marius Mosbach, Andreas Stephan, and Dietrich Klakow. "Weaker Than You Think: A Critical Look at Weakly Supervised Learning." In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 14229-14253. 2023.

  3. Tang, Ze, Xiaoyu Shen*, Chuanyi Li, Jidong Ge, Liguo Huang, Zhelin Zhu, and Bin Luo. "AST-Trans: Code Summarization with Efficient Tree-Structured Attention." In Proceedings of the 44th International Conference on Software Engineering, pp. 150-162. 2022.

  4. Su, Hui, Weiwei Shi, Xiaoyu Shen*, Zhou Xiao, Tuo Ji, Jiarui Fang, and Jie Zhou. "RoCBert: Robust Chinese BERT with Multimodal Contrastive Pretraining." In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 921-931. 2022.

  5. Chang, Ernie, Xiaoyu Shen*, Hui-Syuan Yeh, and Vera Demberg. "On Training Instance Selection for Few-Shot Neural Text Generation." In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pp. 8-13. 2021.

  6. Su, Hui, Xiaoyu Shen*, Zhou Xiao, Zheng Zhang, Ernie Chang, Cheng Zhang, Cheng Niu, and Jie Zhou. "MovieChats: Chat Like Humans in a Closed Domain." In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 6605-6619. 2020.

  7. Shen, Xiaoyu*, Ernie Chang, Hui Su, Cheng Niu, and Dietrich Klakow. "Neural Data-to-Text Generation via Jointly Learning the Segmentation and Correspondence." In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 7155-7165. 2020.

  8. Shen, Xiaoyu*, Yang Zhao, Hui Su, and Dietrich Klakow. "Improving Latent Alignment in Text Summarization by Generalizing the Pointer Generator." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 3762-3773. 2019.

  9. Shen, Xiaoyu*, Jun Suzuki, Kentaro Inui, Hui Su, Dietrich Klakow, and Satoshi Sekine. "Select and Attend: Towards Controllable Content Selection in Text Generation." In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 579-590. 2019.

  10. Shen, Xiaoyu*, Hui Su, Wenjie Li, and Dietrich Klakow. "Nexus Network: Connecting the Preceding and the Following in Dialogue Generation." In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pp. 4316-4327. 2018.