Cross-lingual transfer learning


Speaker: Sebastian Ruder (DeepMind)

Date and Time: 10am CT, October 16, 2020

Abstract:

Research in natural language processing (NLP) has seen striking advances in recent years, mainly driven by large pretrained language models. However, most of these approaches still require fine-tuning on large labelled datasets, which has constrained their success to languages where such data is plentiful. In this talk, I will give an overview of approaches that transfer knowledge across languages and enable us to scale NLP models to more of the world's 7,000 languages. I will cover state-of-the-art methods, what we know about them so far, novel benchmarks in this area, and promising future research directions.

Bio:

Sebastian Ruder is a research scientist on the Language team at DeepMind, London. He completed his PhD in Natural Language Processing and Deep Learning at the Insight Research Centre for Data Analytics, while working as a research scientist at the Dublin-based text analytics startup AYLIEN. Previously, he studied Computational Linguistics at the University of Heidelberg, Germany, and at Trinity College, Dublin. He is interested in transfer learning for NLP and in making ML and NLP more accessible.