Transfer learning in NLP
Over the last two years, the field of Natural Language Processing (NLP) has witnessed the emergence of transfer learning methods and architectures that have significantly improved upon the state of the art on nearly every NLP task.
The wide availability and ease of integration of these transfer learning models are strong indicators that these methods will become a common tool in the NLP landscape as well as a major research direction.
In this talk, I'll present a quick overview of modern transfer learning methods in NLP and review examples and case studies of how these models can be integrated and adapted to downstream NLP tasks, focusing on open-source solutions.
- Thomas is Co-founder and Chief Science Officer of Hugging Face, where his team is on a mission to advance and democratize NLP for everyone
- Prior to Hugging Face, Thomas earned a Ph.D. in quantum physics and later a law degree
- He worked as a European Patent Attorney for 5 years, helping a portfolio of startups and large companies build and defend their intellectual property assets
- Twitter, GitHub, Medium, Web page