
The Seven Trends in Machine Translation for 2019

NLP People

Hundreds of researchers, students, recruiters, and business professionals came to Brussels this November to learn about recent advances in computational linguistics and Natural Language Processing (NLP) and to share their own findings. So, what’s new in the world of machine translation, and what can we expect in 2019?


Explosion in 2019: Our Year in Review

Explosion

As 2019 draws to a close and we step into the 2020s, we thought we’d take a look back at the year and all we’ve accomplished. ✨ Feb 18: Prodigy v1.7.0 was released – our first major upgrade to Prodigy for 2019. Sep 15: Adriane Boyd becomes the second spaCy developer team hire in 2019.


2022: We reviewed this year’s AI breakthroughs

Applied Data Science

In our review of 2019 we talked a lot about reinforcement learning and Generative Adversarial Networks (GANs); in 2020 we focused on Natural Language Processing (NLP) and algorithmic bias; in 2021 Transformers stole the spotlight. Just wait until you hear what happened in 2022.


The State of Multilingual AI

Sebastian Ruder

Research models such as BERT and T5 have become much more accessible, while the latest generation of language and multi-modal models are demonstrating increasingly powerful capabilities.
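As a rough illustration of that accessibility, the sketch below loads a public T5 checkpoint through the Hugging Face transformers library and runs a single translation prompt. The toolkit and the "t5-small" checkpoint are illustrative assumptions, not something the post itself prescribes.

```python
# Minimal sketch: loading a public T5 checkpoint with Hugging Face transformers.
# The library choice and the "t5-small" checkpoint are illustrative assumptions.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 is trained with task prefixes; translation is one of its built-in tasks.
inputs = tokenizer(
    "translate English to German: The house is wonderful.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```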


The State of Transfer Learning in NLP

Sebastian Ruder

This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP, building on the pretrained language models (Peters et al.) of recent years. A taxonomy that highlights the variations can be seen below: a taxonomy for transfer learning in NLP (Ruder, 2019). Update 16.10.2020: Added Chinese and Spanish translations.
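To make the "pretrain, then adapt" recipe concrete, here is a minimal sketch of sequential transfer learning: a pretrained BERT encoder is fine-tuned on a downstream sentence-classification task. The library (Hugging Face transformers), the checkpoint, and the toy labels are assumptions for illustration, not part of the tutorial itself.

```python
# Sketch of sequential transfer learning: fine-tune a pretrained encoder on a
# downstream classification task. Library, checkpoint, and data are assumptions.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["a thoughtful, provocative film", "a tedious, unwatchable mess"]
labels = torch.tensor([1, 0])  # hypothetical sentiment labels

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps on the downstream task
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The pretrained encoder supplies general-purpose representations; only the small classification head and a light fine-tuning pass are task-specific, which is the core idea the taxonomy organizes.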


Reward Isn't Free: Supervising Robot Learning with Language and Video from the Web

The Stanford AI Lab Blog

Devlin, J., et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Conference of the North American Chapter of the Association for Computational Linguistics. Liu, Y., et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach.


Multi-domain Multilingual Question Answering

Sebastian Ruder

Reading comprehension assumes a gold paragraph is provided. Standard approaches for reading comprehension build on pre-trained models such as BERT. Using BERT for reading comprehension involves fine-tuning it to predict a) whether a question is answerable and b) for each token, whether it is the start or the end of an answer span.
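A minimal sketch of that fine-tuning setup is below, assuming the Hugging Face transformers library and the SQuAD 2.0-style convention of pointing unanswerable questions at the [CLS] token; the post itself does not commit to a specific toolkit, and the question, context, and gold span here are made up for illustration.

```python
# Sketch: fine-tuning BERT for extractive QA. The model scores every token as a
# possible answer-span start and end; unanswerable questions are handled
# (SQuAD 2.0 style) by targeting the [CLS] token at position 0.
# Library, checkpoint, and example data are assumptions.
import torch
from transformers import BertTokenizerFast, BertForQuestionAnswering

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")

question = "Where was the treaty signed?"
context = "The treaty was signed in Brussels in 1948."
enc = tokenizer(question, context, return_tensors="pt")

# Hypothetical gold answer "Brussels": map its character offsets in the context
# (sequence_index=1) to token positions.
start_char = context.index("Brussels")
start_tok = enc.char_to_token(start_char, sequence_index=1)
end_tok = enc.char_to_token(start_char + len("Brussels") - 1, sequence_index=1)

outputs = model(
    **enc,
    start_positions=torch.tensor([start_tok]),
    end_positions=torch.tensor([end_tok]),
)
outputs.loss.backward()  # gradients for one fine-tuning step

# At inference, the argmax of the start/end logits gives the predicted span;
# a span that collapses onto [CLS] (position 0) is read as "unanswerable".
start_pred = outputs.start_logits.argmax(-1)
end_pred = outputs.end_logits.argmax(-1)
```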
