Commonsense Reasoning for Natural Language Processing

Probably Approximately a Scientific Blog

This long-overdue blog post is based on the Commonsense Tutorial taught by Maarten Sap, Antoine Bosselut, Yejin Choi, Dan Roth, and myself at ACL 2020. Figure 1: adversarial examples in computer vision (left) and natural language processing tasks (right). Image credit: Lin et al.

10 ML & NLP Research Highlights of 2019

Sebastian Ruder

This post gathers ten ML and NLP research directions that I found exciting and impactful in 2019. In biology, Transformer language models have been pretrained on protein sequences (Rives et al., 2019).

Explosion in 2019: Our Year in Review

Explosion

As 2019 draws to a close and we step into the 2020s, we thought we’d take a look back at the year and all we’ve accomplished. Our first major upgrade to Prodigy for 2019 was released. Jul 29: It was really nice to see Ines featured as the PyDev of the Week on the Mouse vs. Python blog at the end of the month.

The Seven Trends in Machine Translation for 2019

NLP People

Hundreds of researchers, students, recruiters, and business professionals came to Brussels this November to learn about recent advances in computational linguistics and Natural Language Processing (NLP) and to share their own findings. So, what’s new in the world of machine translation, and what can we expect in 2019?

spaCy IRL 2019: 2 days of NLP in Berlin

Explosion

We were pleased to invite the spaCy community and other folks working on Natural Language Processing to Berlin this summer for a small and intimate event.

AI trends in 2023: Graph Neural Networks

AssemblyAI

Top 50 keywords in submitted research papers at ICLR 2022 (source). A recent bibliometric study systematically analysed this research trend, revealing exponential growth in published research involving GNNs, with a striking +447% average annual increase over the period 2017-2019.

A Gentle Introduction to GPTs

Mlearning.ai

You don’t need a PhD to understand this billion-parameter language model. GPT is a general-purpose natural language processing model that revolutionized the landscape of AI. GPT-3 is an autoregressive language model created by OpenAI and released in 2020.
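The teaser above calls GPT-3 an autoregressive language model, meaning it generates text one token at a time, each conditioned on the tokens before it. As a rough, illustrative sketch (not code from the linked article), here is how that looks with the openly available GPT-2 model via Hugging Face transformers; the prompt and decoding settings are arbitrary example choices.

# Minimal sketch of autoregressive generation with a small, openly available
# GPT-style model (GPT-2 via Hugging Face transformers). Illustrative only;
# the prompt and decoding settings are arbitrary example values.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Natural language processing is"
inputs = tokenizer(prompt, return_tensors="pt")

# The model repeatedly predicts the next token and appends it to the input --
# that is what "autoregressive" means in practice.
outputs = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))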