
Modular Deep Learning

Sebastian Ruder

Fuelled by scaling laws, state-of-the-art models in machine learning have been growing larger and larger. This post gives a brief overview of modularity in deep learning; we give an in-depth overview in our survey on Modular Deep Learning, and for modular fine-tuning for NLP, check out our EMNLP 2022 tutorial.


NLP Landscape: Germany (Industry & Meetups)

NLP People

Are you looking to study or work in the field of NLP? For this series, NLP People will be taking a closer look at the NLP education & development landscape in different parts of the world, including the best sites for job-seekers and where you can go for the leading NLP-related education programs on offer.




ML and NLP Research Highlights of 2021

Sebastian Ruder

2021 saw many exciting advances in machine learning (ML) and natural language processing (NLP). In computer vision, supervised pre-trained models such as Vision Transformer [2] have been scaled up [3], and self-supervised pre-trained models have started to match their performance [4]. Why is it important?


All Languages Are NOT Created (Tokenized) Equal

Topbots

Language Disparity in Natural Language Processing: this digital divide in natural language processing (NLP) is an active area of research; 70% of research papers published in a computational linguistics conference only evaluated English [Square One Bias in NLP: Towards a Multi-Dimensional Exploration of the Research Manifold].


The State of Multilingual AI

Sebastian Ruder

At the same time, a wave of NLP startups has started to put this technology to practical use. I will be focusing on topics related to natural language processing (NLP) and African languages as these are the domains I am most familiar with. This post is partially based on a keynote I gave at the Deep Learning Indaba 2022.


Overcoming The Limitations Of Large Language Models

Topbots

Making the transition from classical language generation to recognising and responding to specific communicative intents is an important step towards better acceptance of user-facing NLP systems, especially in Conversational AI.


Reward Isn't Free: Supervising Robot Learning with Language and Video from the Web

The Stanford AI Lab Blog

Deep learning has enabled improvements in the capabilities of robots on a range of problems such as grasping [1] and locomotion [2] in recent years. Indeed, this recipe of massive, diverse datasets combined with scalable offline learning algorithms (e.g.