Primer: attention, transformers and the new architecture of language


Natural language is one of the most important challenges in AI, and the frontier has moved rapidly over the past few years. This primer will help you get to grips with the basic concepts in a (mostly) non-technical way. If you haven't already read our primer on representing data, you might want to do that first, as it explains how AI networks can be used to find patterns in data.

Language is unique. A given word carries many shades of meaning that only emerge from its place in a sentence and its relationships with the words around it. Meaning happens as language is actually used. The challenge for AI is to capture these relationships mathematically.

Traditional approaches to natural language processing assume that a word's meaning is stable across sentences. This is far from the case, so researchers have been searching for techniques that can capture more of the relevant context around a word.
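To make that difference concrete, here is a minimal Python sketch. The word vectors and the simple blending rule are invented purely for illustration and are not taken from any particular model: a static lookup gives "bank" the same vector in every sentence, while a context-aware representation gives it a different vector depending on its neighbours.

```python
import numpy as np

# Purely illustrative, hand-made word vectors (invented numbers, not from
# any trained model). In a static-embedding approach, "bank" gets exactly
# one vector, no matter which sentence it appears in.
static_embeddings = {
    "bank":  np.array([0.6, 0.1, 0.3]),
    "river": np.array([0.1, 0.9, 0.0]),
    "money": np.array([0.9, 0.0, 0.2]),
}

sentence_a = ["money", "bank"]   # financial sense of "bank"
sentence_b = ["river", "bank"]   # geographic sense of "bank"

def contextual(word, sentence):
    """Toy context-aware representation: blend the word's static vector with
    the average of its sentence. Real models are far more sophisticated, but
    the effect is the same in spirit: a word's representation changes with
    its context."""
    context = np.mean([static_embeddings[w] for w in sentence], axis=0)
    return 0.5 * static_embeddings[word] + 0.5 * context

bank_a = contextual("bank", sentence_a)
bank_b = contextual("bank", sentence_b)

# Static lookup: identical vector for "bank" in both sentences.
print(np.array_equal(static_embeddings["bank"], static_embeddings["bank"]))  # True
# Context-aware: the same word now has two different vectors.
print(np.allclose(bank_a, bank_b))  # False
```

The point of the sketch is only the contrast between the two behaviours; modern contextual models achieve it with learned mechanisms rather than a fixed averaging rule.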

A technique that, until recently, was state-of-the-art is to use an A...

