BERT brought big steps for NLP

28 January 2020

Posted by Marcel Dijkgraaf

Techhead / Trendwatcher

Progress of artificial intelligence in 2019

We have complained a lot about #NLP, the branch of #AI that delivers language understanding, and rightfully so.

We have laughed at the blatant errors of #chatbots and assistants.

But the steps made in 2018 and 2019 will quickly make us forget these early problems. One of those steps was the release of Google’s BERT, which stands for Bidirectional Encoder Representations from Transformers… they clearly did their best to come up with a catchy name after ELMo 😄

BERT was the best NLP model built so far, and it is huge… it was trained on more than 3 billion words and has 340 million parameters. It builds on a number of ideas from the AI community, among them Transformers.
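To get a feel for that scale, here is a minimal sketch (my own illustration, not part of the original post; it assumes the Hugging Face transformers library and PyTorch are installed) that loads the 340-million-parameter BERT-large model and counts its parameters:

```python
# Sketch: load pre-trained BERT-large and count its parameters.
# Assumes `pip install transformers torch` (Hugging Face library,
# not something referenced in the original post).
from transformers import BertModel

model = BertModel.from_pretrained("bert-large-uncased")
num_params = sum(p.numel() for p in model.parameters())
print(f"BERT-large parameters: {num_params / 1e6:.0f}M")  # roughly 340M
```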

These Transformers are the interesting bit: a Transformer learns the relation between a word and the other words in a sentence, so it has no problem understanding that the same word can mean totally different things in different contexts.
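To make that concrete, here is a minimal sketch (again my own example, assuming the Hugging Face transformers library and PyTorch; the two sentences about "bank" are made up for illustration) showing that BERT gives the same word different vectors depending on its context:

```python
# Sketch: the same word ("bank") gets a different contextual vector
# depending on the sentence it appears in.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    # Run the sentence through BERT and return the hidden state of `word`.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

v1 = word_vector("i deposited money at the bank", "bank")
v2 = word_vector("we sat on the grassy bank of the river", "bank")
sim = torch.cosine_similarity(v1, v2, dim=0).item()
print(f"cosine similarity: {sim:.2f}")
# Well below 1.0: the two "bank"s get clearly different representations.
```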

And the good AI news kept coming: after BERT came the cross-lingual XLM/mBERT from Facebook, which helps solve NLP issues for smaller languages. Not long after that, BERT was outperformed by XLNet. So, just like computing power, NLP will improve exponentially in the coming years. It has already become really good at generating text, so good that it is hard to spot the difference. You can give it a try with Adam Geitgey’s Fake News quiz; it fooled me.
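If you want to try generating text yourself, here is a minimal sketch (my own example, using GPT-2 through the Hugging Face pipeline API as a readily available stand-in; the post does not say which model powers the quiz) that produces a short continuation of a prompt:

```python
# Sketch: generate a text continuation with a pre-trained language model.
# GPT-2 is an assumption here, chosen because it is easy to run locally.
from transformers import pipeline, set_seed

set_seed(42)  # make the sample reproducible
generator = pipeline("text-generation", model="gpt2")
result = generator("Artificial intelligence will", max_length=40, num_return_sequences=1)
print(result[0]["generated_text"])
```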

So I think the future of NLP is exciting, and we are looking forward to all the great new developments of 2020 and the services we can build on top of them.
