Google open-sources mT5, a multilingual model trained on 101 languages

Not to be outdone by Facebook and Microsoft, both of which detailed cutting-edge machine learning language algorithms in late October, Google this week open-sourced a model called mT5 that the company claims achieves state-of-the-art results on a range of multilingual natural language processing benchmarks. mT5, a multilingual variant of Google's T5 model, was pretrained on a dataset covering 101 languages. It contains between 300 million and 13 billion parameters (variables internal to the model used to make predictions) and ostensibly has enough capacity to learn over 100 languages without significant "interference" effects.
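As for what "open-sourced" means in practice, the released checkpoints can be loaded with standard libraries. The snippet below is a minimal sketch assuming the Hugging Face Transformers port of the weights; the "google/mt5-small" identifier is the Hub name for the smallest (roughly 300 million parameter) checkpoint, not something named in this article. Note that, unlike T5, mT5 was pretrained purely on an unsupervised objective, so the raw model needs task-specific fine-tuning before it performs well on downstream tasks.

```python
# Minimal sketch: loading an mT5 checkpoint, assuming the Hugging Face
# Transformers port of Google's released weights ("google/mt5-small" is
# the Hub identifier, not a name taken from the article).
from transformers import AutoTokenizer, MT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# mT5 casts every task as text-to-text: feed a text prompt in, generate
# text out, in any of the 101 pretraining languages. The raw checkpoint
# was pretrained only on a span-corruption objective, so it should be
# fine-tuned on a downstream task before serious use.
inputs = tokenizer(
    "question: What is mT5? context: mT5 is a multilingual "
    "variant of T5 pretrained on 101 languages.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```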
