Yandex Publishes YaLM 100B

It’s the Largest GPT-Like Neural Network in Open Source.

Yandex has published a new large-scale transformer-based language model as open source, claiming it to be the largest GPT-like neural network ever made openly available to the developer community.

In a Medium article authored by Mikhail Khrushchev, a Senior Developer on the YaLM team, Yandex said:

We’ve been using the YaLM family of language models in our Alice voice assistant and Yandex Search for more than a year now. Today, we have made our largest-ever YaLM model, which leverages 100 billion parameters, available for free.

The full article, which covers how to access YaLM 100B, how it was trained, and the experience behind it, can be read here.
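For readers who want to experiment with the release, a minimal sketch of fetching the published code is shown below. It assumes the publicly listed github.com/yandex/YaLM-100B repository and a local Git installation; the clone-and-download flow here is an illustration, not Yandex's official setup, so refer to the linked article and repository README for the authoritative instructions.

```python
import subprocess
from pathlib import Path

# Assumed repository location for the open-sourced YaLM 100B release.
REPO_URL = "https://github.com/yandex/YaLM-100B"
TARGET = Path("YaLM-100B")

if not TARGET.exists():
    # Clone the published source code. Note that the 100-billion-parameter
    # checkpoint is distributed separately from the code and, at 16-bit
    # precision, occupies on the order of 200 GB of disk space.
    subprocess.run(["git", "clone", REPO_URL, str(TARGET)], check=True)

print(f"Repository checked out at {TARGET.resolve()}")
```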


Dan Taylor
Dan Taylor is an experienced SEO consultant who has worked with brands and companies on optimizing for Russia (and Yandex) for a number of years. He is the winner of the inaugural 2018 TechSEO Boost competition, webmaster of HreflangChecker.com and Sloth.Cloud, and founder of RussianSearchNews.com.