Webmaster Tools: Using The Robots.txt Analysis Tool

Much like Google Search Console, Yandex Webmaster provides a simple and useful tool for validating and testing URLs against your robots.txt file.

Yandex’s version also lets you view historical versions of the file and its change log, which is useful for troubleshooting and identifying issues that aren’t documented elsewhere.

[Screenshot: Yandex Webmaster – robots.txt analysis tool]

From this screen, you can view historical versions of the file from the drop-down (1) or download the current file (2).

When testing whether URLs will be blocked or allowed by the website’s robots.txt file, there is a limit of 100 URLs per request.

The results are straightforward: a red cross means the URL is blocked, and a green checkmark means it is allowed.
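If you want to replicate the same allowed/blocked check outside the interface, Python’s standard-library robotparser module can evaluate URLs against a live robots.txt file. The sketch below is a minimal example, assuming a hypothetical site and URL list; the 100-URL cap mirrors the Yandex Webmaster per-request limit rather than any limit in the library, and “YandexBot” is the name Yandex’s main crawler identifies itself as.

```python
# Minimal sketch: check a batch of URLs against a site's robots.txt
# using Python's standard library. The site and URLs below are
# hypothetical placeholders.
from urllib import robotparser

ROBOTS_URL = "https://example.com/robots.txt"  # hypothetical site
TEST_URLS = [
    "https://example.com/",
    "https://example.com/private/report.html",
]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

# Yandex Webmaster accepts at most 100 URLs per request; enforcing
# the same cap here keeps batches consistent with the tool.
for url in TEST_URLS[:100]:
    # "YandexBot" is the user-agent Yandex's main crawler uses.
    allowed = parser.can_fetch("YandexBot", url)
    print(f"{'allowed' if allowed else 'blocked':>7}  {url}")
```

Note that robotparser implements the standard robots.txt rules and does not understand Yandex-specific extensions such as Clean-param, so the tool in Yandex Webmaster remains the authoritative check for how Yandex itself will treat a URL.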

Dan Taylor
Dan Taylor is an experienced SEO consultant and has worked with brands and companies on optimizing for Russia (and Yandex) for a number of years. Winner of the inaugural 2018 TechSEO Boost competition, webmaster at HreflangChecker.com and Sloth.Cloud, and founder of RussianSearchNews.com.