robots.txt is a simple but dangerous file. One mistaken Disallow line can knock your entire site out of the index. This tool checks both syntax and actual behaviour for each user-agent: what Googlebot can crawl, what YandexBot can, and where they diverge.
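For instance, this innocuous-looking file tells every compliant crawler to stay away from the whole site:

```
User-agent: *
Disallow: /
```

One character matters here: the same directive with an empty value, `Disallow:`, blocks nothing at all.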
What we check
- Syntax: correct placement of User-agent, Allow, Disallow, Sitemap, Crawl-delay.
- Conflicting rules: cases where Allow and Disallow both match the same path, so the verdict depends on how a crawler resolves precedence (see the example after this list).
- Case sensitivity: robots.txt paths are case-sensitive, so /Page and /page are different rules.
- Validity of Sitemap links.
- Googlebot vs YandexBot differences: Yandex supports extra directives such as Clean-param, which Google ignores (a behaviour sketch follows below).
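Conflict resolution is where parsers genuinely disagree. Take this fragment (the /shop/ paths are purely illustrative):

```
User-agent: *
Allow: /shop/sale
Disallow: /shop/
```

Under the longest-match rule of RFC 9309, which Google and Yandex both apply, /shop/sale/today is crawlable because Allow: /shop/sale is the more specific rule. A parser that simply takes the first matching line in file order happens to agree here, but swapping the two lines would flip its answer while leaving the RFC 9309 verdict unchanged. That kind of order-dependent ambiguity is what the conflicting-rules check reports.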
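And here is a minimal sketch of the per-agent behaviour check, using Python's standard urllib.robotparser. The robots.txt body and paths are invented for illustration, and the stdlib parser predates RFC 9309, so treat this as a demonstration of the idea rather than the tool's actual engine:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body; a real run would fetch it with
# parser.set_url("https://example.com/robots.txt") and parser.read().
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /Drafts/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Per-agent, per-path verdicts. Paths are compared case-sensitively,
# so /Drafts/ and /drafts/ hit different rules.
for agent in ("Googlebot", "YandexBot"):
    for path in ("/private/data", "/Drafts/post", "/drafts/post"):
        print(f"{agent:10} {path:15} {parser.can_fetch(agent, path)}")

# Sitemap URLs declared in the file (Python 3.8+); None if absent.
print(parser.site_maps())
```

Note the divergence this surfaces: Googlebot matches only its own User-agent group, so it may fetch /private/data even though the * group forbids it, while YandexBot has no named group and falls through to the * rules.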