Fetch and analyze any website's robots.txt file. Test URL rules, view Sitemap directives, check crawl delays, and see which bots are blocked or allowed.
Robots.txt Tester
Enter a URL to check if it is allowed or blocked by the robots.txt rules.
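If you prefer to script the same checks, Python's standard urllib.robotparser module can fetch a robots.txt file and answer the same allow/block, crawl-delay, and Sitemap questions. A minimal sketch, assuming https://example.com and the user agent Googlebot as placeholders:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the robots.txt file (example.com is a placeholder site).
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Is this URL allowed for the given user agent?
print(rp.can_fetch("Googlebot", "https://example.com/private/report.html"))

# Crawl-delay declared for that user agent, if any (otherwise None).
print(rp.crawl_delay("Googlebot"))

# Sitemap directives listed in the file, if any (Python 3.8+; otherwise None).
print(rp.site_maps())
```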
About Robots.txt Tester
Frequently Asked Questions – Robots.txt Tester
What is robots.txt?
robots.txt is a plain-text file served at the root of your website (for example, https://example.com/robots.txt) that tells search engine crawlers which paths they may and may not crawl. It is a voluntary guideline that well-behaved crawlers follow, not an enforcement mechanism.
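A typical file combines User-agent groups with Disallow/Allow rules, plus optional Crawl-delay and Sitemap directives. The example below is purely illustrative; the paths and bot name are placeholders.

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Stricter rules for one specific bot
User-agent: ExampleBot
Disallow: /
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```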
Does blocking in robots.txt remove pages from Google?
No. Blocking a URL in robots.txt prevents Googlebot from crawling it, but the URL can still appear in search results (for example, if other pages link to it). To remove a page from search results, add a noindex robots meta tag or X-Robots-Tag header, and leave the page crawlable so Googlebot can actually see that directive.
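A minimal illustration of the noindex meta tag mentioned above (the X-Robots-Tag HTTP header is the equivalent for non-HTML resources):

```html
<!-- Placed in the page's <head>; the page must stay crawlable so Googlebot can read it -->
<meta name="robots" content="noindex">
```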