A robots.txt tester lets you verify that your robots.txt rules correctly allow or block specific URL paths for any crawler. Paste your robots.txt, choose a user-agent such as Googlebot, enter a URL path, and instantly see the result with the matching rule highlighted, so you know exactly why a page is or isn't crawlable.

Robots.txt Content

Test Parameters

Enter the path without the domain (e.g. /blog/post)
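The kind of check described above can be sketched with Python's standard-library `urllib.robotparser`. The robots.txt rules, user-agent names, and paths below are illustrative examples, not taken from this page:

```python
# Minimal sketch of a robots.txt check using Python's stdlib parser.
# The rules and agents here are made-up examples for illustration.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch(user_agent, url) accepts a bare path, matching the tool's
# "enter the path without the domain" convention (e.g. /blog/post).
print(parser.can_fetch("Googlebot", "/blog/post"))      # True: no Googlebot rule matches
print(parser.can_fetch("Googlebot", "/private/data"))   # False: blocked by Disallow: /private/
print(parser.can_fetch("SomeOtherBot", "/blog/post"))   # False: falls to the catch-all * group
```

A crawler first picks the group whose `User-agent` line best matches its name (falling back to `*`), then evaluates that group's `Allow`/`Disallow` rules against the path, which is why the same path can be crawlable for one bot and blocked for another.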