
Free Robots.txt Checker

Check if your robots.txt is correctly configured and not blocking important content.


How does your site compare?

95% of well-optimised small business sites have a robots.txt that references their sitemap without blocking key content.

Based on technical SEO audits of small business websites with strong organic performance.
A healthy result means your robots.txt exists, doesn't block important content, and includes a Sitemap directive.

How to fix this

Create or fix your robots.txt file at the domain root with proper directives.

  1. Visit yoursite.com/robots.txt to check the current file.
  2. Ensure it does NOT contain "Disallow: /" (this blocks everything).
  3. Add a Sitemap line: Sitemap: https://yoursite.com/sitemap.xml (a complete example follows this list).
  4. Use Google Search Console's robots.txt report to confirm Google can read the file.
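For reference, a minimal, healthy robots.txt for a typical small business site might look like the sketch below. The domain and sitemap path are placeholders for your own; an empty Disallow line means nothing is blocked, and "#" starts a comment.

  # Allow all crawlers to access everything
  User-agent: *
  Disallow:

  # Point crawlers at the sitemap
  Sitemap: https://yoursite.com/sitemap.xml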

Quick tips by platform

WordPress: Yoast SEO → Tools → File editor → edit robots.txt directly (a typical default WordPress file is shown after these tips).
Wix: Settings → SEO → SEO Tools → Robots.txt Editor.
Shopify: Shopify manages robots.txt automatically. Use the robots.txt.liquid template for customisation.
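If you are on WordPress and have never created a robots.txt, WordPress usually serves an automatically generated (virtual) one. The exact contents vary by version and plugins, but it often looks roughly like the sketch below; newer versions and SEO plugins such as Yoast may also append a Sitemap line.

  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php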

Frequently Asked Questions

What does robots.txt do?
It tells search engine crawlers which parts of your site they can and cannot access. It's a text file at your domain root (e.g., yoursite.com/robots.txt).
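As an illustration (the paths here are hypothetical), rules are grouped under a User-agent line, so different crawlers can be given different instructions:

  # Rules for Google's crawler only
  User-agent: Googlebot
  Disallow: /internal-search/

  # Rules for every other crawler: nothing blocked
  User-agent: *
  Disallow: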
Can robots.txt block my site from Google?
Yes. A misconfigured robots.txt with "Disallow: /" will prevent all search engines from crawling your site.
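The blocking configuration is just these two lines. If they appear in your file unintentionally (staging sites are a common source), deleting the trailing "/" turns the rule back into "block nothing":

  # Blocks every crawler from the entire site
  User-agent: *
  Disallow: /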
Is robots.txt required?
Not strictly, but it's best practice. Without one, search engines will crawl everything, which may include admin pages or duplicate content you'd rather keep out of their crawl.


Want the full picture?

This tool checks one thing. Our full audit runs every check across SEO, speed, security, and more, for free.

Run Full Audit - Free