X-Robots-Tag Header Contains Noindex on Primary URL
The primary URL returns an X-Robots-Tag HTTP header with a noindex directive. This tells search engines not to index the page, even if the HTML meta robots tag allows indexing.
Use this article when
- You need deeper remediation guidance than the issue card can show.
- You want CMS-specific steps before handing the fix to a developer.
- You want a repeatable re-check path after shipping the change.
What this issue is
The page's HTTP response includes an X-Robots-Tag header with a noindex directive. Search engines treat this header the same way they treat a robots meta tag, and when the two conflict they apply the more restrictive directive, so the page is kept out of the index even if the HTML meta robots tag allows indexing.
Why it matters
A noindex directive in the X-Robots-Tag header keeps the page out of the index entirely: search engines may still crawl the URL, but they will not show it for any query. Because the header overrides a permissive meta robots tag, the page receives no organic search traffic until the header is corrected.
How we detect it
- FreeSiteAudit flags this issue when the rule for CRAWL-XROBOTS-NOINDEX-001 fails and the page evidence points to HTTP headers.
- You can usually confirm this by inspecting the HTTP response headers directly (for example in your browser's Network tab, or with curl -I); because this is a header-level directive, it will not appear in the page's HTML source.
Evidence examples
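A flagged response typically looks like the following (illustrative status line and values; only the X-Robots-Tag header is the actual evidence):

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
X-Robots-Tag: noindex
```

The directive may also appear combined with others, e.g. X-Robots-Tag: noindex, nofollow.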
How to fix it
1. Remove the X-Robots-Tag: noindex header from the server configuration.
2. Check CDN, reverse proxy, or hosting platform settings for injected headers.
3. Coordinate with your hosting provider if you cannot modify response headers directly.
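Where the header is set depends on your stack. As illustrative examples only (your file paths, site names, and exact directives will differ), the line to remove or comment out often looks like one of these in common server configs:

```apache
# Apache (httpd.conf, a vhost file, or .htaccess)
Header set X-Robots-Tag "noindex, nofollow"
```

```nginx
# nginx (inside a server or location block)
add_header X-Robots-Tag "noindex";
```

If neither appears in your own config, the header is likely injected further up the chain (CDN edge rules, a reverse proxy, or the hosting platform's staging/privacy setting), per steps 2 and 3 above.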
How to re-check it
- Use curl -I to inspect response headers and confirm X-Robots-Tag no longer contains noindex
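The curl check can also be scripted. Below is a minimal Python sketch (the function names has_noindex and check_url are illustrative, not part of FreeSiteAudit) that fetches a URL's response headers and reports whether X-Robots-Tag still carries noindex. Header names are matched case-insensitively, as HTTP requires.

```python
from urllib.request import Request, urlopen

def has_noindex(headers):
    """Return True if any X-Robots-Tag header value contains a noindex directive.

    `headers` is a list of (name, value) tuples, as returned by
    HTTPResponse.getheaders().
    """
    values = [v for k, v in headers if k.lower() == "x-robots-tag"]
    # A single header value may carry several comma-separated directives,
    # e.g. "noindex, nofollow".
    directives = [d.strip().lower() for v in values for d in v.split(",")]
    return "noindex" in directives

def check_url(url):
    # A HEAD request is enough: only the response headers are needed.
    resp = urlopen(Request(url, method="HEAD"))
    return has_noindex(resp.getheaders())
```

After shipping the fix, check_url("https://example.com/page") should return False; re-run it against a few representative URLs before triggering a full re-crawl.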
Related tools
This issue is best verified with the full FreeSiteAudit crawl rather than a single-point mini tool.