SEO fix guide

robots.txt Blocks Render-Critical Assets

The robots.txt file disallows crawling of CSS, JavaScript, or other assets required to render the page. This prevents search engines from seeing the page as users do, which can harm indexing and ranking.

Issue ID: CRAWL-ROBOTS-RENDER-001
Severity: major
Impact: High
Effort: M

Use this article when

  • You need deeper remediation guidance than the issue card can show.
  • You want CMS-specific steps before handing the fix to a developer.
  • You want a repeatable re-check path after shipping the change.

What this issue is

When robots.txt disallows the CSS, JavaScript, or other files a page needs to render, Googlebot can fetch the HTML but cannot load those assets. The crawler then renders an incomplete version of the page, missing styling, layout, and any content injected by JavaScript, instead of seeing what users see. This can harm indexing and ranking.

Why it matters

When Google renders a page without its CSS and JavaScript, it may misjudge mobile-friendliness, miss content that is only rendered client-side, and index an incomplete version of the page. This affects how clearly search engines understand the page and how persuasive it looks in search results.

How we detect it

  • FreeSiteAudit flags this issue when the rule for CRAWL-ROBOTS-RENDER-001 fails and the page evidence points to HTTP headers.
  • You can usually confirm it by reviewing your robots.txt file (served at /robots.txt on your domain) and checking which Disallow rules match CSS and JavaScript paths.
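As a quick local check, Python's standard-library `urllib.robotparser` can test whether a given asset URL is blocked for a crawler. This is a minimal sketch with a hypothetical robots.txt and example URL; note that the parser does not implement Google's wildcard (`*`) matching, so treat it as a first-pass check only:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks an entire assets directory
robots_txt = """\
User-agent: *
Disallow: /assets/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A render-critical stylesheet under the blocked path (example URL)
blocked = not rp.can_fetch("Googlebot", "https://example.com/assets/app.css")
print(blocked)  # True: Googlebot may not fetch this stylesheet
```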

Evidence examples

Fetch your robots.txt and look for Disallow rules that match render-critical paths, such as theme or asset directories (for example /assets/ or /wp-includes/) or individual .css and .js files.

How to fix it

  1. Remove Disallow rules that block CSS and JS files needed for rendering.
  2. Allow Googlebot access to all resources required for page rendering.
  3. Use Google Search Console's URL Inspection tool to verify rendering is not blocked.
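For example, rather than unblocking an entire directory, targeted Allow rules can re-open just the render-critical file types. The /wp-includes/ path below is illustrative; Googlebot supports `*` wildcards, and the more specific (longer) rule wins:

```
User-agent: *
Disallow: /wp-includes/
Allow: /wp-includes/*.css
Allow: /wp-includes/*.js
```

Simpler still, if nothing sensitive lives under the blocked path, delete the Disallow line entirely.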

How to re-check it

  • Use Google Search Console URL Inspection's "View Tested Page" option to confirm the rendered page matches the live page.
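To complement the manual URL Inspection check, the same standard-library parser can batch-test your asset URLs against the updated robots.txt. The rules and URLs below are hypothetical, and `urllib.robotparser` does not handle Google's wildcard syntax, so verify the final result in Search Console:

```python
from urllib.robotparser import RobotFileParser

def blocked_assets(robots_txt, asset_urls, agent="Googlebot"):
    """Return the subset of asset_urls that robots_txt blocks for the agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [url for url in asset_urls if not rp.can_fetch(agent, url)]

robots_txt = """\
User-agent: *
Disallow: /assets/js/
"""

print(blocked_assets(robots_txt, [
    "https://example.com/assets/css/theme.css",  # allowed
    "https://example.com/assets/js/app.js",      # still blocked
]))  # → ['https://example.com/assets/js/app.js']
```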

Related tools

This issue is best verified with the full FreeSiteAudit crawl rather than a single-point mini tool.