Squarespace fix guide

Page Blocked by robots.txt on Squarespace

Your robots.txt file prevents Googlebot from crawling this page.

Issue ID: INDEX-BLOCKED-001 · Severity: Critical · Effort: S

Why this matters

When robots.txt disallows a URL, Googlebot never fetches its content. The page can still appear in search results, but without a usable title or description, and any on-page directives (such as noindex) go unseen because Google never reads the HTML. Until the block is lifted, search engines cannot understand the page or present it well in results.
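You can check the rules yourself before touching any settings. Python's standard-library `urllib.robotparser` evaluates a robots.txt body against a user agent and URL; the rules and URLs below are illustrative stand-ins, not your site's actual file:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt body -- substitute the contents served at
# your own /robots.txt (the paths here are hypothetical).
ROBOTS_TXT = """\
User-agent: *
Disallow: /config/
Disallow: /blocked-page
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot falls back to the "*" group here, so the disallowed
# path is blocked while other paths remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/blocked-page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))         # True
```

If `can_fetch` returns False for the affected URL, the block really is coming from robots.txt rather than from a meta tag or header.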


How to fix this on Squarespace

Review page visibility and indexing settings.

  1. Confirm the page is not hidden or password protected.
  2. Check any header code injection that adds a noindex directive.
  3. Validate the sitemap and crawlability after publishing.
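For step 2, a noindex directive often arrives as a robots meta tag pasted into header code injection. A minimal sketch using Python's standard-library `html.parser` to flag it; the sample HTML below is hypothetical:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags <meta name="robots"> or <meta name="googlebot"> tags
    whose content attribute contains "noindex"."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = {k.lower(): (v or "") for k, v in attrs}
        if attr.get("name", "").lower() in ("robots", "googlebot") \
                and "noindex" in attr.get("content", "").lower():
            self.noindex = True

# Hypothetical page head with an injected noindex directive.
html = '<head><meta name="robots" content="noindex, nofollow"></head>'
detector = NoindexDetector()
detector.feed(html)
print(detector.noindex)  # True
```

Run it against the page's rendered source (View Source in a browser); if it prints True, remove the offending tag from the injection settings.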

After you fix it

  • Test the URL with Google Search Console (the robots.txt report and the URL Inspection tool)
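As an extra check, you can confirm the fixed URL is listed in your XML sitemap (Squarespace serves one at /sitemap.xml). A sketch with Python's standard-library `xml.etree.ElementTree`; the sitemap snippet below is illustrative, not your real file:

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical sitemap body -- fetch yours from /sitemap.xml instead.
SITEMAP = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/fixed-page</loc></url>
</urlset>
"""

print("https://example.com/fixed-page" in sitemap_urls(SITEMAP))  # True
```

If the URL is missing from the sitemap, the page is likely still hidden or excluded from search in its settings.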

See every Squarespace issue on your site

This is just one of the many issues the full audit checks. Run a free scan to see what else might be holding your site back.

Run Free Audit