If you have ever opened Google Search Console and seen warnings about pages being blocked, you are not alone. These messages can look terrifying — like your site is about to fall off the internet. (It is not. Probably.)
But here is the thing: not all blocked pages are bad. And not every "annotations blocking" warning means something has gone wrong. Understanding exactly what is happening, and why, is the difference between making smart SEO decisions and making panicked ones at 11 pm on a Sunday.
Let us walk through all of it — properly, with real facts, no fluff.
What Are Annotations in Google Search Console?
Google Search Console uses annotations in two distinct ways, and it is worth understanding both before diving into the "blocking" part.
System annotations are notes that Google itself generates. According to Google's official documentation, these automatically appear on your performance charts to flag data processing issues, algorithm changes, or notable events that might explain sudden shifts in your traffic data.
Custom annotations are a feature Google officially launched in November 2025 after months of testing. Google's Search Central Blog describes this as a way to add your own contextual notes directly to performance charts — think of it as a personal changelog for your SEO work. You can right-click on any date in the Performance report and add a note of up to 120 characters.
Custom annotations are useful for tracking things like:
- Site migrations or template changes
- Publishing a major new content cluster
- Hiring an SEO agency or deploying a new plugin
- External events that affected search demand (Black Friday, holidays, product launches)
You can have up to 200 annotations per property, and any annotation older than 500 days gets automatically deleted. You cannot edit them either — so get your note right the first time. (Source: Google Search Console Help)
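Those limits are easy to trip over if you keep a scripted SEO changelog alongside Search Console. As a minimal sketch (Search Console has no public API for annotations, so this is purely a local helper that mirrors the documented constraints before you paste a note into the UI):

```python
MAX_NOTE_LENGTH = 120   # characters per annotation (documented limit)
MAX_ANNOTATIONS = 200   # annotations per property (documented limit)

def validate_annotation(note: str, existing_count: int) -> str:
    """Raise ValueError if a note would exceed Search Console's limits.

    Returns the stripped note so it can be stored in a local changelog.
    """
    note = note.strip()
    if not note:
        raise ValueError("annotation text is empty")
    if len(note) > MAX_NOTE_LENGTH:
        raise ValueError(
            f"annotation is {len(note)} chars; limit is {MAX_NOTE_LENGTH}")
    if existing_count >= MAX_ANNOTATIONS:
        raise ValueError(
            f"property already has {MAX_ANNOTATIONS} annotations")
    return note
```

Because annotations cannot be edited once saved, a pre-flight check like this is also a reasonable place to enforce your own naming conventions.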
So What Does "Annotations Blocking" Actually Mean?
This phrase shows up in a couple of different contexts, and it helps to separate them clearly.
1. Pages Blocked From Crawling (Shown as System Annotations)
When Google cannot crawl or index a page on your site, Search Console logs this. These blocked states often appear as system annotations or status flags in the Page Indexing report. Common causes include:
| Block Type | What It Means | Should You Fix It? |
|---|---|---|
| Blocked by robots.txt | Your robots.txt file is telling Google not to crawl this URL | Only if it should be public |
| Noindex tag | A meta tag on the page tells Google not to index it | Only if you want it indexed |
| 401 Unauthorized | Page requires login; Googlebot is locked out | Yes, if it should be public |
| 403 Forbidden | Server is actively refusing Googlebot access | Yes, if it should be public |
| 4xx Other errors | Various client-side errors blocking the crawl | Yes, for important pages |
| 5xx Server errors | Server-side problems preventing access | Contact your host |
Source: Google Search Console Help & SEOTesting
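If you export blocked URLs together with their HTTP status codes (for example from your server logs or a crawler), a small helper can map each code to the likely action from the table above. This is a sketch of the triage logic only, not anything Search Console itself runs:

```python
def blocking_advice(status: int) -> str:
    """Map an HTTP status code to the likely fix, per the table above."""
    if status == 401:
        return "requires login: make the page public or remove it from the sitemap"
    if status == 403:
        return "server refusing Googlebot: check firewall/WAF rules"
    if 400 <= status < 500:
        return "client-side error: fix if the page matters"
    if 500 <= status < 600:
        return "server error: contact your host"
    return "not a blocking status"
```

Note that robots.txt blocks and noindex tags do not show up as status codes at all; they come from the file and the page markup respectively, so they need separate checks.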
Here is the nuance people often miss: blocked does not always equal broken. If you have deliberately blocked your admin panel, checkout pages, or staging URLs from Google, that is exactly what you want. The annotation is just Google confirming it respects your rules.
The problem only appears when pages you want to rank are getting blocked by accident.
2. Annotations Blocking in the Context of Data Access
In some setups — particularly larger teams, agencies, or multi-property accounts — certain users may have restricted access levels. According to Google's documentation, restricted users can view annotations but cannot add or delete them. Full users and owners have complete control.
If someone on your team says they cannot add annotations, they likely have restricted user access — not a technical problem. This is solved by adjusting their permissions in the Search Console settings.
Indexed But Blocked by robots.txt — The Confusing One
There is a Search Console status that genuinely baffles people: "Indexed, though blocked by robots.txt." How can a page be indexed if it was blocked?
The answer is simpler than it sounds. Google may have seen links pointing to that URL from other pages — internal links, external backlinks, or a sitemap entry — and indexed the URL based on that reference alone, without actually crawling the page content. SEOTesting explains that this commonly happened on older WordPress sites before version 5.3, where the "discourage search engines" setting would create this exact conflict.
So if you see this status, ask yourself two questions:
- Do I want this page indexed? If yes, remove the robots.txt block and request indexing.
- Do I want it hidden? Use a noindex meta tag instead — robots.txt alone is not reliable for deindexing.
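To confirm whether a page actually carries a noindex directive, you can parse its HTML for a robots meta tag. Here is a minimal standard-library sketch; note it checks the meta tag only, not the X-Robots-Tag HTTP header, which can also carry noindex:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if (a.get("name") or "").lower() == "robots":
                content = a.get("content") or ""
                self.directives.extend(
                    d.strip().lower() for d in content.split(","))

def has_noindex(html: str) -> bool:
    """True if the page markup contains a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

Remember the interaction described above: if the URL is blocked in robots.txt, Google never fetches the page, so it never sees the noindex tag. Remove the robots.txt block first if your goal is deindexing.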
For more on how Google handles technical configurations, explore our technology section on BigWriteHook for further reading.
How Annotations Help You Understand Blocking Events
Here is where the two concepts genuinely connect. Custom annotations in Search Console are not just a digital sticky note — they are a tool for understanding why your traffic or indexing status changed.
Imagine this: you look at your performance chart and notice clicks dropped sharply on 14 February. You panic. You post in an SEO forum. But if you had added an annotation on that date that said "Migrated to new CMS, robots.txt temporarily blocking all URLs," you would know immediately what happened.
Search Engine Journal makes this point well: annotations create a shared change log inside the performance reports you already use. For teams or agencies managing multiple sites, this replaces the need for separate spreadsheets or project management tools to track SEO events.
Real-world use cases for annotations include:
- Marking the date a robots.txt block was removed or applied
- Logging when a noindex tag was added to a section of your site
- Recording a Google Core Update date (Google marks some, but not all)
- Noting when a 403 fix was deployed to unblock Googlebot
- Flagging a CDN or hosting change that may have introduced server errors
"Data without context is often worthless. With this update, Google finally gives us the opportunity to write our 'SEO history' directly within the data tool." — Christian Ott, SEO Kreativ
Common Causes of Accidental Annotations Blocking
Most blocking issues in Search Console are accidental. Here are the situations that trip people up most often.
robots.txt Mistakes
A single misplaced disallow rule can block your entire site from Google. This happens more than you'd think — especially after a CMS migration or when a developer adds a global disallow during a staging phase and forgets to revert it before launch.
The fix is to check your robots.txt file at yourdomain.com/robots.txt and look for rules that are too broad, like Disallow: /. In Search Console, the robots.txt report (which replaced the older robots.txt Tester) shows which version of the file Google last fetched and any parsing problems, and the URL Inspection Tool tells you whether a specific URL is blocked by your current rules.
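You can also test rules offline with Python's standard-library robots.txt parser. The rules and URLs below are invented for illustration; be aware that `urllib.robotparser` follows the original robots.txt convention and may disagree with Googlebot on edge cases such as wildcard patterns:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, as they might appear at yourdomain.com/robots.txt
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A blog post is crawlable; anything under /checkout/ is not.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))
print(rp.can_fetch("Googlebot", "https://example.com/checkout/thanks"))
```

A quick loop over your most important URLs with `can_fetch` is a cheap sanity check after any robots.txt change, before Google recrawls the file.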
WordPress "Discourage Search Engines" Setting
WordPress has a setting under Settings → Reading called "Discourage search engines from indexing this site." It was designed for staging environments, but developers sometimes launch a live site with it still ticked. In WordPress versions before 5.3 it added a blocking rule to robots.txt; since 5.3 it outputs a noindex directive instead. Either way, Google is kept out and traffic collapses. The fix is one checkbox. The drama is enormous.
Security Plugins and Firewall Rules Blocking Googlebot
Some security plugins and WAF (Web Application Firewall) configurations block certain IP ranges or user agents. Googlebot can get caught in these filters. SEOTesting notes that servers can allow normal users through while blocking bots like Googlebot — making this issue invisible to the site owner in regular browsing.
Check your firewall logs if you see a spike in 403 Forbidden errors in Search Console.
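Before loosening a firewall rule for a user agent claiming to be Googlebot, verify it really is Google. Google's documented method is a reverse DNS lookup on the requesting IP, a check that the hostname ends in googlebot.com or google.com, then a forward lookup to confirm the hostname resolves back to the same IP. The DNS calls need a live network, so this sketch separates out the hostname check, which is the part you can test offline:

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(hostname: str) -> bool:
    """Check a reverse-DNS hostname against Google's documented suffixes."""
    return hostname.rstrip(".").lower().endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Full reverse-then-forward DNS verification (needs network access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse lookup
        if not is_google_hostname(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]    # forward confirm
    except (socket.herror, socket.gaierror):
        return False
```

The suffix check alone is not enough, because anyone can register a lookalike domain; the forward-confirmation step is what makes the verification trustworthy.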
Password-Protected Pages Listed in Sitemaps
If you have added a URL to your sitemap that requires authentication to access, Googlebot will try to crawl it, get a 401 response, and flag it. The solution is either to make the page public, or to remove it from your sitemap entirely.
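If you suspect private URLs have crept into your sitemap, you can scan it for paths that should never be submitted. A sketch using the standard-library XML parser; the private path prefixes here are invented for illustration and would need to match your own site's structure:

```python
from urllib.parse import urlparse
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def find_private_urls(sitemap_xml: str, private_prefixes: tuple) -> list:
    """Return sitemap URLs whose path starts with a known-private prefix."""
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
    return [u for u in urls if urlparse(u).path.startswith(private_prefixes)]
```

Any URL this flags should either be made public or dropped from the sitemap, so Googlebot stops hitting 401s on it.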
How to Check Blocked Pages in Google Search Console
Finding blocked pages is straightforward once you know where to look.
- Log in to Google Search Console
- On the left sidebar, click Pages (under the Indexing section)
- You will see a breakdown of why pages are not indexed — including all blocking reasons
- Click any category to see the specific URLs affected
- Use the URL Inspection Tool to test individual pages and see exactly what Googlebot encounters
For adding or viewing annotations, navigate to the Performance report, right-click on any date on the chart, and select "Add annotation." It is quick once you find it.
Does Annotations Blocking Affect SEO Rankings?
This is the question everyone actually wants answered.
Annotations themselves — the notes you write in Search Console — have zero effect on your rankings. Google is very clear on this. They are internal documentation tools. Writing "fixed broken links" on a date does not make your links rank better. Sadly.
However, the underlying issues that annotations often document absolutely do affect rankings. A page blocked by robots.txt cannot be crawled and therefore cannot rank. A 403 error means Googlebot cannot access your content. A noindex tag removes a page from Google's index entirely.
So the answer is: annotations themselves are neutral. The events they track can have significant SEO impact.
For a deeper dive into how technical SEO affects your site's performance, check out our technology articles and business insights sections here at BigWriteHook.
Quick Summary Table: Annotations Blocking At a Glance
| Term | What It Refers To | Action Needed? |
|---|---|---|
| System annotations | Auto-generated Google notes on data processing issues | Review; no manual action usually |
| Custom annotations | Your own notes added to performance charts | Add them proactively |
| Blocked by robots.txt | Googlebot told not to crawl a URL | Fix if page should be public |
| Noindex tag present | Page excluded from Google index via meta tag | Fix if page should be indexed |
| 403 / 401 error | Server refusing Googlebot access | Fix immediately for public pages |
| Restricted user access | User cannot add/delete annotations | Upgrade their permissions |
Frequently Asked Questions
Can I edit an annotation after I have added it?
No. Google does not currently support editing annotations. You can delete an existing one and add a new one. Annotations older than 500 days are also automatically removed.
How many annotations can I add?
You can add up to 200 annotations per Search Console property. Each note has a 120-character limit, so keep them brief and clear.
Will my rankings recover as soon as I remove a block?
Not immediately. After you remove the block, you need to request re-indexing through the URL Inspection Tool. Google also needs to recrawl and process your pages, which can take days to weeks depending on your site's crawl budget and authority.
Can other people on my team see my annotations?
Yes. Annotations are shared across the property. Anyone with owner or full user access can add, view, or delete them. Restricted users can view but not add or delete annotations.
Is "blocked by robots.txt" the same as a 403 error?
No. A robots.txt block tells Googlebot not to visit a URL. A 403 error means Googlebot tried to visit the URL and the server refused access. Both result in pages not being indexed, but they need different fixes.
Final Thoughts
Annotations blocking is not one thing — it is a shorthand for several related situations involving Google Search Console, page crawlability, and how you document your SEO work.
The most important takeaway is this: most blocking is either intentional (and working correctly) or an accident that is easy to fix once you know where to look. The new custom annotations feature Google rolled out in late 2025 gives you a proper, built-in way to keep track of what changed and when — which is genuinely useful if you use it consistently.
Think of it less like a warning system and more like a diary for your website. The blocks are just entries that say "on this date, Google couldn't get in." Your job is to decide whether the door should be open or closed — and to make sure it is the one you intended.
For more helpful reads on SEO, technology, and digital tools, explore our Technology section or get in touch with us at BigWriteHook.
Sources: Google Search Console Help — Annotations | Google Search Central Blog — Custom Chart Annotations | Search Engine Land | Search Engine Journal | SEOTesting