X-Robots-Tag: When Search Engines Are Told Not to Index
BearAudit warns when a page sends noindex, nofollow, or similar directives. Learn what they mean, when they're intentional, and when to fix them. Filter by x_robots_tag to review.
If a page sends an X-Robots-Tag HTTP header (or the equivalent robots meta tag) with directives like noindex or nofollow, BearAudit reports a warning. That doesn't mean it's always wrong—but you want to be sure it's intentional.
What we check
We look at the X-Robots-Tag HTTP header. If it contains any of noindex, none, nofollow, noarchive, nosnippet, or noimageindex, we flag the page. What the directives mean:
- noindex — Ask search engines not to show the page in results.
- nofollow — Ask them not to follow links on the page (and sometimes not to pass link equity).
- none — Equivalent to noindex + nofollow.
The remaining directives (noarchive, nosnippet, noimageindex) control whether the page is cached, shown with a snippet, or has its images indexed.
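The check above can be sketched as a small parser. This is a simplified illustration, not BearAudit's actual implementation; the handling of user-agent prefixes (e.g. "googlebot: noindex") is an assumption about common header forms:

```python
# Directives that trigger a flag, per the list above.
BLOCKING_DIRECTIVES = {"noindex", "none", "nofollow",
                       "noarchive", "nosnippet", "noimageindex"}

def flagged_directives(header_value: str) -> set[str]:
    """Return the blocking directives found in an X-Robots-Tag value.

    Handles comma-separated lists and optional user-agent prefixes
    such as "googlebot: noindex".
    """
    found = set()
    for part in header_value.split(","):
        # Drop any "useragent:" prefix, then normalize.
        directive = part.split(":")[-1].strip().lower()
        if directive in BLOCKING_DIRECTIVES:
            found.add(directive)
    return found
```

For example, `flagged_directives("noindex, nofollow")` returns both directives, while a value like `max-snippet:50` is not flagged.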
When it's intentional
- Staging / admin / thank-you pages — You may want these out of the index. No change needed.
- Duplicate or low-value pages — Sometimes noindex is the right choice instead of a canonical tag or removal.
When to fix
- Public content that should rank — If a normal article or product page is sending noindex (e.g. via a global header or misconfigured server), fix it so the page can be indexed.
- Accidental nofollow — If you meant to allow following, remove or adjust the directive.
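As a concrete example of how a global header arises, a server-wide directive can attach noindex to every response. This hypothetical nginx snippet illustrates the pattern and the fix (paths and values are placeholders, not a recommended configuration):

```nginx
# Misconfiguration: the header is added at the server level,
# so every page—including public content that should rank—
# is served with noindex.
server {
    add_header X-Robots-Tag "noindex, nofollow";
}

# Fix: scope the header to the paths you actually want excluded.
server {
    location /admin/ {
        add_header X-Robots-Tag "noindex";
    }
}
```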
Filter by x_robots_tag in BearAudit to see every page we flagged. Review each URL and either fix the header or confirm the directive is correct.
For a short definition, see What is X-Robots-Tag? in our glossary.