chore: Add noindex meta tag to v8 doc pages
#914
Merged
Closes #910
Re: v8 docs being indexed (often ranking higher than v10).

The meta tag is the correct approach here, rather than, say, a `Disallow` rule in `robots.txt`, because `Disallow` only blocks the crawler from discovering the page while navigating our site. While that often does have an effect, external links to the v8 docs could keep them ranking highly, which would be a problem. What we actually want is to signal that these pages shouldn't be indexed, not to hide them from discovery; hence, the meta tag. https://developers.google.com/search/docs/crawling-indexing/robots/robots-faq?visit_id=637994089979808237-3634207106&rd=1#h17
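For reference, a minimal sketch of the two approaches (the `/v8/` path and page template here are illustrative, not the actual values in this repo):

```html
<!-- What this PR does: add a noindex directive to the <head> of each v8 doc page.
     Crawlers that fetch the page will see it and drop the page from their index. -->
<meta name="robots" content="noindex" />

<!-- The rejected alternative, in robots.txt (shown as a comment for contrast):
     User-agent: *
     Disallow: /v8/
     This only stops crawling; externally linked pages can still be indexed. -->
```

Note the subtlety: for the meta tag to work, the crawler must be *allowed* to fetch the page, so combining it with a `Disallow` rule would actually be counterproductive.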
What I'm unsure of is whether we need to add/remove this tag client-side upon navigation. I haven't been able to get a definitive answer on that; does anyone know?

Edit: After doing some more reading, I don't think that will be an issue.