Closed
Labels
Needed: design decision (a core team decision is required)
Description
What's the problem this feature will solve?
We are trying to prevent search engines from indexing older versions of our documentation. We want those versions to remain reachable, which is why we mark them as "active" and "hidden", but we don't want them to be indexed.
Describe the solution you'd like
For each "hidden" version, Read the Docs should add `<meta name="robots" content="noindex, nofollow" />` to every page.
Alternative solutions
We have followed https://docs.readthedocs.io/en/latest/faq.html#how-can-i-avoid-search-results-having-a-deprecated-version-of-my-docs, but this is not enough: robots.txt cannot be used to prevent indexing, only crawling.
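To illustrate the limitation: a robots.txt like the sketch below (assuming versioned paths such as `/en/1.0/`) only asks crawlers not to fetch those paths; search engines may still index the URLs themselves if other sites link to them, which is exactly the case a `noindex` meta tag handles.

```
# Hypothetical robots.txt for deprecated versions; blocks crawling,
# but the URLs can still appear in search results.
User-agent: *
Disallow: /en/1.0/
Disallow: /en/1.1/
```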
Additional context
We could add the robots meta tag ourselves by adding this to our Documentation/conf.py:

```python
html_context = {
    ...
    'meta_robots': '<meta name="robots" content="noindex, nofollow" />',
}
```

but we would like to avoid this, as it would require us to push commits to every branch we have currently marked as hidden.