Revert "add robots.txt to specify doc versions to appear in search engines" #4584
Reverts #3291, which added our own robots.txt. I discovered that Read the Docs will automatically generate an appropriate robots.txt based on which versions are activated and/or hidden, so we can manage the visible versions from Read the Docs. The generated robots.txt file is at https://circuitpython.readthedocs.io/en/robots.txt. See https://docs.readthedocs.io/en/stable/versions.html#states.
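As a sketch of what that generated file accomplishes, Python's standard `urllib.robotparser` can show how crawlers interpret the Disallow rules. The version paths below are hypothetical examples, not the actual generated contents:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the kind Read the Docs generates:
# hidden versions become Disallow rules; visible ones stay crawlable.
robots_txt = """\
User-agent: *
Disallow: /en/4.1.x/
Disallow: /en/5.3.x/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A hidden version is blocked for all crawlers...
print(rp.can_fetch("*", "https://circuitpython.readthedocs.io/en/5.3.x/"))  # False
# ...while a visible one is allowed.
print(rp.can_fetch("*", "https://circuitpython.readthedocs.io/en/6.2.x/"))  # True
```

So search engines that honor robots.txt will only index the versions left visible on Read the Docs.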
To manage visible versions, we use https://readthedocs.org/projects/circuitpython/versions/ and edit the individual versions. I just added 6.2.x, removed previous 6.x.x versions, and marked older versions as "hidden". Click the Edit buttons to change the Active/Hidden state. Example: (screenshot of the version list's Edit dialog)