Rollup merge of #69992 - kornelski:robots, r=steveklabnik
Block version-specific docs from search engines

Latest stable, beta and nightly URLs remain accessible because their URLs don't start with a version number. robots.txt uses simple path-prefix matching, so it's OK that the disallow rules aren't full directory paths.

Direct links to old docs remain accessible to users, because robots.txt only affects crawlers.

With this change, docs for specific old versions of Rust won't pop up in search results. This is good, because users won't be getting obsolete documentation by accident.
commit d34ec3309f
@@ -1,21 +1,6 @@
 User-agent: *
-Disallow: /0.3/
-Disallow: /0.4/
-Disallow: /0.5/
-Disallow: /0.6/
-Disallow: /0.7/
-Disallow: /0.8/
-Disallow: /0.9/
-Disallow: /0.10/
-Disallow: /0.11.0/
-Disallow: /0.12.0/
-Disallow: /1.0.0-alpha/
-Disallow: /1.0.0-alpha.2/
-Disallow: /1.0.0-beta/
-Disallow: /1.0.0-beta.2/
-Disallow: /1.0.0-beta.3/
-Disallow: /1.0.0-beta.4/
-Disallow: /1.0.0-beta.5/
+Disallow: /1.
+Disallow: /0.
 Disallow: /book/first-edition/
 Disallow: /book/second-edition/
 Disallow: /stable/book/first-edition/
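The commit message's claim rests on robots.txt's simple prefix matching: `Disallow: /1.` blocks any path beginning with `/1.`, while `/stable/` and `/nightly/` match no rule. A minimal sketch of that behavior, using Python's stdlib robots.txt parser (the host URL is illustrative):

```python
from urllib.robotparser import RobotFileParser

# The new robots.txt rules from this change.
robots = """\
User-agent: *
Disallow: /1.
Disallow: /0.
Disallow: /book/first-edition/
Disallow: /book/second-edition/
Disallow: /stable/book/first-edition/
"""

rp = RobotFileParser()
rp.parse(robots.splitlines())

base = "https://doc.rust-lang.org"

# Version-prefixed docs are blocked by the "/1." prefix rule...
print(rp.can_fetch("*", base + "/1.0.0/std/"))   # False
# ...while the channel URLs stay crawlable.
print(rp.can_fetch("*", base + "/stable/std/"))  # True
print(rp.can_fetch("*", base + "/nightly/std/")) # True
```

Because the rules are bare prefixes rather than full directory paths, two lines cover every past and future versioned URL without listing each release.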