Commit e3973234 authored by Stephan Großberndt's avatar Stephan Großberndt

[TASK] Improve robots.txt rules

* Deny DotBot and SemrushBot
* Disallow several directories
* Change Crawl-Delay to 30 seconds
parent c0184313
@@ -74,11 +74,19 @@ Disallow: /
 User-agent: Nutch
 Disallow: /
+# bots found on review.typo3.org
+User-agent: DotBot
+Disallow: /
+User-agent: SemrushBot
+Disallow: /
 # All allowed bots are asked to skip /changes as these return JSON content only, impact performance and are of no value to search engines
 User-agent: *
 Disallow: /changes/
+Disallow: /Documentation/
+Disallow: /login/
+Disallow: /plugins/
-# All bots are asked to pause at least 5 seconds between two visits
-Crawl-Delay: 5
+# All bots are asked to pause at least 30 seconds between two visits
+Crawl-Delay: 30
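
The effect of rules like these can be sanity-checked with Python's standard-library robots.txt parser. This is a minimal sketch, not part of the commit; the rule excerpt below is an assumption reconstructed from this diff, and the bot names used in the checks are illustrative.

```python
from urllib.robotparser import RobotFileParser

# Assumed excerpt of the robots.txt produced by this commit (reconstructed from the diff).
rules = """\
User-agent: DotBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: *
Disallow: /changes/
Disallow: /Documentation/
Disallow: /login/
Disallow: /plugins/
Crawl-Delay: 30
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("DotBot", "/"))                # DotBot is denied everywhere
print(rp.can_fetch("Googlebot", "/changes/123"))  # /changes/ is disallowed for all bots
print(rp.can_fetch("Googlebot", "/"))             # other pages remain crawlable
print(rp.crawl_delay("Googlebot"))                # 30-second delay from the * group
```

Note that `Crawl-Delay` is a de-facto extension honored by some crawlers rather than part of the original robots.txt convention, which is why well-behaved bots that ignore it are also throttled server-side in practice.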