Update robots.txt to disallow all web crawlers
@@ -1,3 +1,3 @@
-# allow crawling everything by default
+# disallow all
 User-agent: *
-Disallow:
+Disallow: /
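For reference, a minimal sketch of how the new rules could be checked, assuming Python's standard urllib.robotparser and a placeholder example.com host (not part of this commit):

from urllib import robotparser

# Parse the updated robots.txt content directly (the three lines
# introduced by this commit) instead of fetching it over HTTP.
rp = robotparser.RobotFileParser()
rp.parse([
    "# disallow all",
    "User-agent: *",
    "Disallow: /",
])

# With "Disallow: /" under "User-agent: *", every crawler is blocked
# from every path, so both checks print False.
print(rp.can_fetch("*", "https://example.com/"))
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))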