Update robots.txt to disallow all web crawlers
Some checks failed: CI / check (push) failing after 40s; CI / build (push) skipped.

commit 044830f5cb
parent f7becdb26e
Date:   2025-12-24 17:04:32 -06:00


robots.txt
@@ -1,3 +1,3 @@
-# allow crawling everything by default
+# disallow all
 User-agent: *
-Disallow:
+Disallow: /
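
For reference, the hunk spans the entire three-line file, so the complete robots.txt after this commit reads:

# disallow all
User-agent: *
Disallow: /

Note on the semantics: in the Robots Exclusion Protocol, an empty Disallow: value permits crawling of every path, whereas Disallow: / matches all URL paths, so this change blocks all compliant crawlers site-wide.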