I’m not framing it as an absolute; I’m generalising. I accept that edge cases exist, but my point stands for the majority of use cases.
Also, you shouldn’t really be using robots.txt any more unless it’s a blanket “Disallow: /”, because ironically bad actors use it to decide which URIs to hit. If you have content up that you want to limit access to, you’re much better off putting it behind an auth layer (even if it’s just simple HTTP auth + fail2ban).