We had our non-profit website's bandwidth drained and the site temporarily closed (!!) by our hosting provider because the Amazon bot was aggressively crawling URLs like ?page=21454 and so on.
Thankfully, Siteground restored our site without any repercussions, as it was not our fault. We added the Amazon bot to robots.txt after that one.
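For anyone in the same spot, a minimal robots.txt entry along these lines should do it (assuming the crawler identifies itself as Amazonbot, which is the user agent Amazon publishes for its crawler; adjust if your logs show a different one):

    User-agent: Amazonbot
    Disallow: /

Of course this only helps with bots that actually honor robots.txt.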
I don't like how things are right now. Is a tarpit the solution? Or better laws? Would they stop the Chinese bots? Should they even? I don't know.
> We had our non-profit website's bandwidth drained
There are a number of sites having issues with scrapers (AI and otherwise) generating so much traffic that their transit providers are warning them their fees will go up at the next contract renewal if the traffic isn't reduced. It's just very hard for individual sites to do much about it, as most of the traffic comes from AWS, GCP, or Azure IP ranges.
I think that's a terrible idea, especially with ISP monopolies that love gouging their customers. They have a demonstrable history of markups well beyond their costs.
And I hope you're pricing this highly. I don't know about you, but I would absolutely notice $0.03 per site on my bill just from my own human browsing.
In fact, I feel like this strategy would further put the Internet in the hands of the aggregators, since theirs becomes the one site you know you can get information from. Long term, that cost is a rounding error for them as people get funneled to their AI, because an aggregator membership ends up cheaper than paying to access the rest of the web.