Making it harder for bots usually drives up their operating cost. If they need to run a headless browser to get around the anti-bot measures, a request might take, say, 1.5 seconds instead of the 0.1 seconds it would take without them in place.
On top of that 1.5 seconds, running a browser carries a much larger CPU and memory cost than a simple direct HTTP request, which is near negligible.
So while you'll never truly defeat a sufficiently motivated actor, you may be able to drive their costs up high enough that entering the space, or turning a profit, becomes difficult.
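As a rough sketch of that cost asymmetry, here is a back-of-envelope model using the latency figures above. The per-CPU-second price and the CPU-utilization fractions are illustrative assumptions, not measurements; the point is only that the multiplier is large.

```python
# Back-of-envelope cost model for direct vs. headless-browser scraping.
# Latencies come from the discussion above; prices/utilization are assumed.

DIRECT_LATENCY_S = 0.1    # plain HTTP request
HEADLESS_LATENCY_S = 1.5  # request driven through a headless browser

# Assumed compute price per CPU-second; any cloud rate works for the ratio.
COST_PER_CPU_SECOND = 0.00005

# Assume a headless browser keeps a core ~90% busy for the request,
# while a direct HTTP request uses ~5% of a core.
direct_cpu_s = DIRECT_LATENCY_S * 0.05
headless_cpu_s = HEADLESS_LATENCY_S * 0.9


def cost_per_million(cpu_seconds_per_request: float) -> float:
    """Compute cost of one million requests at the assumed CPU price."""
    return cpu_seconds_per_request * COST_PER_CPU_SECOND * 1_000_000


direct = cost_per_million(direct_cpu_s)      # $0.25 per 1M requests
headless = cost_per_million(headless_cpu_s)  # $67.50 per 1M requests
multiplier = headless / direct               # 270x

print(f"direct:   ${direct:,.2f} per 1M requests")
print(f"headless: ${headless:,.2f} per 1M requests")
print(f"multiplier: {multiplier:.0f}x")
```

Even if the assumed numbers are off by a lot, the shape of the result holds: forcing bots into a full browser multiplies their per-request cost by orders of magnitude, which is exactly the "raise the cost of entry" effect described above.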
I understand the argument. You can't have a perfect defense, and speed bumps are quite effective. I'm not trying to disagree with that.
But this solution does not seem effective at mitigating bots. Given how prolific they are, presumably the bots are taking a different route, which warrants another solution. And if they are taking this route, then it clearly isn't effective either, which also warrants another solution.
This obfuscation seems to require a fair amount of work, especially since you need to update the code frequently to rescramble it. The added complexity also increases the risk of bugs and vulnerabilities, which ultimately undermines the whole endeavor.
I'm trying to understand why this level of effort is worth the cost (other than for nefarious reasons, which are rather obvious).