Re: discounting… Given that OpenAI is burning billions while making trivial revenue in comparison, the cost per token is probably going to skyrocket when Sam runs out of BS to con the next investor. I’m guessing the only way token cost doesn’t explode is if Claude ends up in Amazon’s hands and OpenAI in Microsoft’s. Then Amazon, Google, and MS can subsidize if they want. But as standalone businesses, they can’t make it at current token prices. IMHO
There are usage limits, but the argument is that unless you're writing and modifying large swathes of code in YOLO mode, you don't hit them, at least for what I would call small and tedious tasks. I'm thinking "write a docstring", "add type annotations", "write a single unit test for this case", "fill in this function". With a good prompt these are often solved in <10 interactions, especially when combined with scoped rules that are pulled in on demand to guide the output.
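To put a size on "small and tedious", here's a made-up sketch of the kind of one-prompt result I mean (the mean() function and the test are hypothetical, not from any real codebase): "add type annotations and a docstring to this function, then write one unit test for the empty-input case".

    from typing import Sequence

    def mean(values: Sequence[float]) -> float:
        """Return the arithmetic mean of values.

        Raises ValueError if values is empty.
        """
        if not values:
            raise ValueError("mean() of an empty sequence")
        return sum(values) / len(values)

    def test_mean_empty_sequence_raises():
        # the single "unit test for this case" kind of ask
        # (assumes pytest is available in the project)
        import pytest
        with pytest.raises(ValueError):
            mean([])

That whole exchange is a handful of tokens in and out, which is why it stays well under the limits compared to letting an agent rewrite half the repo.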