> Anthropic just came out and yolo'd the context window because they could afford to
I don’t think this is true at all. The reason CC is so good is that it’s very deliberate about what goes into its context. CC often spends ages reading 5-LOC snippets, but afterwards only the relevant stuff is in context.
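Purely as illustration (this is not CC’s actual implementation, just a sketch of the pattern I mean), a grep-then-windowed-read loop looks roughly like this; `find_relevant` and `read_snippet` are hypothetical names:

```python
import re
from pathlib import Path

def read_snippet(path: Path, center: int, radius: int = 5) -> str:
    """Return only a small window of lines around `center` (1-indexed),
    instead of dumping the whole file into context."""
    lines = path.read_text(encoding="utf-8", errors="replace").splitlines()
    lo = max(0, center - 1 - radius)
    hi = min(len(lines), center + radius)
    return "\n".join(f"{i + 1}: {line}" for i, line in enumerate(lines[lo:hi], start=lo))

def find_relevant(root: Path, symbol: str, radius: int = 5):
    """Grep for `symbol` and yield tiny numbered snippets, so only the
    lines that actually matter ever enter the model's context."""
    pattern = re.compile(re.escape(symbol))
    for path in root.rglob("*.py"):
        text = path.read_text(encoding="utf-8", errors="replace")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if pattern.search(line):
                yield f"{path}:{lineno}\n{read_snippet(path, lineno, radius)}"
```

The point is the shape of the loop: many cheap, targeted reads that each add ~10 lines to context, rather than one read that adds 10,000.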
I’ve heard a lot of this context bs parroted all over HN; don’t buy it. If simply increasing context size could solve the problem, Gemini would be the best model at everything.