As far as I can see, your issue with Chomsky has nothing to do with the performance of modern LLMs. You just reject all the data that generative linguists take to be crucially informative as to the grammatical structure of natural languages. You would hold the same view if LLMs had never been invented. So it is really a common case of AI and cognitive science talking entirely at cross purposes.
> English grammar is not fixed per se; it evolves with region (have you ever been to Singapore?) and time.
Sure, but this is not the case for the examples I gave. There aren’t dialects of English where (b) has the interpretation that GPT-4o thinks it can have. It’s no use trying to muddy the empirical waters in the case of completely clear judgments about what English sentences can and can’t mean.