On the surface, that sounds like a reasonable position to take. ("Cowley proposes an alternative: that language acquisition involves culturally determined language skills, apprehended by a biologically determined faculty that responds to them. In other words, he proposes that each extreme is right in what it affirms, but wrong in what it denies. Both cultural diversity of language, and a learning instinct, can be affirmed; neither need be denied.")
GPT's ability to fool intelligent people into thinking that it is itself "intelligent" seems like a powerful argument that language, more than anything else, is what makes humans capable of higher thought. Language is all GPT has. (Well, that and a huge-ass cultural database.)
Intelligence is one of those areas in which, once you fake it well enough, you've effectively made it. Another 10x in scale would be enough to tie the game against an average human player.
There's a really easy, yet unconscionably horrible experiment we could perform to test the assumption that we're preprogrammed with any sort of knowledge.
Take a baby and stick it in a room. Let it grow up with absolutely no stimulation whatsoever. It is given food, and that's about it. What do you think it can demonstrate knowledge of by the time it reaches 5? 10? 15?
All behavior is learned behavior. People talk about sucking and breathing and newborn horses walking and whatnot, but babies do have to learn how to latch and how to feed. Granted, they can work it out themselves, but quick acquisition of a skill does not mean the skill already existed.
Not to mention it's a far cry from sucking to language. Or knowing what a person is. Or who a person is.