
So if language comes from the structure of the brain, what would stop us from simulating that structure to give a machine mastery of language? And specifically, what would imply that a machine which had some of that structure would need to learn by interaction, as the top-level comment suggests?



Nothing would stop us from simulating human brain-like (or analogously powerful) structures to build a machine that genuinely understands natural language. I'm arguing, though, that we can't learn those structures by statistical optimization techniques alone.

If it turns out that the easiest, or even the only, means of doing this is by emulating the human brain, then it is entirely possible that we inherit a whole new set of constraints and dependencies, such that world-simulation and an embodied mind are required to make such a system learn. If this turns out not to be the case, and there's some underlying principle of language we can emulate (the classic "airplanes don't fly like birds" argument), then it may be that text is enough. But that rests on a new assumption: that our system came pre-equipped to learn language and didn't manufacture an understanding from whole cloth, i.e. that the model weights were pre-initialized to specific values.


If there is an innate language structure in the brain, then we know it's possible to develop such a structure by statistical optimization, since that's exactly what evolution did, no?




