>Why does it need 100x the dataset? Sentient creatures, including humans, manage to figure stuff out from as little as a single datapoint.
Human brains are not quite blank slates at birth. They're predisposed to interpret and quickly learn from the sort of inputs their ancestors were exposed to. That is to say, the brain, which learns, is also the result of a learning process. If a mad scientist rewired the connections between your senses and your brain so that its inputs were completely scrambled, and then deposited you on an alien planet, it might take your brain several lifetimes to restructure itself enough to interpret this novel input.
Also consider that a human brain able to figure stuff out from as little as a single datapoint has normally been exposed to at least four years of massive and socially "directed" multimodal data patterns.
As many cases of feral children have shown, humans not "trained" in their first years of life will never be able to harness language and therefore will never display human-level intelligence.
> those humans not "trained" in their first years of life will never be able to harness language
I'm not an expert in the field, but I'd always understood this effect was thought to be (probably) due to human "neuroplasticity" (possibly not the correct technical term): only the first years of life are genetically adapted to have some traits necessary for efficient human language development, traits which are not available (or are much harder to access) later in life.
If correct, this has implications for how we structure and train synthetic networks of human-like neurons to produce human-like behaviors. The interesting part, at least to me, is that it doesn't necessarily mean synthetic networks of human-like neurons can never be structured and trained to produce very human-like minds. This raises the fascinating possibility that actual human minds, including all the cool stuff like emotions, qualia, and even "what it feels like to be a human mind", might be emergent phenomena of much simpler systems than some previously imagined. I think this is one of the more uncomfortable ideas that philosophers of mind like Daniel Dennett propose. In short, nascent AI research appears to support the idea that human minds and consciousness may not be so magically unique (or at least AI research hasn't so far disproved the idea).
> If a mad scientist rewired the connections between your senses and your brain so that its inputs were completely scrambled, and then deposited you on an alien planet, it might take your brain several lifetimes to restructure itself enough to interpret this novel input.
Based on anecdotal psychedelic experiences, I believe you.
It's kind of amazing how quickly our brains effectively reboot into this reality from scrambled states. It's so familiar that associating with conscious existence feels like gravity. Like falling in a dream, reality always catches you at the bottom.
What if you woke up tomorrow and nothing made any sense?
>Based on anecdotal psychedelic experiences I believe you.
I've never done it, but I imagine it would be more akin to a dissociative trip, only extremely unpleasant. Imagine each of your senses (including pain, balance, proprioception, etc.) giving you random input.