"Consciousness requires instincts in order to prioritize the endless streams of information. "
What if "instinct" is also just (pretrained) model weight?
The human brain is very complex, far from understood, and definitely does NOT work like an LLM. But it likely shares some core concepts - artificial neural networks were inspired by the brain's neurons and synapses, after all.
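If you take "instinct as pretrained weights" literally, the closest everyday analogue in ML is ordinary transfer learning: a frozen pretrained model acts as a prior laid down before any individual "experience", and only a small head is shaped by new data. A minimal PyTorch sketch of that analogy - the class name, toy encoder, and dimensions are all made up for illustration, not anything claimed in the thread:

```python
import torch
import torch.nn as nn

class InstinctPlusLearning(nn.Module):
    """Toy analogy: frozen pretrained weights = 'instinct', trainable head = learned behaviour."""

    def __init__(self, pretrained_encoder: nn.Module, hidden_dim: int, num_actions: int):
        super().__init__()
        self.instinct = pretrained_encoder
        for p in self.instinct.parameters():
            p.requires_grad = False          # never updated by individual experience
        self.learned = nn.Linear(hidden_dim, num_actions)  # shaped by experience

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            prior = self.instinct(x)         # innate, fixed response to the input
        return self.learned(prior)           # learned behaviour layered on top


# Hypothetical usage: a toy "pretrained" encoder standing in for evolution's work.
encoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
model = InstinctPlusLearning(encoder, hidden_dim=32, num_actions=4)
out = model(torch.randn(8, 16))  # only the head's weights would receive gradients
```

The point of the sketch is only that "prior knowledge baked in before learning starts" is a perfectly ordinary concept in ML, whatever one thinks of the brain comparison.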
> What if "instinct" is also just (pretrained) model weight?
Sure - then it will take the same amount of energy to train as our reptilian and higher brains took: trillions of real-life experiences over millions of years.
Not at all. It took life hundreds of millions of years to develop brains that could work with language, and it took us tens of thousands of years to develop languages, writing, and universal literacy. Now computers can print it, visually read it, transcribe speech to text, write/create/generate it coherently, output it as speech, translate between languages, rewrite it in different styles, and explain other writings - and that only took, well, roughly one human lifetime since computers became a thing.
What if "instinct" is also just (pretrained) model weight?
The human brain is very complex and far from understood and definitely does NOT work like a LLM. But it likely shares some core concepts. Neuronal networks were inspired by brain synapses after all.