"While we are far from understanding how the mind works, most philosophers and scientists agree that your mind is an emergent property of your body. In particular, your body’s connectome. Your connectome is the comprehensive network of neural connections in your brain and nervous system. Today your connectome is biological. "

This is a pretty speculative thesis. It's not at all clear that everything relevant to the mind is found in the connections rather than in the particular biochemical processes of the brain. It's a very reductionist view that drastically underestimates the biological complexity of even individual cells. There's a good book, Wetware: A Computer in Every Living Cell by Dennis Bray, that goes into detail on how much functionality and how many physical processes are at work even in the simplest cells, all of which is routinely ignored by these analogies between the brain and a digital computer.

There is this extreme, and I would argue unscientific, bias towards treating the mind as something that's recreatable in a digital system, probably because it enables this science-fiction speculation and dreams of immortality, of people living in the cloud.




I’ve posed this claim to dozens of neuroscientists. If you consider the connectome just the static connections, then you might be right. If you include the dynamics of the brain (the biochemical processes) as part of the connectome, then most neuroscientists would agree that is sufficient to produce the emergent property of mind. The honest answer is we don’t know yet. That said, it’s likely not necessary to model every atom’s interaction with every other, so there must be a level of abstraction sufficient to emulate a mind. Our foundation is trying to identify the minimal level of abstraction necessary to emulate a mind.
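As a toy illustration of what one candidate level of abstraction looks like (a sketch of my own, not a claim that this level is sufficient): a leaky integrate-and-fire point neuron discards all intracellular biochemistry and keeps only a membrane voltage, yet still spikes.

    # Hypothetical sketch: a leaky integrate-and-fire neuron, one candidate
    # abstraction level that discards intracellular biochemistry entirely.
    # Parameters are illustrative, not fitted to any real cell.
    def simulate_lif(input_current=1.5, dt=0.1, steps=1000,
                     tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
        v = v_rest
        spike_times = []
        for step in range(steps):
            # Euler step of dv/dt = (-(v - v_rest) + I) / tau
            v += dt * (-(v - v_rest) + input_current) / tau
            if v >= v_thresh:              # threshold crossing -> spike
                spike_times.append(step * dt)
                v = v_reset                # reset membrane potential
        return spike_times

    print(len(simulate_lif()), "spikes in 100 ms of simulated time")

Whether this level (or compartment models, or full molecular dynamics) is the right one is exactly what has to be determined empirically.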


In support of the requirement for high-fidelity (atom-for-atom) modeling is the notion that an evolved computer would converge toward behaviors that supervene on specifics of the host environment. If porting a binary to another CPU architecture is tough, how easy will it be to port a mind to a simulated, simplified physics? How many edge cases will it have to get right to even run at all? If brains are hacks designed over millions of generations to surf overlapping fitness functions, it makes sense they'd find implementation-dependent (real-physics) optimizations that compound in ways which fall apart in a toy physics. That's not to say we can't add cool peripherals.

ps even with atom-for-atom modeling, how do you know the behavior doesn't depend on relations which are not computable? If physics ranges over the reals, some of those edge cases might be hard to find with a simulator.


I can hit myself in the head and I don't lose my train of thought, instantly lose consciousness, or die. If consciousness relied on the precise positions of individual atoms (as far as that makes sense with moving particles) it would be way more fragile than we've observed it to be. The fact that your brain is resilient to being knocked around a bit is evidence towards the underlying mind being at least slightly higher level than where strong quantum effects live and also fairly redundant.


I agree, but I think there is a case to be made that there is important state separate from just which cells connect to which and how strongly, but that is also more coarse-grained than single atoms floating around.

The cytoskeleton may turn out to have a role to play. The number and locations of ion pumps. Or epigenetic changes in clusters of brain cells.


If you hit your head hard enough all those things will happen though.


On the other hand, the brain is famously warm and wet. There's a limit to how much local state the brain can practically use to compute, given how messy it is.


>If physics ranges over the reals

I thought this was ruled out by https://en.wikipedia.org/wiki/Bekenstein_bound
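Rough arithmetic (mine, assuming the bound applies to a roughly brain-sized, brain-massed region) puts the ceiling at a finite but astronomically large number of bits:

    # Back-of-the-envelope Bekenstein bound, S <= 2*pi*k*R*E / (hbar*c),
    # expressed in bits. Radius and mass are rough assumptions, not measurements.
    import math

    hbar = 1.0545718e-34   # reduced Planck constant, J*s
    c = 2.998e8            # speed of light, m/s
    R = 0.1                # m, radius of a sphere enclosing a brain (assumed)
    m = 1.4                # kg, approximate brain mass (assumed)
    E = m * c**2           # rest-mass energy, J

    bits = 2 * math.pi * R * E / (hbar * c * math.log(2))
    print(f"upper bound: ~{bits:.1e} bits")   # on the order of 10^42

So finite, but nowhere near a practical constraint on what a brain-sized system could encode.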


That's my understanding as well. Quantum states encode finite information (despite their dynamics lying on the reals; the practical consequence is just that state transitions and information flow are smooth).

The formal discussion around this is largely centered on the Church-Turing thesis:

https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis

Essentially stating (in a certain interpretation) that any physical process can be simulated by a Turing machine (with no mention of efficiency). This seems to be the case, although I'm not sure we have a convincing proof from quantum field theory yet (note that quantum processes can be simulated on classical Turing machines, although at an exponential cost).
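To make that exponential cost concrete, here is a crude sketch assuming a brute-force state-vector simulation with 16 bytes per complex amplitude:

    # Memory for a brute-force state-vector simulation of n qubits,
    # assuming 16 bytes per complex amplitude (double-precision real + imaginary).
    for n in (10, 20, 30, 40, 50):
        amplitudes = 2 ** n
        print(f"{n} qubits: {amplitudes * 16 / 1e9:.3g} GB")

Around 50 qubits the state vector alone outgrows any classical machine, which is why "computable in principle" and "simulable in practice" are very different claims.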


> If porting a binary to another CPU architecture is tough, how easy will it be to port a mind to a simulated simple physics?

It will be very difficult, but we shouldn’t underestimate what is possible decades from now. As an analogy, consider when the Nintendo Entertainment System (NES) came out in the 1980s. Did anyone ever imagine it could be fully emulated in JavaScript in a browser [1]? Certainly not since those technologies hadn’t been invented yet.

[1] https://jsnes.org/


What's JavaScript? What's a browser?


Hopefully the brain doesn't rely on undefined behavior.


What if we find that the gut, which has 100 million nerve cells, also plays a part in the emergent property of mind?

https://www.sciencemag.org/news/2018/09/your-gut-directly-co...

"In a petri dish, enteroendocrine cells reached out to vagal neurons and formed synaptic connections with each other. The cells even gushed out glutamate, a neurotransmitter involved in smell and taste, which the vagal neurons picked up on within 100 milliseconds—faster than an eyeblink."


That is one reason why I was very careful to name this the Mind Emulation Foundation and not the Brain Emulation Foundation. I also use the word 'body' instead of 'brain' throughout and define a connectome as: the comprehensive network of neural connections in your brain and nervous system.


If that were true, then quadriplegics would have cognitive issues, as would those who have their vagus nerve severed. Those people don't suffer from impaired cognition or drastic personality changes, so we can be sure that the nerves in the gut are not important for brain emulation.

Also human brains have an average of 86 billion neurons, so emulating an extra 100 million cells (0.1%) would be trivial in comparison.



It’s nice to see someone has consciousness all figured out.

But seriously, do you know of any studies that show no changes to mental state or capacity or personality or memory or any of the other things that compose “the mind” in such people?

I can’t believe anyone has done such studies yet. And just because you don’t see changes in such people, does not mean there haven’t been minuscule but measurable changes.


I feel like it just isn't that interesting if there are "minuscule but measurable changes" beyond a platonic ideal of self. Removing my minor back pain, or if I stopped drinking coffee, or a hundred other things would have a larger-than-minuscule effect on my personality, but it's still "me".


Good point :)


86 billion neurons and ~1 trillion other cells, which interact with those neurons in non-trivial ways.


People who have large parts of their gut removed surgically don't lose their mind.


"once you have a comprehensive human connectome, there is still the challenge of emulating it digitally"

What would you do if Christof Koch is right and consciousness cannot be computed?


His hypothesis, if I understand it correctly, is that consciousness is a core part of existence, so you would need to bring hardware into the world that is not only able to simulate the whole of reality but also to exist in it, as a property of physical reality and its composition. That dude is a physicist who believes in one single Universe, which probably also applies to most neuroscientists.

I'm not completely closed to the thought that someday we will get a digital companion that would be able to build a simulation of you in silico, but that simulation wouldn't be you.

This foundation sounds like complete science fiction. MRI, really? We already have a destructive approach in some countries; it's called euthanasia.


I hear the term 'emergent property' bandied about in relation to the mind as if using it somehow explains anything. It says nothing more than that the mind exists, somehow, yet we have no clue about its nature.

Scientists and philosophers agreeing on something means nothing as they have agreed on utter bunk before. The short of it is that we know little about the mind and have no idea how to even start expanding on the little we know.


> Scientists and philosophers agreeing on something means nothing as they have agreed on utter bunk before.

That is an extremely cynical position to take; one could use it to dismiss anything, even things obviously true. For instance, "I don't believe the Sun is powered by fusion -- nobody has ever gone there to take a sample. Sure, they claim to have all sorts of indirect evidence, and there is 99.9999% consensus on it, but scientists have backed things before that were utter bunk."

> It says nothing more than mind exists,

It makes a much stronger claim than that. There are many people who believe that consciousness exists outside the brain, and that the brain is a kind of consciousness receiver, akin to a radio receiver, that picks up the signal and relays it to the body. The claim of emergent behavior is an explicit rejection of that mystical explanation, which is compelling to many people.

> The short of it is that we know little about the mind and have no idea how to even start expanding on the little we know.

This is entirely at odds with reality. Is it your position that brain researchers haven't learned anything over the past 10, 20, 30 years? Clearly they have, so obviously they do have ideas about how to expand on that knowledge.


"Emergent" is description of its nature. An alternative is "fundamental property". If you study an emergent property, you don't need to find its structure as a fundamental object, and this sets general direction of research.


I think we may be aligned in thinking that the evolution of the complexity within our cells is a greater marvel than the evolution of complex intelligence building on the base of the eukaryotic cell.

But perhaps we don't need to model cells; we could use real cells for that. If the connectome is the fragile, short-lived structure, we could model that in software, with the cellular level modelled by a vast array of vivisected tardigrades.

Ok probably not tardigrades, probably small replaceable batteries of human neural tissue.

But I do like the phrase about future humans being "bootstrapped from tardigrades".


Your post started like a legitimate, well-thought-out criticism based on science (even if a bit cherry-picked, assuming that the author meant only connections between the cells, when they could have also meant the various physical connections between different parts and not only electrical impulses between whole neurons).

But then it started throwing unsubstantiated claims in there along with the legitimate criticisms.

How does

> unscientific bias towards treating the mind as something that's recreatable in a digital system

follow from

> It's not at all clear that everything relevant to the mind is found in the connections rather than the particular biochemical processes of the brain.

Even if consciousness arises in a more profound way from the biochemical processes inside cells than from electrical connections between the cells, what is stopping us from recreating those biochemical processes in a digital system as well? You know there are simulation equations for chemical reactions and complex biochemical systems, right? If this is found to be useful, they will just be emulated the same way DL neurons are simplified and emulated now.
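For example, here is a minimal mass-action kinetics sketch (rate constants invented purely for illustration) of the kind of equation that would get emulated:

    # Mass-action kinetics for a reversible reaction A <-> B, integrated with
    # forward Euler. Rate constants are invented for illustration only.
    kf, kr = 0.3, 0.1           # forward / reverse rate constants (1/s), assumed
    a, b = 1.0, 0.0             # initial concentrations (arbitrary units)
    dt = 0.01                   # time step (s)
    for _ in range(2000):       # simulate 20 seconds
        flux = kf * a - kr * b  # net conversion rate A -> B
        a -= flux * dt
        b += flux * dt
    print(f"near equilibrium: A={a:.3f}, B={b:.3f}")  # expect A/B -> kr/kf

Real cell models chain thousands of such rate equations together, but the point stands: they are ordinary differential equations, and nothing about them is obviously uncomputable.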


> It's not at all clear that everything relevant to the mind is found in the connections rather than the particular biochemical processes

I wouldn't expect anyone to consider the connectome to exclude the processes inside each of the individual neurons that are connected. I take this to mean just that the mind is an emergent property of the collection working in concert. After all, everything is just connections all the way down, even deep inside individual cells.


> everything is just connections all the way down

Is that true? Could the "biochemical processes" also include relationships between cells?


Correct, in order to emulate a mind we might need to include that as well, so it is in scope.


Indeed. We humans largely create devices that function either through calculation or through physical reaction, relying on the underlying rules of the universe to "do the math" of, say, launching a cannonball and having it follow a consistent arc. The brain combines both at almost every level. It may be fundamentally impossible to emulate a human personality equal to a real one without a physics simulation of a human brain and its chemistry.

A dragonfly brain takes the input from thirty thousand visual receptor cells and uses it to track prey movement using only sixteen neurons. Could we do the same using an equal volume of transistors?


No one is saying a neuron is a one to one equivalent with a transistor. That behavior does seem like it's possible to emulate with many transistors, however.
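To put rough numbers on it (my own sketch, reducing the circuit to a single dense 30,000-to-16 projection, which is surely a gross simplification of the real biology):

    # Rough cost of a dense 30,000-input -> 16-unit readout per visual frame,
    # treating the dragonfly circuit as one linear layer (for scale only).
    import random

    inputs, outputs = 30_000, 16
    weights = [[random.gauss(0, 0.01) for _ in range(inputs)] for _ in range(outputs)]
    frame = [random.random() for _ in range(inputs)]   # one frame of receptor activity

    activations = [sum(w * x for w, x in zip(row, frame)) for row in weights]
    print(f"{inputs * outputs:,} multiply-accumulates per frame")   # 480,000

Half a million multiply-accumulates per frame is trivial for modern silicon; the harder part of the question, as posed, is matching the volume and power budget, not the operation count.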


Was just talking about quantum cognition and memristors (in context to GIT) a few days ago: https://news.ycombinator.com/item?id=24317768

Quantum cognition: https://en.wikipedia.org/wiki/Quantum_cognition

Memristor: https://en.wikipedia.org/wiki/Memristor

It may yet be possible to sufficiently functionally emulate the mind with (orders of magnitude more) transistors. Though, is it necessary to emulate e.g. autonomic functions? Do we consider the immune system to be part of the mind (and gut)?

Perhaps there's something like an amplituhedron - or some happenstance correspondence - that will enable more efficient simulation of quantum systems on classical silicon, pending orders-of-magnitude improvements in coherence and error rate in whichever computation medium.

For abstract formalisms (which do incorporate transistors as a computation medium sufficient for certain tasks), is there a more comprehensive set than Constructor Theory?

Constructor theory: https://en.wikipedia.org/wiki/Constructor_theory

Amplituhedron: https://en.wikipedia.org/wiki/Amplituhedron

What is the universe using our brains to compute? Is abstract reasoning even necessary for this job?

Something worth emulating: Critical reasoning. https://en.wikipedia.org/wiki/Critical_reasoning


Nor did I. I asked if we could do the same function with an equal volume. Moore's law is dead. We're not going to scale performance forever. What good is an emulated human brain if it's the size of a building and takes a power plant to operate?


> There is this extreme, and I would argue unscientific bias towards treating the mind as something that's recreatable in a digital system probably because it enables this science-fiction speculation and dreams of immortality of people living in the cloud.

I would argue differently: perhaps it's your point of view that's unscientific (or at least I don't see any scientific path to concluding that mind emulation is impossible!).

Say there are complex processes going on inside each neuronal cell. It should not be impossible to simulate those as well with arbitrary fidelity.

I think what is practically questionable is mind uploading: we don't know if reading out all this information is feasible, and the prospect of reading it non-destructively is even more remote. If it is feasible, then there's the question of cost.

I don't see why with a large enough computer we couldn't emulate a mind (or any other existing system). It is possible a quantum computer may be necessary, but there's still little evidence of entangled processes in the brain (indeed we strongly expect no large-scale entanglement due to high temperatures).

I think the possibility of mind emulation, and its impact on our society, is quite radical and difficult to grasp. The rights of the individuals, all the possibilities like making copies, backups, modifications, pausing a simulation, running an emulated mind faster than real minds, expanding memory, and much more. This all goes completely outside usual human experience (living out your consciousness, with occasional pauses in sleep and sedation, until you die and it stops). I predict it'll take very long until we come to grips with all the consequences of it.

It should be noted that Agent-AGI (AGI that acts like a person) will almost certainly come before attempts to emulate a human mind exactly. AGI has its own very interesting ethical questions. I think the main point we need to start taking in is that those beings will be conscious like us, so we need to give them rights, take care with their experience, and make sure they aren't exploited (by, say, simulating many AGI individuals in terrible conditions). Again, there is much here outside of human experience, because they will have much more freedom. An AGI doesn't necessarily need to feel pain. Any such system might be able to override pain signals if convenient (something we can't do); but still, if it has motivators, I think it will be able to "suffer" (when things it cares about go bad).

One of the ways we will tackle these questions is through fiction. It's been ongoing: from what I know, at least since William Gibson's Neuromancer there have been simulated minds in fiction (and robots existed long before, although they are not always treated as equals, nor is the possibility of different cognitive natures always explored).



