Yes that’s my main point - if you accept the first one, then you should accept the second one (though some people might find the second so absurd as to reject the first).
> Imagine if instead of an LLM the billions of people instead simulated a human brain. Would that human brain experience consciousness? Of course it would, otherwise they're not simulating the whole brain.
However, I don’t really buy “of course it would” - in other words, the materialist premise. Maybe yes, maybe no, but I don’t think there’s anything definitive on the matter of materialism in philosophy of mind. As much as I wish I were fully a materialist, I can never fully internalize how sentience could emerge from matter… in other words, to some extent I feel that my own sentience is fundamentally incompatible with everything I know about science, which is unfortunate, because I definitely don’t believe in dualism!
A sufficiently accurate simulation would certainly tell you, honestly and wholeheartedly, that it is conscious. But in practice it would also need access to external sense data, since the experiencer is not necessarily separable from the experiences - which intrinsically requires computing in the world itself, rather than merely computing about it. In a way, it's like having edge compute at the skin's edge. The range of qualia available at each moment will be distinct to each experiencer, depending on the senses available, and there will likely be some overlap in interpretation based on the computing substrate.
We can, in a way, articulate the underlying chemputation of the universe, mediated through our senses, reflection, and language; turn a piece of it off (as it is often non-continuous) and the quality of the experience changes.
But do you believe in something constructive? Do you agree with Searle that computers calculate? But then numbers and calculation are immaterial things that emerge from matter?