I mean you can argue all kinds of possibilities and in an abstract enough way anything can be true.
However, people who think these things have a soul and feelings in any way similar to us have obviously never built them. A transformer model is a few matrix multiplications that pattern-match text; there's no entity in the system to even be subject to thoughts or feelings. They're capable of the same level of being, thought, or perception as a linear regression is. Data goes in, it's operated on, and data comes out.
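To make the "data goes in, it's operated on, data comes out" point concrete: a single self-attention step really is just a handful of matrix multiplications plus a softmax. A toy NumPy sketch (the shapes and random weights are illustrative, not a real trained model):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8   # model dimension (illustrative)
n = 4   # sequence length in tokens

X = rng.standard_normal((n, d))                      # token embeddings go in
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

Q, K, V = X @ Wq, X @ Wk, X @ Wv                     # three matrix multiplications
scores = Q @ K.T / np.sqrt(d)                        # pairwise attention scores
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
out = weights @ V                                    # data comes out, shape (n, d)
```

A full transformer stacks many of these layers with feed-forward blocks and normalization, but every step is the same kind of deterministic array arithmetic.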
> there's no entity in the system to even be subject to thoughts or feelings.
Can our brain be described mathematically? If not today, then ever?
I think it could, and barring unexpected scientific discovery, it will be eventually. Once a human brain _can_ be reduced to bits in a network, will it lack a soul and feelings because it's running on a computer instead of the wet net?
Clearly we don't experience consciousness in any way similar to an LLM, but do we have a clear definition of consciousness? Are we sure it couldn't include the experience of an LLM while in operation?
> Data goes in, it's operated on, and data comes out.
How is this fundamentally different from our own lived experience? We take in inputs, and we express outputs.