How do we know our own 'processes' could not be modeled that way?
I think this is the point of the Turing test: at some point you can't tell whether a system is "thinking" or just crunching numbers. And it doesn't matter.