Of course, today's LLMs only appear to have theory of mind at first glance and fall apart under closer scrutiny. But if they can continue to become more and more accurate replicas of the real thing, I don't think it matters at all.
There's no way to know for sure that anyone other than yourself experiences consciousness. All you can do is judge for yourself that what they're describing matches closely enough with your own experiences that they're probably experiencing the same thing you are.
I think it does matter, because it legitimizes a view of humans (and animals) that undervalues them. Claiming that meaning arises from patterns of language, rather than patterns of language arising from meaning, is the same kind of inversion as holding society to be more valuable than the humans in it. Bad things have happened when that kind of belief became dominant.
> There's no way to know for sure that anyone other than yourself experiences consciousness. All you can do is judge for yourself that what they're describing matches closely enough with your own experiences that they're probably experiencing the same thing you are.
That judgment is not just based on the words other people use. It is based on knowing that other people's brains and minds have the same sort of semantic relationships to the rest of the world that yours do. And those relationships can be tested by checking to see if, for example, the other person uses the same words to refer to particular objects in the real world that you do, or if they react to particular real-world events in the same way that you do.
You can't even test any of this with an LLM because the LLM simply does not have the same kind of semantic relationships with the rest of the world that you do. It has no such relationships at all.
I'll dig up a source in a bit, but there is a critical period of development during which a child must be exposed to language, or they will fail to develop the very core skills you're suggesting are innate in a person regardless of upbringing. This is exactly how you learned everything you know: your parents talked to you. Language grants you the ability to define concepts in the first place; without it, you can't recognise them, because you have no language with which to think about them. So what specifically differentiates the way your brain learned to classify objects and words from the way a NN does? And what stops a NN from developing concepts out of the relationships between those definitions in the same way you do? IMO it's arguably just a matter of processing power and the configuration of the network.
If you object to the term "mind", just leave it out and read "brain and mind" as just "brain". I'm a physicalist so I don't think the mind is something separate from the brain anyway, so it's all the same to me.
If a jellyfish were able to have a conversation with you in which it credibly described what it's like to be conscious, would you reject it because it doesn't have a brain and therefore cannot have a mind in the same sense as you?
I don’t object to “mind”, but it’s famously unprovable whether other beings we presume to be conscious actually experience internal mental states. See also: qualia, p-zombies, the hard problem of consciousness.
> it’s famously unprovable whether other beings we presume to be conscious actually experience internal mental states
It's also famously unprovable that there is not an invisible dragon in my garage that cannot be detected by any means whatever, to use Carl Sagan's example. That doesn't mean such an idea is worth discussing or including in your thought processes.
> But if they can continue to become more and more accurate replicas of the real thing, I don't think it matters at all.
So, I suppose I'd ask: what does "matter" mean here? If you knew that everyone you loved had been destroyed and been replaced by exact replicas, would that matter?
> what is it about some people who seem to be constantly restless while others seem content to staying pretty much exactly where they were born
IMO the main factors would be your economic situation and your relationship with your family. People with little money and tight family ties probably wouldn't be keen to move very far, whereas someone with a bad or indifferent relationship to family and the means to head elsewhere would probably do so.
I don't think it's just this, I wonder if there is maybe a certain genetic factor as well. My anecdote is not data but I know that I left home at 17 with a tank of gas and some clothes and ended up living half a continent away in a totally new situation entirely on my own. I came from a completely stable and supportive home environment, I just wanted some adventure. I don't think I'm the only one with this experience or even particularly rare.
Human communities have always needed a majority of members who are focused on family, community, and stability, but they've also needed a minority who are more interested in the horizon. These people are the traders, explorers, settlers, etc., and I wouldn't be surprised if that disposition carries some reproductive advantage (maybe your offspring are more likely to have a diverse genetic composition or something), which is why I wonder whether there might be a genetic contributor toward wanderlust.
I feel like relationship to family isn't as big of a limiter these days. I'm pretty close with my family and do miss them (they're on the other side of the world), but being able to video call them essentially whenever our waking times line up takes care of a lot of the issues. As a result, I can still share any problems I have and get advice, or similarly help them deal with their problems.
The few years my siblings spent half the world away from us before smartphones proliferated (and thus before video calls could just happen whenever, instead of being specifically organized) did seem to strongly weaken their relationship with the rest of the family, though. They seemed to feel abandoned because they essentially had to grow up on their own (i.e. finish college and start working) without being able to easily share their issues.
Did you read the article? It's just a review of a book - the article itself doesn't have much to say either way about modern clubs, besides acknowledging the recent Pincher news.
> there is almost no one who would not understand, “It’s fourth and ten—we have to punt.”
One problem with heavily using sports, gaming, and war analogies to talk about strategy is that the discussion starts to make no sense at all to people who aren't interested in those things but who do care about strategy.
But I guess if you're the Modern War Institute at West Point, you know your audience pretty well, so ¯\_(ツ)_/¯
Well, the same is true if you try to talk about a strategy and people aren't familiar with its domain. ;)
Where else do you exercise an interest in strategy outside of games, sports, and war? Business, maybe? I'm just not sure if this criticism sits on solid ground. Let me know.
You forget politics, at all levels, formal or not.
I have an interest in strategy but don't care about sports. That said, I'd guess I'm in a minority and it's very hard to write for 100% of an audience.
It's not surprising that pixel art is easier to make in a raster editor than in a vector editor.
The 3rd image was probably made in Figma or Sketch though. It would probably take more time to build that in Photoshop than Figma, and the result would be harder to make changes to.
This is really cool, I'm excited to see more ML applied to design like this.
One project I wish someone would build is an ML-powered algorithm for perceptually even saturation, drawing on crowdsourced data to help pick colors that most people would perceive as equally colorful.
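Not the crowdsourced ML version, but as a rough baseline for what "perceptually even" could mean, here's a minimal Python sketch (the function names are just made up for illustration) that scores a palette by chroma in OKLab, a color space designed to be roughly perceptually uniform. A model trained on crowdsourced colorfulness judgments could later replace the chroma metric:

```python
import math

# Hypothetical sketch: score a palette for "perceptually even saturation" by
# converting each sRGB color to OKLab (a roughly perceptually uniform space)
# and comparing chroma (sqrt(a^2 + b^2)) across the palette.

def srgb_to_linear(c: float) -> float:
    # Undo the sRGB transfer curve (c in 0..1).
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def hex_to_oklab(hex_color: str) -> tuple[float, float, float]:
    # Parse "#rrggbb" and convert linear sRGB -> OKLab (Ottosson's published matrices).
    h = hex_color.lstrip("#")
    r, g, b = (srgb_to_linear(int(h[i:i + 2], 16) / 255.0) for i in (0, 2, 4))
    l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
    m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
    s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
    l_, m_, s_ = (x ** (1 / 3) for x in (l, m, s))
    L = 0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_
    a = 1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_
    b_ = 0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_
    return L, a, b_

def chroma(hex_color: str) -> float:
    # OKLab chroma: distance from the neutral (gray) axis, i.e. "how colorful".
    _, a, b = hex_to_oklab(hex_color)
    return math.hypot(a, b)

def evenness_report(palette: list[str]) -> None:
    # Show how far each color's chroma deviates from the palette mean.
    chromas = [chroma(c) for c in palette]
    mean = sum(chromas) / len(chromas)
    for c, ch in zip(palette, chromas):
        print(f"{c}  chroma={ch:.3f}  deviation={ch - mean:+.3f}")

if __name__ == "__main__":
    # A pure red reads as far more "colorful" than a pastel blue, even at equal HSV saturation.
    evenness_report(["#ff0000", "#8ecae6", "#ffb703", "#219ebc"])
```

Chroma in a space like OKLab is already a much better stand-in for "how colorful does this look" than HSV saturation, which is part of why palettes with equal HSV saturation look so uneven; the crowdsourced ML part would be about capturing where even OKLab gets it wrong.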