Hacker News

> In short, LLMs are not people.

Not really sure that is relevant in the context of therapy.

> LLMs do not possess the shared experiences people have in order to potentially relate to others in therapy sessions, as LLMs are not people.

Licensed therapists need not possess a lot of shared experiences to effectively help people.

> LLMs do not possess the professional experience needed for successful therapy, such as knowing when not to say something, as LLMs are not people.

Most people do not either. That an LLM is not a person doesn't seem particularly notable or relevant here.

Your comment is really saying:

"You need to be a person to have the skills/ability to do therapy"

That's a bold statement.



>> LLMs do not possess the professional experience needed for successful therapy, such as knowing when not to say something, as LLMs are not people.

> Most people do not either. That an LLM is not a person doesn't seem particularly notable or relevant here.

Of relevance, I think: LLMs by their nature will often keep talking. They are functions that cannot return null. They have a hard time not using up tokens. Humans, however, can sit and listen and partake in reflection without using so many words. To use the words of the parent comment: trained humans have the pronounced ability to _not_ say something.


All it takes is a modulator that controls whether to let the LLM text through the proverbial mouth or not.

(Of course, finding the right time/occasion to modulate it is the real challenge).


> All it takes is a modulator that controls whether to let the LLM text through the proverbial mouth or not.

> (Of course, finding the right time/occasion to modulate it is the real challenge).

This is tantamount to saying:

  All you have to do to solve an NP-hard[0] problem is to
  find a polynomial-time solution.

  (Of course, proving P = NP is the real challenge).
[0] https://en.wikipedia.org/wiki/NP-hardness


Your analogy would be better if it were about constructing a heuristic.

GP seems to have a legitimate point, though. The absence of a workable solution at present does not imply that one cannot exist in the not-so-distant future.


A lot of the comparisons I see revolve around comparing a perfect therapist to an LLM. This isn't the best comparison, because I've been to 4 different therapists over my life and only one of them actually helped me (2 of them spent most of the sessions telling me stories about themselves; these are licensed therapists!). There are really bad therapists out there.

An LLM, especially ChatGPT, is like a friend who's on your side, who DOES encourage you and takes your perspective every time. I think this is still a step up from loneliness.

And a final point: ultimately an LLM is a statistical machine that produces the most likely response to your issues based on an immense amount of human data. It is therefore quite likely to make some pretty good calls about how to respond; you might even say it takes the best (or most common) in humanity and reflects that back to you. This might also be better than a therapist, who could easily view your situation through the lens of their own life, which is suboptimal.


> Licensed therapists need not possess a lot of shared experiences to effectively help people.

Sure, they don't need to have shared experiences, but any licensed therapist has experiences in general. There's a difference between "My therapist has never experienced the stressful industry I work in" and "My therapist has never experienced pain, loneliness, fatigue, human connection, the passing of time, the basic experience of having a physical body, or what it feels like to be lied to, among other things, and they are incapable of ever doing so."

I expect if you had a therapist without some of those experiences, like a human who happened to be congenitally lacking in empathy, pain or fear, they would also be likely to give unhelpful or dangerous advice.


> You need to be a person to have the skills

Generally, a non-person doesn't have skills; that's a statement likely to be true even when made about a random subject.


Once again: The argument appears to be "LLMs cannot be therapists because they are LLMs." Circular logic.

> Generally a non-person doesn’t have skills,

A semantic argument isn't helpful. A chess grandmaster has a lot of skill. A computer doesn't (according to you). Yet, the computer can beat the grandmaster pretty much every time. Does it matter that the computer had no skill, and the grandmaster did?

That they don't have "skill" does not seem particularly notable in this context. It doesn't help answer "Is it possible to get better therapy from an LLM than from a licensed therapist?"


> Does it matter that the computer had no skill, and the grandmaster did?

Yes, it does, when pondering whether the skills mobilized to achieve the result (grandmaster status) transfer to other domains.



