
> then where would I talk about that?

Alert: with ChatGPT you're not talking to anyone. It's not a human being.



Which is perfect. In Australia, I tried to talk to Lifeline about wanting to commit suicide. They called the police on me (no, they are not a confidential service). I then found myself in a very bad situation. ChatGPT can't be much worse.


I’m sorry Lifeline did that to you.

I believe that if society actually wants people to open up about their problems and seek help, it can’t pull this sort of shit on them.


Sadly, that's exactly what society does. I can only speak for Australia, but if you have suicidal thoughts then it is a very bad idea to talk to any service even partially funded by the government. In the absence of services with strong, absolute privacy safeguards, ChatGPT is as good as anything.

As an example: Mark Cross, who was on the board of SANE Australia, stated that every instance of a person being put into seclusion would be reviewed. We know now that this was never the case, and it still is not being carried out fully. This lauded psychiatrist didn't seem to even know what was happening in his own unit or EDs.

Listen to it here:

https://www.abc.net.au/listen/programs/conversations/convers...


Except in the US, where this info will be sold and you won't be able to get life insurance, a job, etc.


Lucky I'm not in the U.S. then.


I didn't write who I would talk to, I said where.

A very intentional word choice.



