__blockcipher__'s comments

It’s the same thing. Obviously withdrawals and such are different, but the core mechanism, dysregulated reward processing leading to compulsive engagement in the behavior, is exactly the same.


One obvious risk would be blunting of longer-term GLP-1 receptor activation. Imagine type 2 diabetes, but for ghrelin.

To use an analogy: amphetamines have a honeymoon period, and it feels like a lot of people on these weight loss drugs haven’t been on them long enough to get past the honeymoon period and see what the effects are after 10, 20, etc. years.


It's possible. But we've had another GLP-1 medication in use for about a decade and a half now: liraglutide. So far, we haven't seen evidence of that occurring.

I don't think anyone who is both informed and sane would suggest that it is impossible for the medication to have negative long-term effects. Just that we have no current indication of them, and that being afraid of a "what if" without any concrete concerns, when the alternative is remaining in one of the riskiest states possible for human health, is silly.


People don't realize that Ozempic is already a third-generation GLP-1 drug, Mounjaro is a fourth, and the next generation of drugs is already in large-scale clinical trials.

We do in fact know a lot about how these drugs affect people by now, and as you point out, we have well over a decade of data on them.


It's less about the NSA having AI capabilities and more the inverse: the NSA having access to people's ChatGPT queries. Especially if we fast-forward a few years, I suspect people are going to be "confiding" a ton in LLMs, so the NSA is going to have a lot of useful data to harvest. (This applies in general, regardless of them hiring an ex-spook, BTW; I imagine it's going to be just like what they do with email, phone calls, and general web traffic, namely slurping up all the data permanently in their giant datacenters and running all kinds of analysis on it.)


I think the use case here is LLMs trained on billions of terabytes of bulk surveillance data. Imagine an LLM that has been fed every banking transaction, text message, or geolocation ping within a target country. An intelligence analyst can now get the answer to any question very, very quickly.


> I suspect people are going to be "confiding" a ton in LLMs

They won't even need to rely on people using ChatGPT for that if things like Microsoft's "Recall" are rolled out and enabled by default. People who aren't privacy-conscious will not disable it or care.


Why do you assume the NSA has ChatGPT queries?


Why wouldn’t they, after the Snowden revelations?


Because ChatGPT is a sizable domestic business, and most large data collectors are enrolled in the NSA's PRISM program whether they like it or not.


I’ll believe it when I see it. A man can dream though.


Why? The guy tried to hire a hitman.


Because he was a coder working on his startup with no concern for the massive amounts of harm he was causing, and many of the denizens of HN can relate to that.


Ross could hustle, and he single-handedly built a unicorn. He just needs to improve his regulatory exposure profile.


Agreed. You can forgive some of his other less-than-upstanding activities, but the dude hired a hitman. I hope that’s a bridge too far for everybody.


I mostly agree, but I would like to see the full context of the hitman stuff. My understanding is that it was a law enforcement plant who encouraged him to hire a hitman and then referred him to one, which was of course also a plant. Still bad, but it also seems a bit like entrapment.


He has spent 10 years in federal prison. He was never convicted of any murder for hire scheme.


No he didn't. There is literally no evidence he ever tried to hire a hitman or hurt anyone in any way.


> I think one solution could be in licenses that force companies/business of certain sizes to pay maintenance fees. One idea from the top of my head.

This just doesn't work. Fully open source software (as opposed to source-available) is so much more useful than the alternative that there's always going to be an OSS fork of any sufficiently useful project. AFAICT Elasticsearch and Redis have not really "won" with their respective license changes, but have instead just fragmented their own markets and sown the seeds of their eventual destruction.


Doubly so in the bay area


Yes, the GP was clearly serious in their advice to exploit the coastline paradox in order to mislead would-be buyers! (/s)


I suppose I shouldn't be surprised, but was he really fired just for writing that blog post? (https://blog.paulbiggar.com/i-cant-sleep/)

That seems pretty crazy, although I suppose, to play devil's advocate, the ongoing, erm, 'conflict' was clearly interfering with his ability to output work ("I can't work. I code for 5 minutes before their bodies come back").


Companies don't usually announce performance-related firings publicly on LinkedIn.


I bet the average homeless person does too. 2% seems ridiculously low: $15 a month total on drugs? That only makes sense for someone who does no opioids and no stimulants, and just smokes one pack of cigarettes and has a single beer across an entire month.


Basically, a monk.

"Hey we gave $750/ month to these homeless guys- yes the ones dressed in those brown rags and with that strange haircut- they were so thankful."


The median American consumes no illegal drugs, no tobacco, and approximately 2 alcoholic drinks monthly.


> have work very hard to achieve a good standard of living where I don’t rely on others for my needs.

> Seeing as many people have helped you would you mind giving me that money and laptop you were given

One of these things is not like the other

