bastien2's comments

There has yet to be a use case for chatbots that isn't profit-motivated.

The entire effort to find a problem for them to solve has meant ecological harms at a scale that makes crypto mining seem trivial.

The data required has copyright and other commercial use restrictions on it. Much of the data being overtly restricted was never available for harvest to start with.

Given these facts, bringing an end to the LLMs and snapping techbros out of their latest delusion is a good thing.


Oh come on, Proton, you're supposed to be better than this. Chatbots are proven privacy-violation engines. If you're going to act like Google, there's zero reason to pay for your services.


This is not a chatbot and is in no way similar to what Google does.

It's opt-in, open source, runs on device, does not use your emails to train the model, and most important of all does not grant Proton access to your emails.

What part of that do you see as a privacy violation?


Oh look, carcinisation for Apple products.

ipodisation: the tendency for non-iPod Apple products to evolve iPod-like features over time.


Implying Apple products are not a cancer in their own right.


Carcinisation is a term unrelated to cancer.

https://en.wikipedia.org/wiki/Carcinisation


It is, however, a term related to Cancer.

https://en.wikipedia.org/wiki/Cancer_(constellation)


Lol, I was thinking in terms of the disease or the Internet slang, but you're right of course.


Cancers are serious conditions that cause 25-30% of all deaths. 1 in 2 people will have a form of cancer in their lifetimes. Cancers can have devastating long-term consequences beyond the illness itself, even if you survive. Treatment, recovery, and sometimes remission are physically and mentally damaging.

But perhaps I'm the one who is wrong. Please, explain what you meant exactly so I can understand your perspective.


I am not the OP, but "cancer" is used as, and I quote, "(figuratively) Something damaging that spreads throughout something else". In that sense, the commenter could be implying, for example, that Apple products damage the electronic gadgets industry with their closed and anticompetitive policies, spreading silently (because users buy more of them to use the whole ecosystem).


That, and cause some hard-to-repair changes to people's understanding and evaluation of hardware and software.


Can you give some other examples?


Chatbots can't teach critical thought or ethics. They need to be able to understand language and bias first, and that's an as-yet unsolved problem.

Until a chatbot is provably correct and ethical in its output, it must not be used to teach.

Case in point: the slop image attached to the announcement has the typical malformed hands and ghoulish faces problems.


Are humans provably correct and ethical in their outputs?


This smacks of the infinite growth fallacy.


That really is the solution: stop misusing phone numbers as secure channels.


Just remember: the only reason they threw a trillion dollars at genAI is that they thought they could lay off their entire creative staff.


It's more that VR/AR is a high-cost solution still in search of a problem that makes it profitable.

So far the only market for VR is niche entertainment. VRC is fun, but VR games have been around for decades and never gotten past novelty.

AR has value as an accessibility tool because it can get around the high cost of some infrastructure changes. But like all accessibility tools, profit-focused eyeballs see it as a small, low-margin market. Enabling human rights is not considered profit.


> So far the only market for VR is niche entertainment.

That's not true. VR also thrives in corporate. Niches, yes. But it does do well there. Training and simulation scenarios in particular. There are some tools, like uptale and arthur, that capitalise on this.

And VR games are really really good and add a lot of immersion. I don't like to game without it anymore.

I don't think VR is for everything and everyone, but there certainly are use cases for it.


AI is the last thing you want involved in accounting. Book-keeping and tax preparation are exercises in precision at a very high level of semantic knowledge. You have to understand why money flows through a company. These are skills the chatbots are proven incapable of demonstrating. Moreover, the parts that can and should be automated already have been for the last 15 or so years, with automatic transaction importing and prefiltering tools (none of which used chatbots).
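To illustrate the kind of non-LLM automation I mean, here's a rough sketch of a rule-based transaction prefilter. The rules, categories, and names are hypothetical, not any particular product's logic; the point is that it's deterministic and leaves anything ambiguous for a human bookkeeper instead of guessing:

    # Minimal illustrative sketch of deterministic, rule-based transaction
    # prefiltering (hypothetical rules and categories, not a real product's logic).
    from dataclasses import dataclass

    @dataclass
    class Transaction:
        description: str
        amount: float  # positive = income, negative = expense

    # Each rule is a (keyword, category) pair; first match wins.
    RULES = [
        ("stripe payout", "Sales income"),
        ("aws", "Hosting expense"),
        ("payroll", "Wages"),
    ]

    def prefilter(tx: Transaction) -> str | None:
        """Return a suggested category, or None if the transaction needs human review."""
        desc = tx.description.lower()
        for keyword, category in RULES:
            if keyword in desc:
                return category
        return None

    print(prefilter(Transaction("AWS EMEA monthly invoice", -240.0)))  # Hosting expense
    print(prefilter(Transaction("Unknown wire transfer", -990.0)))     # None -> review

No model, no guessing, fully auditable. That's what you want from accounting automation.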

The average business owner is usually overwhelmed with the backend of running a business by the time they come to me. If a client told me they used accounting software with AI features, the first thing I'd do is fully audit their books and review any filings done using chatbot-affected records.


This just in: Google Spyware has features accessible only to Google.

