
I wouldn't call it deceiving. In order to be motivated to deceive someone, you'd need agency and some benefit to gain from it.


1. Deception describes a result, not a motivation. If someone has been led to believe something that isn't true, they have been deceived, and this doesn't require any other agents.

2. While I agree that it's a stretch to call ChatGPT agentic, it's nonetheless "motivated" in the sense that it learned based on an objective function, which we can model as a causal factor behind its behavior, and doing so might improve our understanding of that behavior. I think it's relatively intuitive, and not deeply incorrect, to say that a learned objective of generating plausible prose can be a causal factor leading to a tendency to generate prose which often deceives people. And I see little value in getting nitpicky about agentic assumptions in colloquial language when a vast swath of the lexicon and grammar of human languages writ large makes them essentially by default. "The rain got me wet!" doesn't assume that the rain has agency.


Well, the definition of deception, according to Google and how I understand it, is:

> deliberately cause (someone) to believe something that is not true, especially for personal gain.

Emphasis on the personal gain part. It seems like you have a different definition.

There's no point in arguing about definitions, but I'm a big believer that if you can identify a difference in the definitions people are using early in a conversation, you can settle the argument right there.


I both agree that it's pointless to argue about definitions and think you've presented a definition that fails to capture a lot of common usage of the word. I don't think it matters what the dictionary says when we are talking about how a word is actually used. We use "deceptive" to describe inanimate objects pretty frequently. I responded to someone who thought describing the outputs of a machine learning model as deceiving people implied it had agency, which is nonsense.


Isn't that GPT Plus? Trick you into thinking you've found a new friend who understands everything? Surely OpenAI would like people to use their GPT over a Google search.

How do you think leadership at OpenAI would respond to that?



