Hacker News

To be blunt: how do you know it's not an exfiltration app that will suck down your entire Drive and upload it to their sponsor's ML training engine?

Text editors are great, but hand-installed editors[1] running on the local filesystem of a developer-maintained personal device are a very different threat model than an app available to everyone in the Play Store.

[1] And even then they tend to be vouched for by a large community of (usually) open-source developers, often via inclusion in something like a Linux distro, which carries a strong promise of well-audited software. Emacs and VSCode and whatnot skate on reputation, basically, but the community tends to frown on "here: download my new binary tool for all your editing needs!".



I like how ML training is the worst thing you can think of, rather than stealing your identity and bank account information and all your money, or leaking your nudes, or something else actually damaging that normal people care about.


I was trying to be trendy and hip and avoid hyperbole. But yeah, that too. Also boring stuff like corporate espionage and malware distribution.


Are you assuming that the ML company won't sell its training data to scammers, or to others who will sell it to scammers?


Yes, I am assuming that an AI company won't sell the text of my chats to scammers, just like I assume Google won't sell my Google search history to anybody who wants to personally hurt me. I distinguish between an ad company wanting to make money showing me ads and an individual calling my parents, saying I'm in the hospital, trying to get them to send the scammer $600 over Cash App.



