Yes, but now in some of the use cases they demoed, the data itself never leaves the device. The AI models run directly on the device, so there is no need to transmit user data to cloud servers. This is in contrast to all of their apps running in the cloud with your data living centrally, off device. I find this new approach a nice balance between user privacy and the data aggregation needed to construct the models.


Did they say that the data never leaves the device? Just because it's local inference doesn't mean nothing leaves the device. They have a strong incentive to call home with transcripts/metadata to improve their model on you.

I haven't had the time to wade through the barrage of information Google has put out in the last few days (ironically)...did they explicitly say that?


They explicitly stated in the presentation that all AI modeling data is processed on device, and that only anonymized lessons learned are sent back to Google.



