> Those who undergo the birth trauma, like Google, are sitting on a river of AI gold and putting AI into everything; those who fail will whine on surveys about how AI is a scam and overhyped and an AI winter will hit Real Soon Now Just You Wait
I wouldn't call Google sitting on a river of AI gold.
It's using AI to make more gold from the same mine: instead of the obvious gold, it scratches and extracts gold from the dirt and rocks. But it's still doing it from that same mine.
And that's exactly what AI is: garbage in, garbage out. I used to be a machine learning engineer at a company about as old as the dinosaurs. They wanted to apply the gold-digging AI machine Google has to a landfill, which didn't quite pan out the way they'd hoped.
Not a machine-learning engineer per se, but I work with some machine-learning technologies at Google. Perhaps that's when you know that AI has truly become embedded in the fabric of a company: when it's become just another tool that ordinary engineers are expected to know and use to solve problems.
Anyway, I think AI is a sustaining innovation, not a disruptive innovation. It makes existing businesses work better, but it doesn't create new markets where none previously existed like the Internet did. Google makes a ton of money off AI; the core of the ads system is a massive machine-learning model that optimizes ad clicks like no human could. But that only works because they already have the traffic and the data, which they got by positioning themselves as the center of the Internet when it was young.
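To make the ads point concrete: click-through-rate prediction is, at its core, a supervised learning problem. Features about a (query, ad) pair go in, a click probability comes out, and ads get ranked by expected value. Here's a minimal sketch of that shape; it's purely hypothetical, nothing like the real ads stack, and the features are made up:

    # Hypothetical sketch of CTR prediction as a supervised learning problem.
    # None of this reflects Google's actual ads system; it's just the general shape.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Invented features per (query, ad) pair: keyword match, historical CTR, position.
    X = rng.random((10_000, 3))
    # Labels: whether the ad was clicked. Here clicks loosely follow the features.
    y = (X @ np.array([2.0, 1.5, -1.0]) + rng.normal(0, 0.5, 10_000) > 1.2).astype(int)

    model = LogisticRegression().fit(X, y)

    # At serving time, rank candidate ads by predicted CTR times the advertiser's bid.
    candidates = rng.random((5, 3))
    bids = np.array([0.8, 1.2, 0.5, 2.0, 1.0])
    expected_value = model.predict_proba(candidates)[:, 1] * bids
    print(np.argsort(-expected_value))

The point being: this only works if you already have billions of labeled (impression, click) examples, which is exactly the traffic-and-data advantage mentioned above.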
I do agree that companies need to adjust their processes to take full advantage of AI rather than expecting it to be a magic bullet, but I don't know if "birth trauma" is really the right metaphor. More like adolescence: it's a painful identity shift, and some people never successfully make the leap. Those who can't don't die, though; they just become emotionally stunted adults who never reach their full potential.
Search was pretty actively against AI usage at the time I left it in 2014. Much of this was because of Amit Singhal: he had a strong belief that search should be debuggable, and if there was a query that didn't work, you should be able to look into the scores and understand why. There were AI prototypes that actually gave better results than the existing ranking algorithms, but they weren't put into production because of concerns about maintainability. I have no idea whether this changed after Amit left.
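For what it's worth, the debuggability argument is easy to illustrate: a hand-tuned score lets you dump every component for a misbehaving query, while a learned ranker just hands you one opaque number. A toy sketch, illustrative only and nothing like the actual ranking code (the signal names and weights are invented):

    # Illustrative only: a hand-tuned score where every component is inspectable.
    def score(doc):
        components = {
            "term_match": 2.0 * doc["term_match"],
            "link_quality": 1.5 * doc["link_quality"],
            "freshness": 0.5 * doc["freshness"],
        }
        # When a query misbehaves, you can print this breakdown and see exactly
        # which component dragged the result up or down. A learned model that
        # emits a single number gives you no such handle.
        return sum(components.values()), components

    total, breakdown = score({"term_match": 0.9, "link_quality": 0.4, "freshness": 0.1})
    print(total, breakdown)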
I work on Assistant now, having recently rejoined Google, and it uses AI for the usual suspects: speech and NLP.
Google uses a ton of AI because they have a ton of data that's easily categorized and has clear success criteria. Without those things, it's doubtful that AI would've done jack squat for Google.