
Just because they're rich doesn't mean that they can or will fund features like this if they can't justify the business case for it.

Amazon is a business and frugality is/was a core tenet. Just because they can put Alexa in front of LLMs and use GPU hours to power it doesn't mean that is the best reinvestment of their profits.

The idea of using LLMs for Alexa is so painfully obvious that everyone from L3 engineers to the S Team will have considered it, and Amazon is already doing interesting R&D with genAI, so we shouldn't assume corporate inertia or malaise is why they haven't shipped it. The most plausible explanation from the outside is that it isn't commercially viable, especially offered "free" rather than under a subscription model. At least with Apple (and Siri is still painfully lacking), you are paying for it by being locked into the Apple ecosystem, spending thousands on their hardware, and paying eye-watering premiums for things like storage on the iPhone.


