The main issue with AI, and ironically the reason ChatGPT is the best of them, is whom it works for.
AI doesn't work for the user. It couldn't care less whether the user is happy. AI is designed first and foremost to make more money for the company: its metrics are increased engagement, time on site, more sales, and sales with better margins. Consequently, the user often has no choice or control over what the AI recommends. The AI recommends whatever makes the most sense for the company, so user input is unnecessary.
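To make the point concrete, here is a toy sketch in Python. All of the data, scores, and function names are hypothetical, invented purely to illustrate the incentive structure; no real recommender is this simple. The point is that when the ranking objective is the company's revenue, the user's preferences never even enter the computation:

```python
# Toy illustration (hypothetical data and scoring) of whom a recommender works for.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    user_preference: float   # how much this user would actually enjoy it (0-1)
    platform_revenue: float  # expected revenue for the company (0-1)

catalog = [
    Item("Niche documentary", user_preference=0.9, platform_revenue=0.2),
    Item("Clickbait compilation", user_preference=0.3, platform_revenue=0.9),
    Item("Sponsored review", user_preference=0.5, platform_revenue=0.8),
]

def recommend_for_user(items):
    # A recommender that worked for the user would rank by their preference.
    return sorted(items, key=lambda i: i.user_preference, reverse=True)

def recommend_for_company(items):
    # User preference never appears in the objective: user input is unnecessary.
    return sorted(items, key=lambda i: i.platform_revenue, reverse=True)

print(recommend_for_user(catalog)[0].title)     # -> Niche documentary
print(recommend_for_company(catalog)[0].title)  # -> Clickbait compilation
```

The two functions look almost identical; only the objective differs. That is exactly why, from the outside, it is hard to tell which one a platform is running.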
Think of AI not as your assistant, but as a salesman.
One interesting consequence of this situation: YouTube published a video "explaining" to creators why their videos don't get reach in the algorithm, and it said essentially nothing. They throw some data at the AI, and the AI figures it out. Most importantly, they disclosed that one of the key metrics driving the algorithm is "happiness" or "satisfaction", gathered partly through surveys. Although they didn't say this explicitly, that metric is not one they expose to creators, so YouTube can optimize for it but creators cannot. That's because the AI works for YouTube. It doesn't work for creators, just as it doesn't work for users.
People are complex creatures, so any attempt to guess what someone wants at a specific moment, without any input from them, seems flawed at a conceptual level. If YouTube wanted to help users, it would fix its search, or incorporate AI into the search box. That's a place where LLMs could actually work, I think.
When you look at things this way, the reason Netflix and YouTube get rid of old content has nothing to do with users; it follows from a business strategy of theirs that differs from the music industry's.