> Increasing numbers of people who consume content on the Internet will completely sacrifice their ability to think for themselves.
Bless the author's heart.
All the major social media apps have been doing machine learning-driven getNext() for years now, well before LLMs were even a thing. The YouTube algorithm was doing this a decade ago. This isn't on the horizon; we've already drowned in it.
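To make the "getNext()" idea concrete, here's a minimal sketch of what an engagement-driven feed selector looks like in spirit. Everything here is hypothetical (the `Video` fields, the scoring weights); real recommenders blend many trained-model predictions, but the shape is the same: pick whatever the model thinks you'll engage with most.

```python
from dataclasses import dataclass

@dataclass
class Video:
    id: str
    predicted_watch_time: float  # model's engagement estimate, in seconds
    predicted_like_prob: float   # model's probability the user taps "like"

def get_next(candidates: list[Video]) -> Video:
    # Hypothetical scoring: blend engagement signals into one number
    # and return the highest-scoring candidate.
    return max(
        candidates,
        key=lambda v: v.predicted_watch_time + 10.0 * v.predicted_like_prob,
    )

feed = [
    Video("a", predicted_watch_time=4.0, predicted_like_prob=0.9),
    Video("b", predicted_watch_time=20.0, predicted_like_prob=0.1),
]
print(get_next(feed).id)  # "b": raw watch time wins under these weights
```

The point is that nothing in this loop cares what the content *is*, only what the model predicts you'll do with it.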
A few years ago I was on an airplane back from Asia and I saw for the first time somebody using both hands to scroll tiktok.
A woman in front of me had her phone cradled in both hands, with index and thumb from both hands on the screen - one hand scrolling and swiping while the other tapped the like and other interaction buttons. She moved at such a speed that she would seemingly look at two consecutive posts in one second and then like or comment within another second.
It left me really shaken, wondering what the actual interaction experience is like when you're consuming short-form content but only seeing the first second before moving on.
It explains a lot about how thumbnails, screenshots, and the openings of videos have evolved over time to basically punch you right in the face with what they want you to know.
It’s really quite shocking the extent to which we’re at the lowest possible common denominator for attention and interaction.
Even as horrible as the current state of that already is, there is a difference between letting AI pick the next video in line and having the next video be generated by AI.
That's what most people would say - but why do they say this?
As I understand it:
1. Because machine-generated content is not as good. Recent technical improvements are (IMHO) showing obvious and significant gains over last year's SOTA, indicating that the field is still very green. As long as machine-generated content is distinguishable, as long as there are quirks in it that we easily notice, of course it'll be less preferable.
2. Our innate "ours vs. foreign" biases. I suspect that until something happens to our brains, we'll always tend to prefer "human" to "non-human", just as we prefer "our" products (for some arbitrary definition of "our" that varies drastically across societies, cultures, and individuals) to other products, because we love mental binary partitioning.
Not always! There have definitely been AI-generated songs ("BBL Drizzy" being a notorious example) that were stuck in my head for weeks. I think the music industry is at the greatest risk in the near term.
BBL Drizzy seems to me like a case where the cultural zeitgeist mattered more than the actual additions made by AI. The lyrics were human - King Willonius admitted as much - so wasn't it just Udio AI reading them out over a sampled backing track? Then Metro Boomin remixed the far more popular version by sampling bits and pieces, and I think his contributions were 100% transformative. There's no way BBL DRIZZY BPM 150.mp3 could be made by an AI any time soon.