That paper was flawed in many ways, but it had a catchy title, so plenty of 'fluencers and media outlets pounced on it and slopped out content based on the title alone. Chances are it will be relegated to the blooper reel of LLM papers, just like that "training on LLM outputs leads to model collapse" paper was...