>The way around that is for LLM-based tools to run a regular search engine query in the background and feed the results in alongside the prompt.
Hardly better: soon those "search engine results" will be AI slop themselves, including actual published papers (phoned in with AI, and "peer reviewed" with AI by indifferent reviewers).