If you want proper answers, yes. If you want to rely on whatever reddit or tiktok says about the book, then I guess at that point you're fine with hallucinations and others doing the thinking for you anyway. Hence the issues brought up in the article.
I wouldn't trust an LLM with anything more than the most basic questions if it didn't actually have text to cite.