I also wonder whether a hallucination-free LLM is even required for it to be useful. Humans can and will hallucinate (by which I mean make false statements in full confidence, not hallucinations from drugs or mental states), and they're entrusted with all sorts of responsibilities. Humans are also susceptible to illusions and misdirection, just like LLMs.
So in all likelihood there is simply some state of 'good enough' that is satisfactory for most tasks. Pursuing the elimination of hallucinations to the nth degree may be a fool's errand.
Tools are not people, and people should not be treated as tools. Imagine your hammer only hitting the nail 60% of the time! But a worker, unlike a hammer, should be allowed to stop working to negotiate better conditions.