I find it wild to suggest an LLM would be better at scouring data for gems than the person who wrote them. LLMs are better than us at going through large amounts of data, and that's it. They have no idea what is valuable there.
> I find it wild to suggest an LLM would be better at scouring data for gems than the person who wrote them
I mean, an "out there" idea, sure, but wild? There are plenty of cases where people underestimated their own worth and value, and the potential impact of their ideas.
Sometimes it's valuable to have an outsider's perspective on things. Old war veterans might not think twice about the love letters between them and their partners, but taken together as a large collection, historians can build new perspectives from those letters that we weren't able to see before.
> They have no idea what is valuable there.
Of course an LLM wouldn't know what is "valuable". It would require a person to have an idea of what could be valuable and to prompt the LLM to surface content based on that, among other things.
For example, I could imagine that if I set up an LLM with the prompt "Highlight perspectives that you think are conflicting with other stated perspectives" and pointed it at my own second brain, it could reveal something I haven't considered before, granted it's able to freely query the db and so on.
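A minimal sketch of what that setup could look like, assuming the second brain is a SQLite database with a `notes` table. `call_llm` is a hypothetical stand-in for whatever model client you'd actually use; everything else (table name, prompt wording) is illustrative:

```python
import sqlite3

# The prompt from the comment above.
SYSTEM_PROMPT = (
    "Highlight perspectives that you think are conflicting "
    "with other stated perspectives."
)

def fetch_notes(conn: sqlite3.Connection) -> list[str]:
    """Let the pipeline query the notes db freely (here: grab everything)."""
    return [row[0] for row in conn.execute("SELECT body FROM notes")]

def build_prompt(notes: list[str]) -> str:
    """Bundle the instruction and all notes into a single LLM request."""
    numbered = "\n".join(f"[{i}] {n}" for i, n in enumerate(notes, 1))
    return f"{SYSTEM_PROMPT}\n\nNotes:\n{numbered}"

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call; swap in your client."""
    return "Notes [1] and [2] state conflicting views on remote work."

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE notes (body TEXT)")
    conn.executemany(
        "INSERT INTO notes VALUES (?)",
        [("Remote work kills collaboration.",),
         ("My best collaborations happened remotely.",)],
    )
    print(call_llm(build_prompt(fetch_notes(conn))))
```

A fancier version would let the model issue its own SQL queries instead of dumping every note into one prompt, which matters once the collection outgrows the context window.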