Boring answer... but ChatGPT? I asked it what was new in Java 8 and it enumerated a number of features. I could then ask it for more details on each, and continue Java version by version; I think the result would be quite exhaustive.
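For what it's worth, that kind of list is easy to sanity-check against the release notes. Here's a minimal sketch of the headline Java 8 features such an answer usually names (lambdas, method references, the Streams API, Optional); this is my own illustration, not ChatGPT's output:

    import java.util.Arrays;
    import java.util.List;
    import java.util.Optional;

    public class Java8Features {
        public static void main(String[] args) {
            List<String> names = Arrays.asList("Ada", "Grace", "Barbara");

            // Lambdas and the Streams API, both new in Java 8
            names.stream()
                 .filter(n -> n.length() > 3)   // lambda expression
                 .map(String::toUpperCase)      // method reference
                 .forEach(System.out::println);

            // Optional, also introduced in Java 8
            Optional<String> first = names.stream().findFirst();
            first.ifPresent(n -> System.out.println("first: " + n));
        }
    }

Each of these is easy to verify independently once the model has pointed you at the name, which is arguably the safest way to use it.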
The other day I did something similar for cell biology. As a layman I wanted to learn more about DNA, mRNA, tRNA, translation, transcription, mitosis, meiosis, etc. I think I learned it much better by querying ChatGPT than, say, reading Wikipedia. Something about the interaction, being able to ask for more details on whatever was unclear, helped me understand it.
When I've asked ChatGPT to explain something I'm very knowledgeable about, distributed systems, it was flat-out wrong about many fundamentals in a way that would have seemed completely plausible to a novice. It was making up algorithms, complete with fake but believable names, to solve problems that are provably unsolvable.
My wife, a physician, reported similar errors when I got her to ask it medical questions.
Using ChatGPT to learn about something new is flat-out dangerous.
You have to provide more context, so that it has something to bridge in the high-dimensional space.
> The Byzantine Generals Problem is a more difficult problem than the Two Generals Problem because it requires a consensus algorithm that can tolerate the presence of faulty actors. There are many solutions to the Byzantine Generals Problem, including Byzantine Fault Tolerance, which is a common technique used in distributed systems to ensure that the system can continue to operate correctly even in the presence of faulty actors.
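For anyone without the distributed-systems background: the Two Generals Problem is the one that's provably unsolvable; no finite exchange of acknowledgements over a lossy channel can guarantee agreement, because the final message can always be lost. A toy simulation of that last-message uncertainty, my own sketch and not any real protocol:

    import java.util.Random;

    public class TwoGeneralsToy {
        static final Random rng = new Random();

        // An unreliable link: each message is independently lost.
        static boolean delivered() { return rng.nextDouble() < 0.9; }

        public static void main(String[] args) {
            int rounds = 100;       // any finite protocol length
            int confirmed = 0;
            for (int i = 1; i <= rounds; i++) {
                if (!delivered()) break;  // this message was lost
                confirmed = i;            // the other side saw message i
            }
            // Whoever sent the last message never gets an acknowledgement
            // for it, so neither side reaches certainty; induction peels
            // any finite protocol down to zero useful messages.
            System.out.println("messages delivered: " + confirmed);
        }
    }

Byzantine fault tolerance, by contrast, addresses a different failure model (nodes that lie, rather than links that drop messages), which is the kind of distinction worth verifying independently.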
One can't jump over a chasm; you have to literally bridge from the known to the unknown.
You're supporting their point. You can't go to ChatGPT for knowledge you don't already have because it will confidently spout garbage, which is exactly what was being suggested ("what's new in Java X"). If you can build that bridge yourself then you aren't asking it for knowledge, you're asking it to format things for you.
I disagree; I use it all the time for things I don't know about. I then feed those discoveries into semanticscholar, scholar.google.com, Wikipedia, libgen.is, etc.
I then ask it to summarize and ELI15. If ChatGPT is feeding you bogus knowledge then you are already consuming bogus knowledge over your existing channels. If you aren't incorporating feedback into your understanding, you are recording, not learning.
RP @johl@mastodon.xyz: I wish more people understood that "I want the computer to generate a natural language text that sounds like a plausible answer to a question about x" and "I want the computer to answer a question about x" are two very different problems.
There are bullshitters in the real world, but for the most part they just say "I don't know" in a really convoluted way. They won't, for example, completely fabricate entire concepts and speak confidently and in detail about them.
For example, I asked ChatGPT about the meaning of a non-existent verb ("to spoink"). It originally said the word didn't exist, but when I said "are you sure? we talked about it before, it's something to do with trifle", it invented an entire episode of the IT Crowd (Season 3, Episode 3) and talked at length about how the word appeared in it twice. It claimed that one character, Moss, coined it as the sound a metal ball would make if it hit a hard surface, and also that it's what happens when you leave a trifle out uncovered in the open and it goes lumpy. It's creative, but it's nonsense.
Very few humans could pull this off convincingly or would even attempt it. They would say "no, we didn't talk about the verb 'to spoink', that sounds like nonsense".
Haha :D Honestly though, if creative writing and absurd humour were the goal, they absolutely nailed it. I was particularly impressed with the onomatopoeia of the first "spoink" definition.