> If you send it out past 16, it keeps matching the pattern as provided.

"If you modify it, it will give the correct answer"




Ah, you're right, it's pretty dumb then. Swing-and-a-miss, GPT-4.


Well, it's both smart and dumb: smart in the sense that it recognized the pattern in the first place, and dumb in that it made such a silly error (and missed obvious ways to make it shorter).

This is the problem with these systems: "roughly correct, but not quite, and ends up with the wrong answer". In the case of a simple program that's easy to spot and correct for (assuming you already know how to program well – I fear for students), but in softer topics that's a lot harder. When I see people post "GPT-4 summarized the post as [...]" it may be correct, or it may have missed one vital paragraph or piece of nuance that would drastically alter the argument.



