
I think it is. I also think this is what OpenAI did. They’ve carefully crafted the data.

I don’t think they have an ensemble of 8 models. First, it isn’t elegant. Second, I don’t see how it could be compatible with streaming output.
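To illustrate the streaming point: if an "ensemble" means generating full completions from several models and then picking a winner, no token can be emitted until every model has finished, whereas a single model can stream each token as it's produced. This is a purely hypothetical sketch (the toy models, names, and selection rule are all made up, not OpenAI's actual setup):

```python
import random

def fake_model(seed):
    """Stand-in for a language model: yields made-up tokens one at a time."""
    rng = random.Random(seed)
    def generate(prompt, n_tokens=5):
        for _ in range(n_tokens):
            yield f"tok{rng.randint(0, 9)}"
    return generate

models = [fake_model(s) for s in range(8)]  # hypothetical 8-model ensemble

def stream_single(prompt):
    """Single model: each token can be sent to the user immediately."""
    for tok in models[0](prompt):
        yield tok

def ensemble_select(prompt, score=len):
    """Output-level ensemble: all 8 completions must exist before any
    selection happens, so nothing can be streamed along the way."""
    completions = ["".join(m(prompt)) for m in models]
    return max(completions, key=score)

print(list(stream_single("hi")))  # tokens arrive incrementally
print(ensemble_select("hi"))      # one final answer, only after all 8 finish
```

Per-token mixing schemes (e.g. mixture-of-experts routing inside one forward pass) don't have this problem, which is a different thing from an ensemble of 8 separate models.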

I’d guess that GPT-4 is around 200B parameters, trained on a dataset made with love, ranging from Lorem Ipsum to doctorate-level text. Love is all you need ;)


