
Microsoft built a supercomputer in its Azure data centers with 10,000 V100 GPUs exclusively for OpenAI. [0]

Training GPT-3 is estimated to have cost around $5M in compute time.
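The ~$5M figure is consistent with a back-of-envelope calculation. A minimal sketch, assuming (these numbers are not from the comment) 175B parameters, 300B training tokens, the common ~6 FLOPs per parameter per token rule of thumb, ~28 TFLOP/s sustained per V100, and ~$1.50 per V100 GPU-hour of cloud compute:

```python
# Back-of-envelope GPT-3 training cost estimate.
# All constants below are assumptions for illustration, not figures
# from the comment or the Microsoft announcement.
params = 175e9                  # GPT-3 parameter count
tokens = 300e9                  # training tokens
flops = 6 * params * tokens     # ~6 FLOPs per param per token

v100_flops_per_s = 28e12        # assumed sustained V100 throughput
price_per_gpu_hour = 1.50       # assumed cloud price, USD

gpu_hours = flops / v100_flops_per_s / 3600
cost_usd = gpu_hours * price_per_gpu_hour
print(f"~{gpu_hours / 1e6:.1f}M V100 GPU-hours, ~${cost_usd / 1e6:.1f}M")
```

With these assumptions the total lands in the same few-million-dollar range as the public estimates; the result is quite sensitive to the assumed sustained throughput and hourly price.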

OpenAI had received billions in investment prior to launching GPT-3, including $1B from Microsoft in 2019.

[0]: https://blogs.microsoft.com/ai/openai-azure-supercomputer/


