
Inference on this model works fine on Google Colab, which gives you a Tesla K80 GPU with access to 12GB of GPU RAM. You can buy a used K80 for probably about $850, but it's not really ideal for putting in a home computer because of the cooling requirements.
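A back-of-the-envelope way to see why inference fits in 12GB (this is my own sketch, assuming the full 1.5B-parameter GPT-2 model with fp32 weights; the exact activation overhead depends on batch size and sequence length):

```python
# Rough estimate of GPT-2 XL weight memory (assumptions: 1.5B params, fp32)
params = 1_500_000_000        # GPT-2 XL parameter count
bytes_per_param = 4           # fp32

weights_gb = params * bytes_per_param / 1e9
print(round(weights_gb, 1))   # ~6.0 GB of weights, leaving headroom for activations
```

So the weights alone take roughly half of the K80's 12GB, which is why inference works but larger training runs get tight.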

[ deleted reference to 2070 Super ]



A used K80 can be had for $350 [1]. Not bad, actually (it's probably about as fast as a 1080 Ti, and has 24GB of memory).

https://www.ebay.com/itm/NVIDIA-Tesla-K80-GDDR5-24GB-CUDA-PC...


The K80 is 2 GPU chips with 12GB each, so it's not always as good as one newer/larger GPU. Much more affordable, though :)


If I remember correctly, the K80's memory is actually 24GB, not 2x12GB. This is a pretty important distinction in this context (training GPT-2).

Also, you can get at least six K80s for the price of a single Titan RTX (also 24GB). So it would be faster overall (I don't think the Titan RTX is 6x faster than a K80), with 6x the memory, for the same price. It's a very good deal.
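The arithmetic behind that claim, using the $350 used-K80 price from the listing above and an assumed ~$2,100 street price for the Titan RTX (the Titan price is my assumption, not from the thread):

```python
# Price/memory comparison from the thread's numbers (Titan RTX price assumed)
K80_PRICE = 350          # used K80, per the eBay listing above
TITAN_RTX_PRICE = 2100   # assumed street price for a single Titan RTX

n_k80 = TITAN_RTX_PRICE // K80_PRICE   # how many K80 boards for one Titan RTX
total_mem_gb = n_k80 * 24              # each K80 board carries 24GB total

print(n_k80, total_mem_gb)  # 6 boards, 144GB aggregate vs the Titan's 24GB
```

Of course aggregate memory across six boards isn't the same as 24GB on one die; you'd need model or data parallelism to actually use it.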


300W with passive cooling? o-O

How does that work?


You would cool the server.


Since when does the RTX 2070 ship with 14GB of GPU RAM? The max memory for the RTX 2070 Super is 8GB.


Oops, you are correct. I misread the spec sheet.



