
Ha, I see - I've been training ML models with millions of data points and millions of features on CPUs for a decade now. I didn't know I needed a GPU - I guess I've been doing it all wrong all this time!
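(For context, the regime being described is the one where CPUs are standard practice: very sparse, high-dimensional data fed to a linear model. The commenter doesn't say which tools they use, so the following is just a minimal sketch, assuming scikit-learn and sparse CSR inputs:

    import numpy as np
    from scipy.sparse import random as sparse_random
    from sklearn.linear_model import SGDClassifier

    # Synthetic stand-in for the regime above: ~1M samples x 1M
    # features, extremely sparse. Neither the data nor the model
    # here is from the original comment; both are illustrative.
    rng = np.random.default_rng(0)
    X = sparse_random(1_000_000, 1_000_000, density=1e-6,
                      format="csr", random_state=0)
    y = rng.integers(0, 2, size=1_000_000)

    # Linear SGD touches only the nonzero entries of each row, so
    # the cost scales with the nonzero count, not the nominal
    # feature count - which is why CPUs handle this fine.
    clf = SGDClassifier(max_iter=5)
    clf.fit(X, y)

The point is that for sparse linear models the arithmetic per sample is tiny, so GPU throughput buys little.)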


>I've been training ML models with millions of data points and millions of features on CPUs for a decade now

That is why it is taking a decade to train your models. If you upgraded to a GPU, you could train them much faster :)


> didn't know I needed a GPU - I guess I've been doing it all wrong all this time

No, not the whole time, just these last couple of years :)


I would be very surprised if an ML pipeline didn't benefit from some GPU acceleration. What kind of algorithms are you using?
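(For what it's worth, "some GPU acceleration" is often a one-line toggle when the pipeline uses a tensor framework. A minimal sketch, assuming PyTorch and a toy model - neither is mentioned above:

    import torch

    # Fall back to CPU when no GPU is present; the rest of the
    # pipeline is unchanged either way.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = torch.nn.Linear(1_000, 2).to(device)
    x = torch.randn(64, 1_000, device=device)
    out = model(x)

Whether this actually helps depends on the algorithm: dense deep nets gain a lot, sparse linear models and many tree ensembles often don't.)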



