
Also, 'millions of cores' is a ludicrously shitty, zero-clue answer. It's like asking how Eminem makes music and answering 'millions of pills'. Like, yes, that's an input, but you're missing the entire method of creation: the process that converts the crude inputs into the outputs.

For my money (and, for what little it's worth, I work in this field), I think most of the impressive feats of data science attributed to 'machine learning' are really just a function of hardware capacity now being so insanely great that we're able to 'make the map the size of the territory', so to speak. These models are essentially overfitting machines, but that's OK when (a) it's an interpolation problem and (b) your model can just memorise the entire input space (papering over any inaccuracies with regularisation, oversampling, and tweaking parameters until you get the right answers on the validation set, then talking about how 'double descent' is a miracle of mathematics, etc).
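To make the 'map the size of the territory' point concrete, here's a minimal sketch in plain numpy, a random-features regressor with far more parameters than data points. All the names and numbers are my own illustrative choices, not anything the parent describes: the minimum-norm least-squares fit memorises the training set exactly, yet still interpolates tolerably between points, which is the flavour behind the benign-overfitting / 'double descent' story.

    # Illustrative sketch only. With n_feat >> n_train, lstsq on the
    # underdetermined system returns the minimum-norm solution, which
    # fits the training data exactly ("memorises" it) while still
    # interpolating reasonably between the training points.
    import numpy as np

    rng = np.random.default_rng(0)

    def features(x, W, b):
        # random ReLU features: phi(x) = max(0, x*W + b)
        return np.maximum(0.0, x[:, None] * W + b)

    n_train, n_feat = 20, 500            # 500 parameters, 20 data points
    W, b = rng.normal(size=n_feat), rng.normal(size=n_feat)

    x_train = np.sort(rng.uniform(-3, 3, n_train))
    y_train = np.sin(x_train) + 0.1 * rng.normal(size=n_train)

    # Zero training error: the model has memorised the entire training set.
    Phi = features(x_train, W, b)
    coef, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

    x_test = np.linspace(-3, 3, 200)
    y_pred = features(x_test, W, b) @ coef

    print("train MSE:", np.mean((Phi @ coef - y_train) ** 2))    # ~0
    print("test  MSE:", np.mean((y_pred - np.sin(x_test)) ** 2)) # still modest

Whether the test error actually dips again past the interpolation threshold depends on the feature distribution; the point is just that exact memorisation and decent interpolation aren't mutually exclusive once the parameter count dwarfs the data.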

Don't get me wrong, neural nets are obviously not rubbish. They're a very good method for non-convex, non-differentiable optimisation problems, especially interpolation. (And I'm grateful for the hype cycle that's let me buy up cheap TPUs from Google and hack on their instruction set to code up linear-algebra ops, but in service of far more efficient optimisation methods, and in Rust, lol.) It's just a far more nuanced story than "this method we discovered and hyped up for a decade in the 80s suddenly became the key to AGI".


