
> SVM, linear regression, naive Bayes

When I studied ML in 2012, the very first course started with naive Bayes and went on from there. Coming back after a decade away, I see a lot of people around me reaching for neural nets to train models that naive Bayes would handle just fine, and who have never heard of naive Bayes. Is that only my experience?



It's useful to learn on small toy problems for ease of debugging and speed of training, so if you want to learn how to apply a powerful technique, you're pretty much inevitably going to start by applying it to something for which it's absolute overkill. E.g. a common starting task for learning neural nets is XOR, which can be solved with literally a single machine instruction instead of a trained ML model.
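To make the contrast concrete, here's a sketch (not from the original comment) of both sides: XOR as a single bitwise operation, and the classic textbook construction of a two-hidden-unit network with hand-picked weights and step activations that computes the same function. A trained net would learn weights along these lines.

```python
# XOR solved directly, with a single operation:
def xor_direct(a: int, b: int) -> int:
    return a ^ b

# The same function as a tiny "neural net": two hidden units with
# step activations and hand-picked weights (the classic textbook
# construction; thresholds like 0.5 and 1.5 are a common choice,
# not anything from the original comment).
def step(x: float) -> int:
    return 1 if x > 0 else 0

def xor_net(a: int, b: int) -> int:
    h1 = step(a + b - 0.5)     # fires when a OR b
    h2 = step(a + b - 1.5)     # fires when a AND b
    return step(h1 - h2 - 0.5) # OR but not AND -> XOR

for a in (0, 1):
    for b in (0, 1):
        assert xor_net(a, b) == xor_direct(a, b)
```

The point stands either way: the network works, but the problem never needed one.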

But also there are many tasks for which naive Bayes works, but a NN solution can be much more accurate if you're okay with it also being much more compute-intensive. E.g. things like sentiment analysis or simple spam filters are often used as a demonstration of naive Bayes, but you can do much better with more powerful models.
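For reference, the kind of spam-filter demo being described fits in a few dozen lines of plain Python. This is a minimal sketch with a made-up toy corpus, using word counts and add-one (Laplace) smoothing; it is an illustration of the technique, not anyone's production filter.

```python
import math
from collections import Counter

# Toy training data (made-up examples for illustration only).
train = [
    ("win cash prize now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting agenda for monday", "ham"),
    ("lunch on monday?", "ham"),
]

# Per-class word frequencies and class counts.
word_counts = {"spam": Counter(), "ham": Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def classify(text: str) -> str:
    # Score each class by log P(class) + sum of log P(word | class),
    # with add-one smoothing so unseen words don't zero out a class.
    best_label, best_score = "", float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / len(train))
        total = sum(word_counts[label].values())
        for word in text.split():
            count = word_counts[label][word] + 1
            score += math.log(count / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("claim your free cash prize"))  # -> spam
print(classify("agenda for lunch meeting"))    # -> ham
```

On real data this plateaus well below what a modern NN reaches, which is the accuracy-vs-compute trade-off described above.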


As with any super-hyped technology, people want to use the latest and greatest to solve everything, even when traditional methods are cheaper, easier, more reliable and more accurate. See also: crypto.


Any data that is small enough to quickly iterate on for learning is small enough to use a simpler approach. The point is learning the techniques.



