Hacker News

In machine learning, logistic regression is usually assumed to be regularized. It's not often you find a practical problem that doesn't need regularization.
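A minimal sketch of what regularized logistic regression means in practice: plain gradient descent on the logistic loss plus a squared-L2 penalty. The data, penalty strength, and step size here are all illustrative assumptions, not anything from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                          # toy features (assumed)
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)  # toy labels in {0, 1}

lam = 0.1          # L2 regularization strength (assumed)
w = np.zeros(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def objective(w):
    z = X @ w
    # mean logistic loss for {0,1} labels, plus (lam/2)||w||^2
    return np.mean(np.log1p(np.exp(z)) - y * z) + 0.5 * lam * (w @ w)

before = objective(w)
for _ in range(500):
    # gradient of the penalized loss: data term + lam * w
    grad = X.T @ (sigmoid(X @ w) - y) / len(y) + lam * w
    w -= 0.5 * grad
after = objective(w)
print(before, after)
```

Without the `lam * w` term this is unregularized maximum likelihood; the penalty is what keeps the weights bounded on separable data.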



Indeed. You'll notice I said as much. For the analogy between SVMs and LR to work you need squared-L2 regularization specifically; other regularizers, for example L1, won't work for the analogy. Squared L2 is the key for the connection with Hilbert spaces and kernels to fall out automagically.
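One way to see why squared L2 is special: by the representer theorem, the minimizer of loss + (λ/2)||w||² lies in the span of the training points, w = Σᵢ αᵢ φ(xᵢ), so the whole objective can be rewritten in terms of inner products and any kernel can be dropped in. A sketch of kernel logistic regression under that substitution, with an RBF kernel and toy data that are assumptions of mine, not from the thread:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))                       # toy data (assumed)
y = (np.sum(X**2, axis=1) > 1.0).astype(float)      # nonlinearly separable labels

def rbf(A, B, gamma=1.0):
    # the algorithm only ever touches the data through kernel values
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

K = rbf(X, X)
lam = 0.01                                          # L2 strength (assumed)
alpha = np.zeros(len(y))                            # w = sum_i alpha_i phi(x_i)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def objective(alpha):
    z = K @ alpha
    # logistic loss + (lam/2)||w||^2, where ||w||^2 = alpha^T K alpha
    return np.mean(np.log1p(np.exp(z)) - y * z) + 0.5 * lam * (alpha @ K @ alpha)

start = objective(alpha)
for _ in range(300):
    z = K @ alpha
    grad = K @ (sigmoid(z) - y) / len(y) + lam * (K @ alpha)
    alpha -= 0.5 * grad
end = objective(alpha)
print(start, end)
```

With an L1 penalty on w the solution need not lie in the span of the data, so this rewrite in terms of K alone is unavailable, which is the point being made above.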





