In machine learning, logistic regression is usually assumed to be regularized. It's rare to find a practical problem that doesn't need regularization.
Indeed, and you'll notice I said as much. For the analogy between SVMs and LR to work you need squared-L2 regularization specifically; other regularizers, for example L1, won't work for the analogy. Squared L2 is the key for the connection with Hilbert spaces and kernels to fall out automagically.
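To make the analogy concrete: both the linear SVM and L2-regularized logistic regression minimize an objective of the form "average loss + (λ/2)‖w‖²"; only the loss differs (hinge vs. log loss). Below is a minimal pure-Python sketch on hypothetical toy 1-D data (the dataset, λ value, and the crude numerical gradient descent are all my illustrative choices, not anything from the discussion above):

```python
import math

# Toy 1-D data with labels in {-1, +1}, as in the SVM formulation.
X = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
y = [-1, -1, -1, 1, 1, 1]
lam = 0.1  # strength of the squared-L2 penalty, (lam/2) * w^2

def logistic_obj(w):
    # L2-squared-regularized logistic regression: log loss + penalty.
    loss = sum(math.log(1 + math.exp(-yi * w * xi))
               for xi, yi in zip(X, y)) / len(X)
    return loss + 0.5 * lam * w * w

def hinge_obj(w):
    # Linear SVM: hinge loss + the SAME squared-L2 penalty.
    loss = sum(max(0.0, 1 - yi * w * xi)
               for xi, yi in zip(X, y)) / len(X)
    return loss + 0.5 * lam * w * w

def minimize(obj, w=0.0, lr=0.1, steps=2000, eps=1e-6):
    # Crude gradient descent with a numerical gradient;
    # good enough for a 1-D illustration.
    for _ in range(steps):
        grad = (obj(w + eps) - obj(w - eps)) / (2 * eps)
        w -= lr * grad
    return w

w_lr = minimize(logistic_obj)
w_svm = minimize(hinge_obj)
print(w_lr, w_svm)  # both positive: each recovers the separating direction
```

Because the penalty is the squared Hilbert-space norm of w, the representer theorem applies to both objectives, which is exactly why the kernel trick carries over from the SVM to logistic regression; with an L1 penalty the minimizer need not lie in the span of the training points, and that connection breaks.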