
It's my understanding that dropout[1] is also an important aspect of training modern neural nets.

When using dropout, you temporarily remove a random subset of nodes ("neurons") from the network at each training step, typically dropping each node independently with some fixed probability.

By constantly changing which nodes are dropped during training, you effectively force the learned representation to be delocalized, so it seems somewhat unsurprising that the resulting network is resilient to local perturbations.
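To make the mechanism concrete, here is a minimal sketch of "inverted" dropout in NumPy (the scaling factor and function name are illustrative, not from the comment above): each unit is zeroed independently with probability p during training, and survivors are scaled by 1/(1-p) so expected activations match what the network sees at inference time.

```python
import numpy as np

def dropout(activations, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p while training,
    scaling the survivors by 1/(1-p) so the expected value is unchanged."""
    if not training or p == 0.0:
        return activations  # identity at inference time
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(activations.shape) >= p  # keep with probability 1-p
    return activations * mask / (1.0 - p)
```

Because a different mask is drawn on every call, no single unit can be relied upon across training steps, which is the delocalization effect described above.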

[1]: https://towardsdatascience.com/dropout-in-neural-networks-47...


