
Early in my career I saw a large legacy project, riddled with bugs, get turned around after a senior developer insisted on having unit tests. No one else believed in the value of unit testing, so he added them himself, in his free time. Occasionally another developer would push up code that broke the senior developer's tests, and he gradually gained the upper hand because he now had proof that his tests were finding real problems.

Everyone started writing unit tests, and the code broke less. Developers became more confident in deploying, and eventually most PRs looked roughly the same: a 10-20 line diff up top, unit tests below. If there were no tests, the reviewer asked for them. It became a fun and safe project to work on, rather than something we all feared might break at any moment.

I've since started insisting on them as well, especially when I'm working in dynamically typed languages. A lot of the tests I write in Python, for example, cover things that a language like Go already guarantees just by having the type system.
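Concretely, here's the kind of test I mean. A rough sketch (the function and names are made up for illustration): in Go, the signature alone would guarantee that parse_port takes a string and returns an int, while in Python I end up writing a test to pin down the same contract.

    import pytest

    def parse_port(raw):
        # Parse a port number from a string like "8080".
        port = int(raw)  # raises ValueError on garbage input
        if not 0 < port < 65536:
            raise ValueError(f"port out of range: {port}")
        return port

    def test_parse_port_returns_int():
        # In Go this is enforced at compile time; here it needs a test.
        assert parse_port("8080") == 8080
        assert isinstance(parse_port("8080"), int)

    def test_parse_port_rejects_garbage():
        with pytest.raises(ValueError):
            parse_port("not-a-port")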

I programmed the first 10 years of my life in compiled, statically typed languages (C, C++, Java, etc.). Then I needed to start programming in Ruby for production environments, and initially I felt "naked": I felt so insecure building something without a successful compile to lean on. That's when I really got into unit tests. Bugs as stupid as typing "vlue" instead of "value" can plague your codebase in languages like JavaScript, Python, and Ruby, and testing is the only way to find them (other than... errors in production).
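To make it concrete, a toy sketch (names made up) of exactly that class of bug and the test that surfaces it:

    def total_price(order):
        # Typo: the key is "value", not "vlue". Python happily accepts this
        # until the line actually executes, then raises KeyError at runtime.
        return order["vlue"] * order["qty"]

    def test_total_price():
        # This test fails with KeyError: 'vlue', exposing the typo right away
        # instead of in production.
        assert total_price({"value": 5, "qty": 3}) == 15

A Go compiler rejects the equivalent misspelled struct field before the program ever runs; in Python, Ruby, or JavaScript, a test like this is the closest thing to that check.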
