So in your opinion, has continuous deployment actually made it harder to do the kind of proactive refactoring you describe? Or do you just mean that in an ideal world you would have done both from day one?
Not looking to stir a flamewar - I just found Timothy's posts on continuous deployment quite inspirational, so I'm very interested to hear more viewpoints on how well it works.
I'll chime in too. I've worked with Chad at IMVU for almost 4 years now, and here's my opinion, for what it's worth.
I actually think that continuous deployment makes it significantly easier to do proactive refactoring. One of our favorite things to do at IMVU is to refactor lots of code and see what breaks in the thousands of tests we run before deploying.
It's true that this sometimes exposes deficiencies in our test coverage, but most of the time our massive build cluster does a great job of sussing out the unknown dependencies that a refactoring might have broken. I, for one, would be more afraid to go back and fix an ugly system without the safety net of the continuous deployment process, because it means I don't have to hold the whole system in my head before starting. I would never make huge changes if I had to live with that fear.
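The safety net described here boils down to a simple gate: run the whole suite, and only promote the build if everything passes. Here's a toy sketch of that idea in Python (the `run_suite`/`gate_deploy` names and the commands are made up for illustration, not IMVU's actual pipeline):

```python
import subprocess
import sys

def run_suite(test_command):
    """Run the test suite; report green only if every test passed."""
    result = subprocess.run(test_command, capture_output=True, text=True)
    return result.returncode == 0

def gate_deploy(test_command, deploy_command):
    """Deploy only when the full suite is green; otherwise abort."""
    if not run_suite(test_command):
        print("tests failed; refusing to deploy", file=sys.stderr)
        return False
    subprocess.run(deploy_command, check=True)
    return True
```

The point of the gate is exactly the one made above: a refactoring that silently breaks an unknown dependency gets caught by the suite before it ever reaches production.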
I think Chad is right, though -- it's certainly better if something has a coherent technical design up front, because when you go to read the code and understand the system, it just makes sense. You don't end up with as many landmines that require tons of refactoring before you can iterate on them.
I'm Chad, the author of the post in question.
IMVU had been doing continuous deployment since I joined in 2005. We even forced daily mandatory upgrades to our client! Luckily our early adopters forgave us back then... We followed a simple algorithm: ship it, and if it sticks, work on it more. If anything regresses, write a test.
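The "if anything regresses, write a test" loop amounts to pinning each fixed bug with a test so it can't silently come back. A minimal sketch in Python (the function and the bug are hypothetical, purely for illustration):

```python
# Hypothetical example: a discount function that once regressed by
# applying the discount twice. The test below pins the fixed behavior.

def discounted_price(price_cents, discount_pct):
    """Return the price after applying a percentage discount exactly once."""
    return price_cents - (price_cents * discount_pct) // 100

def test_discount_applied_exactly_once():
    # Regression test written after the double-discount bug shipped:
    # 1000 cents at 10% off must be 900, not 810.
    assert discounted_price(1000, 10) == 900
```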
This left us with a "bag of parts" product and a lack of technical vision. We've been paying for that ever since. On the other hand, we've managed to double the business year on year while slowly improving the code, so it's hard to complain too much. ;)
While I am still a huge proponent of test-driven development and automated tests in general, I now see that a coherent technical design is more valuable. Instead of investing heavily in end-to-end tests, we should have mercilessly refactored the code to reflect the product, at the cost of short-term regressions.
I hope that answers your question.

Chad