One cheap (in time and money) complement to A/B or multivariate testing that the author doesn't mention is usability testing, specifically remote testing. Before we launch a test, we always run remote usability tests.
Feedback like this should be taken with a grain of salt, since the participants are paid testers and may not resemble your users in all respects. But it's still really valuable: I've caught numerous problems that test data alone would not have helped me understand.
Combine remote usability testing through something like usertesting.com with prototyping, and you've got a really rapid way to get feedback on the cheap, even if you don't have enough site visitors to reach statistical significance in a reasonable time frame.