When you have the idea for a new computer application – whether written in-house, outsourced or bought off the shelf – perhaps the first thing you ought to do is define what would constitute “success”, and how you’re going to measure that. How will you know, in the long term, whether your money was well spent? If you don’t know that, there’s a good chance it wasn’t. Worse still, you’re likely to make the same mistakes again in your next project.

What you definitely won’t want to do is employ the traditional measures of success – being “on time” and “on budget”. These merely tell you that you spent X hours and Y pounds on writing, testing and installing the new system, costs you must always expect to incur and which are all on the minus side of the equation. The fact you didn’t spend more than expected is good news, but it doesn’t measure the success of the whole system, only that of its writing, testing and implementation.
What about the positive side? What about the return on investment (ROI)? Has the new system improved your productivity, lowered your costs, made your users’ working lives better? These are also things you have to measure before you can say whether a new system is a success.
A new invoicing system, for instance, might make it easier to send statements by email to all your customers, saving the time and cost of printing and postage. It might also reduce the time taken to introduce a new invoice format. These criteria, therefore, should be written into the original proposal as measures of success:
1. Send statements by email to 90% of customers, saving £1,000 on paper and postage and half a day’s time each month.
2. Reduce time to implement new invoice format from two days to four hours.
Once your new system is running, you can measure its performance against these criteria to see how you’re doing. Depending on your chosen measures, you might be able to assess success from day to day or week to week, or it might take several months to get a clear picture.
For example, statements are usually sent once a month, so the first measure for the new invoice system can only be assessed monthly.
After the first month of running the new system, you might have collected email addresses for 60% of your accounts – quite a big step towards your goal. You could then write to all the customers whose email addresses you don’t have and ask for one. Follow up that letter with a phone call two weeks later and you should be able to achieve the 90% target quite soon.
If you’re still not there by the end of the second month, you’ll see nonetheless how you’re progressing towards the target and can take further action to encourage more customers to supply email addresses.
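If your customer records live in something as simple as a monthly spreadsheet or CSV export, tracking this first criterion can even be automated. The short Python sketch below is only an illustration of the idea: the file name "customers.csv" and the "email" column are assumptions for the example, not part of any particular invoicing package.

```python
import csv

TARGET = 0.90  # success criterion: email statements to 90% of customers


def email_coverage(csv_path: str) -> float:
    """Return the fraction of customer records that have an email address.

    Assumes a CSV export with a column named 'email'; both the file name
    and the column name are illustrative, not taken from any real system.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        return 0.0
    with_email = sum(1 for row in rows if (row.get("email") or "").strip())
    return with_email / len(rows)


if __name__ == "__main__":
    coverage = email_coverage("customers.csv")  # hypothetical monthly export
    print(f"Email coverage: {coverage:.0%} (target {TARGET:.0%})")
    print("Target met" if coverage >= TARGET else "Target not yet met")
```

Run against each month's export, a check like this gives you the month-by-month figure described above: 60% after the first month, and a clear view of how far you still have to go.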
Had you not defined how you were going to measure success at the outset, none of this would have been possible and you wouldn’t have fully achieved the potential cost savings of your new system. Not defining how you will measure success can often lead to failure.
Sometimes it’s difficult to measure success. Reducing the time to implement a new invoice format from two days to four hours looks like a nice concrete measure, but it’s something that happens only infrequently.
The best thing you can do is perform an artificial test as soon as possible. Your company may not need a new invoice format right now, but you could design one anyway and go all the way through the process without taking it live. The point is to test the success of this subsystem before you wind down the development team and put the code away. If you really do need a new invoice format in two years’ time and it takes two days rather than the four hours you targeted, that’s a bit late to be finding out. Testing for success early gives you a chance to fix such shortcomings; failing to test leaves you with no chance.
A mnemonic for measuring success that’s been used for decades is WILU (pronounced “will you”), which stands for Working, Installed, Liked and Used. It’s just as relevant now, with our self-service app stores on phones, tablets and PCs, as it was back in the days of mainframes and mini-computers.