I recently came across an article and corresponding infographic on software sizing. The basic gist of both was that properly estimating software size is important. Why? Well, obviously, if you estimate incorrectly you have some problems:

  • Estimate too high, and you allocate too many resources and spend money you otherwise wouldn’t. I don’t think many organizations worry much about this situation.
  • Estimate too low, which is usually the case, and you end up missing deadlines. But let’s think about what this means at a deeper level:
    • Cost overruns in the form of overtime and/or hiring people or contractors to help you finish what you started, by the date you wanted to finish.
    • Defect overruns because of all of the above! When you add people to take up the slack and try to catch a project up, what happens? Mistakes. When people work overtime and get tired, what happens? Mistakes. And these things tend to compound over time.

So, why do we usually underestimate? Because the relationship between size and effort is not linear. That’s not to say it’s an exponential function, but the effort required certainly accelerates as a project grows larger. It’s simple physics; aerodynamic drag grows with the square of your speed, which is why they tell you to drive 55 mph to save gas.
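To make that acceleration concrete, here’s a minimal sketch using Boehm’s Basic COCOMO effort model, one well-known formalization of this superlinear relationship (effort = a × KLOC^b, with b > 1). The coefficients are the published “organic” mode values; the project sizes are made up purely for illustration.

```python
# Basic COCOMO (organic mode): effort grows superlinearly with size.
# Effort in person-months = a * KLOC ** b, with a = 2.4, b = 1.05.
# The sizes below are arbitrary examples, not real project data.

A, B = 2.4, 1.05

def effort_pm(kloc: float) -> float:
    """Estimated effort in person-months for a project of `kloc` KLOC."""
    return A * kloc ** B

for kloc in (10, 20, 40, 80):
    pm = effort_pm(kloc)
    # Effort per KLOC rises as the project gets bigger -- that's the
    # nonlinearity that makes big projects easy to undersize.
    print(f"{kloc:3d} KLOC -> {pm:6.1f} person-months "
          f"({pm / kloc:.2f} PM per KLOC)")
```

Doubling the size of the project more than doubles the effort, so an estimator who scales up linearly from a past project will come in low every time.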

You could counter that agile takes care of all this. Well, it does in the sense that you should be able to deliver a working product to the customer on the prescribed or promised date. Say you undersized your estimate by 30%: you’d deliver 70% of the features you promised, on time, and the product would still work. The only problem is that you end up with an unhappy customer, because they were expecting 100% of the features, on time. Or, since the function is not linear, you could deliver all 100%, but you’d most likely be more than 30% late unless you added significantly more resources.
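Here’s a back-of-the-envelope sketch of that trade-off, reusing the same COCOMO-style model from above. The 30% figure comes from the example; the 50 KLOC project size is hypothetical.

```python
# You sized the project at 70% of its true size (a 30% underestimate)
# and staffed/scheduled for that. How late are you if you insist on
# shipping 100% of the features with the same team?
# Uses Basic COCOMO organic-mode coefficients; the project size is
# an illustrative assumption.

A, B = 2.4, 1.05
true_kloc = 50.0                # hypothetical true project size
est_kloc = 0.7 * true_kloc      # the 30%-undersized estimate

planned = A * est_kloc ** B     # person-months you budgeted for
actual = A * true_kloc ** B     # person-months the full scope needs

slip = actual / planned - 1
print(f"planned: {planned:.1f} PM, actual: {actual:.1f} PM")
print(f"schedule slip with the same team: {slip:.1%}")
```

Even under a linear model, finishing 100% of the work on a 70% budget makes you about 43% late; with the superlinear exponent it comes out closer to 45%. Either way, noticeably more than the 30% you were off by.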

The only problem that agile solves when you inaccurately estimate software size is that you can deliver some form of ‘working’ software that is ‘done’. Hopefully.