Efficacy of the Progressive Agile Estimating Scale

Posted on May 17, 2012 by Dean Leffingwell in Metrics, Team Practices

Ever wonder why using the Fibonacci (or Cohn's modified) sequence as a relative estimating scale seems to work so well for agile teams? (OK, me neither, really.) But if you are interested in the mathematical roots of why "a little estimating (on a near-log scale) helps a lot, and a lot of estimating helps only a little," my friend, colleague, mathematician, agile sensei, and zany Ukrainian Alex Yakyma felt compelled to make the mathematical argument here:


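To make the "near-log scale" point concrete, here is a small sketch in Python (the scale values are Cohn's modified Fibonacci sequence; the `nearest_bucket` helper is mine, added purely for illustration). The ratio between adjacent points stays roughly constant, which is what makes the scale approximately logarithmic: the gap between buckets grows in proportion to the size of the estimate, matching the way estimation uncertainty grows with item size.

```python
# Cohn's modified Fibonacci scale commonly used for relative estimation.
SCALE = [1, 2, 3, 5, 8, 13, 20, 40, 100]

def nearest_bucket(effort):
    """Snap a raw effort guess to the nearest point on the scale.

    Illustrative helper: teams don't estimate '6' or '30' -- they pick
    the closest available bucket, which caps false precision.
    """
    return min(SCALE, key=lambda p: abs(p - effort))

# Adjacent-point ratios hover around 1.5-2.5 rather than a fixed step,
# so each bucket is a roughly constant *percentage* bigger than the last.
ratios = [b / a for a, b in zip(SCALE, SCALE[1:])]
```

For example, a raw guess of 6 snaps to 5, and a guess of 30 snaps to 20 or 40 rather than pretending to a precision the team doesn't have.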
He also notes how these numbers tend to normalize across teams over time, and while that can create interesting conversations, we use this to good effect in the larger-scale estimating we require in the agile enterprise. When we do this, and team backlog estimates contain only the time remaining, then applying Weighted Shortest Job First (WSJF) lean estimating automatically ignores sunk costs, a key tenet of lean thinking. (For more on WSJF lean economic prioritization, refer to Agile Software Requirements.)
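The sunk-cost point can be sketched in a few lines of Python. This is a minimal illustration, not the book's worked method: the backlog items and their cost-of-delay numbers are invented for the example, and WSJF is taken in its basic form as cost of delay divided by job duration. Because the duration fed in is the estimate of time *remaining*, any effort already spent simply never appears in the ranking.

```python
# Weighted Shortest Job First: priority = cost of delay / job duration.
# Since team backlog estimates carry only the time remaining, effort
# already invested (sunk cost) never enters the calculation.

def wsjf(cost_of_delay, remaining_duration):
    """Basic WSJF score; higher means do sooner."""
    return cost_of_delay / remaining_duration

# Hypothetical backlog: (name, cost of delay, remaining estimate in points).
# Note that points already burned on any item are deliberately absent.
backlog = [
    ("Feature A", 10, 8),
    ("Feature B", 8, 2),
    ("Feature C", 5, 5),
]

# Highest WSJF first: short, high-value jobs float to the top.
ranked = sorted(backlog, key=lambda job: wsjf(job[1], job[2]), reverse=True)
```

Here Feature B wins (8/2 = 4.0) despite having the smaller cost of delay, because so little of it remains; an item that has already consumed many points gets no credit for that investment.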

If you find any math bugs in the analysis in the post, please contact Alex, not me!