Why are we so bad at estimating our time?
October 11, 2013 11:57 AM

I am looking for informed opinions and articles on why we are so bad at estimating our time on projects, specifically from an IT project management-perspective.

I feel like it's a big joke, like 90% of all projects ever completed have gone over time and budget. And when they don't, they're often rushed, poorly done, or drive the people working on them mad.

I am a data scientist working in an IT-focused division of a major financial company. My division is particularly bad at estimating timelines for our projects. I've been tasked to gather data and try to figure out the main factors involved, such as use of new technologies, number of people involved across different divisions, or maybe even the ratio of PMs to developers on a given project.

As a first step, I'd like to figure out: What is the current thinking on this question among IT-management folks? I have a good list of data I want to gather, but want to make sure I'm covering all the bases.

Why are we bad at estimating time in general? Why are we bad at estimating time as a group? From a project management perspective? From an IT perspective? I've been trying to search for weighty, possibly academic, articles about this question, but all I am coming across are fluffy self-help type articles. In general, this seems like a behavioral economics question, but I am interested in it from an IT-department perspective.

Thanks in advance for your help!
posted by Tooty McTootsalot to Technology (16 answers total) 38 users marked this as a favorite
 
Best answer: I guess "The Mythical Man Month" is one of the original works on this - and it is still relevant on some of the political elements. You could also have a quick scan down this list of cognitive biases - the majority of which would seem to be potentially relevant to your question - Optimism Bias to give just one example.
posted by rongorongo at 12:05 PM on October 11, 2013 [6 favorites]


Well, count me in as interested in this one. We handle infrastructure management for IT, and whenever we spec projects on a fixed fee, the overshoot is normally 30%.

The problem we have is that no matter how much prep / advance work we do to outline the project, something unexpected *always* bites us in the arse: a tool doesn't work the way it should, we discover something new that we've never run into, there's a piece of software/hardware that doesn't work the way we expected, etc.

I'd love to hear how everyone deals with this because it's a big problem for us.
posted by tgrundke at 12:07 PM on October 11, 2013


There was a thread on the blue about this last year. Thinking, Fast and Slow has a whole section on the psychology of time estimation and the correct way to estimate the time for projects. (To save you the effort of reading the book, the correct way to estimate is to look at the time similar projects have taken in the past and use that as your base estimate, and then modify the base estimate using your situationally-specific factors like a larger team, less skilled in the technology, etc).
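The "outside view" Kahneman describes can be reduced to a few lines of arithmetic: anchor on what similar past projects actually took, then apply your situational adjustments. A rough sketch in Python, where the historical durations and adjustment factors are made up purely for illustration:

```python
# Reference-class forecasting, roughly as described in Thinking, Fast and Slow:
# start from the distribution of actual durations of similar past projects,
# then modify that base with situation-specific factors.

# Hypothetical actual durations (weeks) of comparable past projects.
past_durations = [14, 22, 18, 30, 16, 25]

# Base estimate: the average of what similar projects really took.
base_estimate = sum(past_durations) / len(past_durations)

# Situation-specific multipliers (assumed values for illustration):
adjustments = {
    "larger team than usual": 0.9,
    "team new to the technology": 1.3,
}

estimate = base_estimate
for factor in adjustments.values():
    estimate *= factor

print(f"base (outside view): {base_estimate:.1f} weeks")
print(f"adjusted estimate:   {estimate:.1f} weeks")
```

The key point is the ordering: the historical base rate comes first, and the inside-view adjustments only perturb it, rather than the estimate being built bottom-up from optimism.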
posted by inkyz at 12:09 PM on October 11, 2013 [2 favorites]


Best answer: I always segmented estimates into three major categories: point, range, and limit.

Point estimates were precise, required huge amounts of effort, and were almost guaranteed to be wrong until after the fact. If contracting at a fixed price, however, they are worth the investment.

Range estimates attempted to put boundaries around the expected outcome, based on imperfect info.

Limit estimates are not-to-exceed and subject to a lot less effort and accuracy. It's easy to pick a big number.

A special category, Rough Order of Magnitude (ROM), was what I called a 'number of zeros' estimate (i.e., how many zeros in this project... 5, 6, 7, or 8?). Surprisingly, it was good enough and could be done in minutes for a lot of stuff, allowing approaches to be ruled out or in quickly.

All of these applied to cost and schedule. They formed the basis of an internal program I wrote for an aerospace company to produce supportable estimates for vaguely defined, technically advanced work, based on decomposing many years of managers' experience. Complexity levels were used to select similar historic programs from a menu, and these carried man-hours for various labor classes and tasks. It worked pretty well for winning contracts, because it was SUPPORTABLE and really fast compared to the post-it note approach used before. Some parts were transaction-based... they involved things like the number of circuit cards, cable connections, racks of equipment, etc., and you could show your basis even if you had insufficient info. These were called 'allowances', not estimates. Big difference.

It was also inaccurate. And we knew it. It didn't have to be accurate. The project was going to drift in definition and detail from our assumptions anyway, and whatever number we came up with was subject to willy-nilly wishful thinking and spontaneous editing by the next 10 levels of management and marketing. We expected to be able to claim 'change of scope' and do what was called 'get well' down the road a piece.

The latter of these points is relevant to your question. Your estimates are often poor because the estimate omits things you didn't think of, tasks that were concealed from you, impediments you can't foresee, the magic term "create something" (which resists quantifying anyway), schedule compression from your predecessor activities, perhaps from another group, and unforeseens like acquisition problems.

The biggest, in my experience, has been scope creep (or project redefinition). It spawned an entire department of contract administrators in my three aerospace engineering jobs. Also, it was never, ever, ever a career-limiting issue. In that business, you were going to get the job done, and the customer was going to pay the bills, even if he screamed. Since his name was USAF, USN, USMC, NRO, or spook agencies, he came from a family with money. If he wanted photon torpedoes, there were really only one or two places to buy them.

To avoid estimate problems, I only give point estimates (fixed price) on work I can see. I give budgetary estimates on work that may come after that, revising them to fixed price as the work clarifies.

There is a huge class of mental activities, like problem solving and creativity, that are almost impossible to quantify. If your work contains them, and most jobs do, you have a problem estimating. If your work is rote, then estimating systems like MTM (Methods-Time Measurement) are useful, because they are based on standard times for things like reach, place, position, focus, and move, with classes for each category. (I tried to take the best pieces of these for my software estimating system in the rocket business, and it kind of worked, but then, the contracts were 10 years between award and launch, and who knows? I was long gone. Our win percentage went way up, though, based on the appearance of support for our guesses.)

Yogi Berra said "Prediction is hard, especially if it's about the future", I think?
posted by FauxScot at 12:32 PM on October 11, 2013 [9 favorites]


The most specific relevant cognitive bias is called planning fallacy. There's research on it linked there. As an aside, I recall Stumbling on Happiness discussing phenomena related to envisioning the future, e.g. a lobotomy technique that impacts your ability to imagine future events, and concluding that we don't have a lot of capacity for the task and we may actually be happier when we have none.
posted by Monsieur Caution at 12:38 PM on October 11, 2013 [1 favorite]


Best answer: Here are references to some articles which our research group used to make some points about project-time management of large, multisite clinical trials. As you can see, some of them appear to be about software, some are R&D related, and others are more from psychological research, but I think they all could be applicable. (Note: I have not read any of these, but thought this list might be useful to you.)


Norris KP. The accuracy of project cost and duration estimates in industrial R&D. R&D Management. 1971;2(1):25-36.

Roy MM, Christenfeld NJ, McKenzie CR. Underestimating the duration of future events: memory incorrectly used or memory bias? Psychol Bull. Sep 2005;131(5):738-756.

Jorgensen M, Grimstad S. Software Development Estimation Biases: The Role of Interdependence. IEEE Transactions on Software Engineering. 2012;38(3):677-693.

Kutsch E, Maylor H, Weyer B, Lupson J. Performers, trackers, lemmings and the lost: Sustained false optimism in forecasting project outcomes — Evidence from a quasi-experiment. International Journal of Project Management. 2011;29(8):1070-1081.

Sanna LJ, Schwarz N. Integrating temporal biases: the interplay of focal thoughts and accessibility experiences. Psychol Sci. Jul 2004;15(7):474-481.

Jorgensen M. Identification of more risks can lead to increased over-optimism of and over-confidence in software development effort estimates. Information and Software Technology. May 2010;52(5):506-516.

Sanna LJ, Parks CD, Chang EC, Carter SE. The hourglass is half full or half empty: Temporal framing and the group planning fallacy. Group Dynamics-Theory Research and Practice. Sep 2005;9(3):173-188.

Buehler R, Griffin D, Ross M. Exploring the planning fallacy: Why people underestimate their task completion times. Journal of Personality and Social Psychology. Sep 1994;67(3):366-381.

Halkjelsvik T, Jorgensen M. From Origami to Software Development: A Review of Studies on Judgment-Based Predictions of Performance Time. Psychological Bulletin. Mar 2012;138(2):238-271.
posted by mean square error at 12:40 PM on October 11, 2013 [4 favorites]


Nthing the recommendation for The Mythical Man Month.

This has been a research topic for folks in software engineering, especially those examining process improvement aspects. The folks at Carnegie Mellon's Software Engineering Institute have done work in this area and have produced several publications, as well as methodologies. This is a recent book on software estimation. You might also want to read some of Watts Humphrey's stuff. There's also the work from Victor Basili and collaborators at the Software Engineering Laboratory / NASA Goddard Space Flight Center, to give you an idea of the types of metrics that were collected.
posted by research monkey at 12:47 PM on October 11, 2013


I am not an IT Project Manager, but I have worked on software projects as a Business Analyst for many years at several companies that developed both commercial and in-house software systems. None of my work has been on fixed price projects. Here are some of the things that, in my experience, have caused headaches in estimating effort and successful completion of projects.

1. Ambiguous or incomplete requirements. Business owners frequently provide either business requirements “on a napkin” or request “pie in the sky” systems with huge amounts of feature functionality. Accurate estimates are notoriously difficult when there are too few OR too many requirements.

2. Providing “SWAG” or high-level estimates. Often in the planning stages, PMs will ask for preliminary estimates to get a general sense of the size of a project. Many times, however, the SWAG estimates turn into the “final” estimates, and the project is roadmapped and scheduled on the basis of “guesstimates”. No good ever comes from that.

3. Shoehorning estimates into pre-determined schedules. In many cases, business has made marketing plans and/or specific client commitments to deliver software on a specific date. (Hopefully but not always a FUTURE date.) Then when IT provides estimates on delivering the promised system, PM comes back and says “That’s too high – try again.” This leads to huge levels of frustration for IT, PM and Business. It also often leads to negotiations between the three parties to reduce the feature set to meet the expected delivery date. And in many cases, the features that are pulled from the project are never provided in a future release.

4. Failure to plan for sufficient rework. It’s just a fact of life that there will be defects in a system at the end of the initial Development cycle. Any sane PM will factor in some time for testing and rework of defects. However, as a project moves toward the deadline, if there is danger of not meeting the date, testing and rework are the first (and only) means of adjusting to meet it. PMs are notoriously loath to slip due dates, and Business can be under tremendous pressure to generate revenues. And Dev takes however much time it takes, estimates or no estimates. So quality assurance often goes by the wayside. The estimated deadline may be hit, but quality and feature set may suffer.

5. Scope creep. As has been stated by others on this thread, there is often pressure to add requirements to a project - "moving the goalposts" - while also pressuring to keep to the original schedule. Again in this case, testing and rework are the martyrs to this occurrence.
posted by Billiken at 12:47 PM on October 11, 2013 [3 favorites]


Someone asked me "hey, could you run those performance tests today? It's all set up, just let them run for a couple of hours and send me the graphs." Okay, cool. With the information I have, my estimate is that this will take a couple of hours of actual time and a few minutes of active work.

Turns out "it's all set up" means that one guy ran a performance test once a few months ago using a rather complex tool that I've never seen before. I figure out where its configuration file is. It turns out to be using a data load that is irrelevant. I spend an hour writing a database query that selects something relevant. Now I get a bizarre error which I figure out is because the tool doesn't handle my CSV file's UTF-8 "byte order mark;" after another hour, I've found another tool to remove this. Now the tests begin to run, and it dawns on me that it's making requests for pages whose bulk is actually fetched through AJAX; the stuff I'm supposed to be testing isn't even fetched. After another hour I've gotten the tool to use a regex and fetch all the AJAX data.

And so on and so on. It's just a fractal nightmare of incidental complexity, unknown unknowns, mental exhaustion, desperation, interruptions. Horrible monsters creep out of corners I didn't even know were there. Everything I touch turns into spiders. If I even tried to "estimate" or "predict" this kind of stuff, I'd have to be ridiculously paranoid.

Trying to predict this kind of thing feels like measuring Britain's coastline. Quote: "A fractal is by definition a curve whose complexity changes with measurement scale." If I'm told to do a rough estimate, I'll be thinking of a problem which is completely different. The reality of the situation is always a bizarre bad trip, because IT – if you include the whole mess, with legacy code, external tools, and data – is extremely, mind-bogglingly complicated.

Chuck Moore said: "I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth."
posted by mbrock at 1:02 PM on October 11, 2013 [3 favorites]


I read an article about this in a software quality assurance magazine a while back. Unfortunately I can't remember the author or title. However, the essential idea is pretty simple.

If you're estimating the time it takes to complete a complex project, you usually break it into sub-projects, estimate the time for each, and then total:

Sub-project 1: 2 days
Sub-project 2: 1 day
Sub-project 3: 4 days
Sub-project 4: 3 days
PROJECT TOTAL: 10 days

The implicit assumption here is that the sub-project time estimates are averages. Some will take more time than average and some will take less, so the total of all these averages should make an excellent estimate for the time needed for the complete project.

But that simply doesn't work. For example, Sub-project 1 is estimated at 2 days. The max possible time you can save by completing it under the estimate is 2 days.

But the max possible time you can LOSE by going over estimate is unbounded. It could go 1 day over, 2 days, 4 days, a week, four weeks, a month, a year, etc. Yes, it's not that likely you'll actually go a year over time-budget on a two-day project, but once in a while it does happen (your lead engineer drops dead, the system you planned to use for the project turns out to be insufficient so you need to replace it and all the servers before getting started, your city is hit by a tsunami, etc).

And because sub-projects tend to be sequential--you can't start sub-project #2 until sub-project #1 is complete--delays can add up without limit while the time you can save by speeding up the remaining sub-projects is extremely limited.

So in reality a project like the one above is likely to come in like this:

Sub-project 1: Estimated 2 days, actual 1 day
Sub-project 2: Estimated 1 day, actual 0 days
Sub-project 3: Estimated 4 days, actual 9 days
Sub-project 4: Estimated 3 days, actual 5 days
PROJECT TOTAL: Estimated 10 days, actual 15 days

To save space, I've only listed 4 sub-projects. But we all know the time estimate situation is worse for complex projects with many sub-projects.

Here is why: say your project has 20 sub-projects, and each sub-project has a 10% chance of taking 2X the time estimated and a 5% chance of taking 5X.

Those are low, low odds, right? But across 20 sub-projects, you'll nearly always have at least one come in at 2X the estimate (about an 88% chance), and you have better-than-even odds (about 64%) of at least one taking 5X. In large, complex projects with many sub-parts, low-probability events are not unexpected; they are to be expected. And those low-probability delays, which our intuition has a hard time dealing with, actually dominate the time budget.
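The arithmetic behind those intuition-defying odds is just the complement rule; a quick sketch, using the hypothetical 10%/2X and 5%/5X figures from above:

```python
# Probability that at least one of n independent sub-projects blows up,
# given per-sub-project overrun odds (hypothetical figures from the example).
p_2x, p_5x, n = 0.10, 0.05, 20

# P(at least one) = 1 - P(none), by independence.
p_at_least_one_2x = 1 - (1 - p_2x) ** n
p_at_least_one_5x = 1 - (1 - p_5x) ** n

print(f"P(>=1 sub-project at 2x its estimate): {p_at_least_one_2x:.2f}")  # ~0.88
print(f"P(>=1 sub-project at 5x its estimate): {p_at_least_one_5x:.2f}")  # ~0.64
```

So even "rare" overruns become near-certainties once a project has enough moving parts.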

FYI, as I was hinting at in the paragraphs above, his solution was to assign percentage estimates to the time taken by various sub-projects and then use a Monte Carlo type simulation to project the range of expected project completion times. Never simply and naively add up the sub-project time estimates to get the total project time estimate.
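A minimal version of that Monte Carlo approach looks like this; the per-sub-project estimates are the ones from the example above, and the 10%/2X and 5%/5X overrun odds are the hypothetical figures used earlier (a real model would also need distributions for early finishes and partial overruns):

```python
import random

# Sub-project estimates (days) from the worked example.
estimates = [2, 1, 4, 3]

def simulate_once():
    """One trial: each sub-project hits its estimate, except for an
    assumed 5% chance of taking 5x and a 10% chance of taking 2x."""
    total = 0
    for est in estimates:
        r = random.random()
        if r < 0.05:
            total += est * 5
        elif r < 0.15:
            total += est * 2
        else:
            total += est
    return total

trials = sorted(simulate_once() for _ in range(100_000))

print("naive sum of estimates:", sum(estimates))  # 10
print("median outcome:", trials[len(trials) // 2])
print("90th percentile:", trials[int(len(trials) * 0.9)])
```

Reporting the distribution (median, 90th percentile) instead of the naive sum is the whole point: it makes the long right tail visible before anyone commits to a date.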
posted by flug at 1:44 PM on October 11, 2013 [3 favorites]


This page has an explanation of the Monte Carlo simulation method of project time & budget estimation, and a downloadable spreadsheet where you can model different scenarios and see the results.
posted by flug at 1:54 PM on October 11, 2013


IT Project Manager here. I don't think it's possible to estimate effectively (nor is it worth the time to try), and although I understand that budgets for cost and time must be maintained, it's the nature of technology to defy estimates for implementation. Estimates are great for building decks. We know all about wood and nails and how much time it takes to construct x square feet. But with the newest Ruby on Rails and SOAP requests and integration with Outlook 2010 and Flickr APIs, we just don't have readily available information. Someone above mentioned requirements, and that's part of it as well; many times we don't really know the requirements until after the first or second demo in front of key stakeholders.

This is all why the best development teams are not encumbered by cost and time constraints, at least at the Project Management level (maybe at the executive level). If your highest-level stakeholder (hopefully a VP) has the faith and budget to say "here's a problem, I'm going to throw x dollars at it over 6 months and we will evaluate at the end", then you have an environment that is ready for real Agile development. The team doesn't worry about schedules, just about "getting stuff done".

That all said, people are married to waterfall. I'm lucky enough to actually work in a government environment where Agile and Scrum are accepted, and the successes have been great. My suggestion: stop caring about estimates, and just get started.
posted by joecacti at 2:07 PM on October 11, 2013 [3 favorites]


I know someone who headed up a medium-sized business that did very big projects. This person would have as many as 1,000 people at a time working on these big projects for a few months at a time. They were competitive bid projects for organizations who certainly paid attention to their bottom line.

His rule of thumb was to estimate a project as accurately as possible, then double it and add a profit margin. I get that this sounds dubious, but this was a quite successful business over the course of several decades.
posted by cnc at 2:28 PM on October 11, 2013 [1 favorite]


Oh yeah, there's that dance thing....

You give long and short estimates based on worst case and best case, advise the customer/client to hope for the best and budget on the worst, and then hope the worst case doesn't scare him off. That's the dance.

We all want the work. Those of us who have done a zillion projects know that there is a tendency for the worst case to materialize and the best case to leave town. Knowing what SHOULD happen is sometimes critical, and in software, big swaths of the system are off-limits to our eyes. Hell, I avoided C for years in microcontrollers because I wasn't sure what the compiler was doing, but I sure as hell could follow along with the one-line/one-instruction aspect of assembly. Not practical these days, but the problem is even worse. At least in the 80's, pieces of your OS were upgraded every few YEARS, not twice a day like Windows. Who can plan for impacts like that? It can take a month or months to determine whether a nuisance problem belongs to hardware or software, or to one piece of software or another. And for an intermittent or unverified failure? Who knows?

It's a maddening aspect of living in a world created by engineers and programmers and managed by business folks. Often, we're at cross purposes.

(I once had a bug that didn't surface for 20 years. It was there all that time, but needed a certain case to materialize. 20 years. By then, of course, I thought it was bug free. Imagine my surprise. Only the fact that I don't trust anything, ever, allowed me to skeptically look at my code and finally find it. Plan that in an estimate. That thing could be circling Venus by now, bug and all. Jeez.)
posted by FauxScot at 2:37 PM on October 11, 2013


A big issue I see, sometimes but not only in these types of projects, is baselining against the last project's numbers, which are often fanciful, to put it generously.
posted by Lesser Shrew at 5:45 PM on October 11, 2013


Here is an article on estimating time from the magazine I mentioned earlier:

Estimating Time, Effort, and Cost: Techniques and tools for accurately predicting project deliverables, by David Garmus and David Herron

The magazine is now called Better Software.

(I don't think this is the article I mentioned previously, however.)
posted by flug at 7:35 AM on October 12, 2013

