Why Can't Developers Estimate Time?

A few interesting points came up on a mailing list thread I was involved in. Here are a few of them. The original comments are presented as sub-headers / quoted blocks, with my responses below. This isn't a thorough look at the issues involved, just what I thought were the relevant responses. Note: I've done some editing to improve the flow and to clarify a few things.

Why can't developers estimate time?

We can't estimate the time for any individual task in software development because the nature of the work is creating new knowledge.

The goal of software development is to automate processes. Once a process is automated, it can be run repeatedly, and in most cases, in a predictable time. Source code is like a manufacturing blueprint, the computer is like a manufacturing plant, the inputs (data) are like raw materials, and the outputs (data) are like finished goods. To use another analogy, the reason Starbucks makes drinks so quickly and repeatably is because they invested a lot of time in the design of the process, which was (and is, ongoing) a complex and expensive task. Individual Starbucks franchises don't have to re-discover this process, they just buy the blueprint. I'll leave it as an exercise to the reader to infer my opinion of the Costa coffee-making process.

It's not actually always a problem that development time is unpredictable, because the flipside is that so is the value returned. A successful piece of software can make or save vastly more than its cost. Tom DeMarco argues for focussing on the high value projects for exactly this reason. Note that this does require a value-generation mindset, rather than the currently-prevalent cost-control mindset. This is a non-trivial problem.

By far the best explanation I've read of variability and how to exploit it for value is Don Reinertsen's Principles of Product Development Flow, which is pretty much the adopted "PatchSpace Bible" for day-to-day process management. And when I say "by far the best", I mean by an order of magnitude above pretty much everything else I've read, apart from the Theory of Constraints literature.

Here is the data from my last development project. (Histogram generated in R with 5-hour buckets: the horizontal axis shows the duration in hours for the user stories - 0-5 hours, 5-10 hours, etc; the vertical axis is the number of stories that took that duration). We worked in 90 minute intervals and journaled the work on Wave, so we knew task durations to a pretty fine resolution. (We did this for both client communication and billing purposes.) The result: our development times were about as predictable as radioactive decay, but they were very consistently radioactive. Correlation with estimates was so poor I refused to estimate individual tasks, as it would have been wilfully misleading, but we had enough data to make sensible aggregates.
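The shape of that distribution is easy to sketch. This is a hypothetical illustration, not the project's actual data (and in Python rather than R): individually, exponentially distributed durations are wildly variable, but the aggregate is stable enough to plan with.

```python
import random
import statistics

random.seed(42)

# Hypothetical stand-in for the story data: durations drawn from an
# exponential distribution with a mean of 6 hours (invented parameters).
durations = [random.expovariate(1 / 6.0) for _ in range(200)]

# Individual stories are close to unpredictable...
print(f"shortest: {min(durations):.1f}h, longest: {max(durations):.1f}h")

# ...but the aggregate is consistent, which is what makes sensible
# team-level forecasts possible without per-task estimates.
print(f"mean: {statistics.mean(durations):.1f}h, "
      f"median: {statistics.median(durations):.1f}h")

# Bucket into the same 5-hour bins as the histogram described above.
buckets = {}
for d in durations:
    b = int(d // 5) * 5
    buckets[b] = buckets.get(b, 0) + 1
for b in sorted(buckets):
    print(f"{b:3d}-{b + 5:<3d}h {'#' * buckets[b]}")
```

Plotted as a bar chart, the buckets give the same steeply decaying shape as the histogram described above.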

Rule of thumb: take the estimates of a developer, double it and add a bit

The double-and-add-a-bit rule is interesting. When managers do this, how often are tasks completed early? We generally pay much more attention to overruns than underruns. If a team is not completing half of its tasks early, it is padding the estimates, and that means trading development cycle time for project schedule. Cycle time is usually much more valuable than predictability, as it means getting to market sooner. Again, see Reinertsen's work, the numbers can come out an order of magnitude apart.

Also, this is the basis for Critical Chain project management, which halves the "safe" estimates to condense the timescale, and puts the remaining time (padding on individual tasks) at the end, as a "project buffer". This means that Parkinson's Law doesn't cause individual tasks to expand unduly. I'm unconvinced that Critical Chain is an appropriate method for software though, as the actual content of development work can change significantly, as feedback and learning improves the plan.

People in general just make shit up

It's not just developers that are bad with estimates either. Everyone at some point is just winging it because it's something they've never done before and won't be able to successfully make a judgement until they have.

As a community we need to get away from this. If we don't know, we don't know, and we need to say it. Clients who see regular progress on tasks they were made aware were risky (and chose to invest in) have much more trust in their team than clients whose teams make shit up. It's true! Srsly. Don't just take my word for it, though - read David Anderson's Kanban.

Estimating is a very important skill and should be taught more in junior dev roles

I propose an alternative: what we need to teach junior devs is the meaning of done. Estimation problems are bad enough, but finding out at some indeterminate point in the future that something went out unfinished (possibly in a rush to meet a commitment … I mean - estimate!) blows not only that estimate out of the water, but the schedule of all the current work in process too. This is very common, and can cause a significant loss of a development team's capacity.

88 responses
I agree with most of what you've said. We develop a time tracking app, and from our own internal experience and the feedback from our clients we see that tracking everything you do does help a lot in estimating new projects (especially if you work on similar stuff).
There are a lot of developers who simply hate time tracking, and I agree to a point - sometimes it can feel like a burden - but it does pay off in the end for everyone: developers, PMs and clients.
Hi Jan

Thanks for your feedback. Interestingly, we got into a good rhythm with time tracking, because every 90 minutes we recorded our work on Wave, primarily for the benefit of our client, to see what we'd been working on and to give her a chance to ask questions etc. So it didn't feel like a chore at all. And it gave us this unexpectedly useful data!

Also, I should point out that I'm the business owner, so I have a vested interest in time tracking. I can imagine it's not at the top of most developers' lists of fun things to do on a rainy day, though.

Cheers
Ash

Why can't geologists tell me how much a rock weighs?
For those of us who don't speak R, can you explain what stories$DevTime means?
Nice post.

Regarding making shit up: I'd welcome more analysis of the phenomenon where the dialogue goes like this:

Customer: How long will X take?
Me: (not making shit up) I don't know. It sounds big, like weeks rather than days.
Customer: So three weeks then?
Me: I don't know.
Customer: I have people breathing down my neck and they need to know if this will be done in three weeks.
Me: It might. Or I might run over; I just don't know.
Customer: (exasperated) Can you give me something to go on here?

In a past life I would allow myself to be browbeaten into agreeing to a date. Now I'm trying to tread a careful line between not making shit up and not being helpful. I find it extraordinarily difficult on anything but the simplest task estimations.

People make up shit because they can and they usually have to. When the guy they report to is an ignorant fool (a common situation), guess who he's going to consider knowledgeable: the guy who makes up shit, or the one who says "I don't know"?
Ed - thanks for pointing out this unnecessary piece of jargon. I'm not much of a statistician myself, I should point out! I've added an explanation in the article.
Al - you make a good point. Unfortunately, as people, we seem to favour confidence over concrete evidence. I think it's important we get beyond this. As Tom De Marco also said: risk management is project management for grown-ups.
Have you seen Joel Spolsky's bit on Evidence Based Scheduling?
@ramarnat - Thanks for the reference! For the benefit of other readers, the Evidence Based Scheduling post is here: http://www.joelonsoftware.com/items/2007/10/26.html
The version I've always used is "make an honest guess at the time required to do the specific thing requested, double it, and increase the units a notch" so 2 minutes -> 4 hours; 1 hour -> 2 days. This is remarkably effective, but depends on the initial honest (unpadded) guess.

Double and add a bit might work for padded guesses.
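The double-and-bump rule above can be captured in a few lines. This is a sketch only, and the unit ladder is my assumption:

```python
# The "double it and move the units up a notch" heuristic described above.
# The unit ladder is an assumption; extend it to suit your domain.
UNITS = ["minutes", "hours", "days", "weeks", "months"]

def pessimise(amount, unit):
    """Double the honest, unpadded guess and move one unit up the ladder."""
    i = UNITS.index(unit)
    return 2 * amount, UNITS[min(i + 1, len(UNITS) - 1)]

print(pessimise(2, "minutes"))  # (4, 'hours')
print(pessimise(1, "hours"))    # (2, 'days')
```

As the comment notes, this only works if the input really is the honest, unpadded guess; feed it a padded estimate and the result is doubly padded.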

I couldn't agree more. The biggest issues I face are the concept of "done", and the fact that we convince ourselves we have accurate estimates for things we haven't done by basing them on SIMILAR items. The problem is that just because something is similar in 90% of its implementation does not mean it can be done in the same time with 90% accuracy in the estimation. Excellent article.
Great to finally read something honest on this subject. For years, many of us have been trying to say that development is 'creative engineering'. It wasn't an excuse; as you say, there are different modes of development, and when the going is exploration-oriented and recipe-making, of course it is far, far slower than it is for well-defined problems.

An even more important issue is the value of the IP generated, and how easy it is to sell know-how/blueprints to competitors in low-cost-labour countries.

I worked in an industry (proprietary, naming no names) where I accrued a helluva lot of know-how, then this was sold-off to Bangalore chicken farms.

It is a wrench to have somebody else sell off ten years of concentrated focussed effort, leaving you without market value. No surprises that I now consider open source or nothing. No more proprietary. There's also the advantage of not having to re-invent the wheel, courtesy of internet collaboration. The situation is healthier now.

@jorjun Your IP comment is really interesting. Another comment I made (but didn't add here) was about commoditisation:

"The more a developer does a certain type of task, the better they will get at estimating tasks of that type under the same circumstances. The problem is that while teams of developers are becoming more predictable at creating, say, bespoke corporate social networks, a team somewhere else is taking on a high risk project like Ning, commoditising a class of development tasks."

I believe we're on a constant cycle from (expensive!) bespoke software "research", where we discover new ideas, to commoditisation, where the issues are more around creating implementations, or just outright software re-use.

I'm not sure I'm describing the same situation, but if so, losing the investment of this development is a huge waste. I'm neither a proprietary- nor open-source-extremist, but I appreciate the value in maintaining this learning in the public domain, to prevent it being destroyed.

@adam_conrad You made that point much better than I did in the article. Software is too complex for our primitive pattern matching ability to apply. Postgres, Oracle and SQL Server are all relational database servers, so integrating them into an app should take about the same amount of time, right? :)
I feel that if you give a developer any amount of time, they will generally use it all or more.

Here's an example of why I might go over an estimate: if I think a task will take a week and we give the client an estimate of 2 and a half, I take that as having 1.5 weeks of extra time to try to figure out ways to make it better. After I use up the extra time, I then start working on the task using the new knowledge that I acquired during research. Since I am probably using new ideas, it will most likely take longer than the initial week that we estimated, and will therefore go over.

This may not be as common as just missing the time, but it is an example that I run into from time to time.

Over the last few years, clients of mine are beginning to understand that their sometimes lofty ideas can take a while to make a reality. Time estimates are just that - estimates.
If a developer doesn't know how long it's going to take to complete a story, they should be given a half day or so to investigate the problem, discover the associated development problems, and think of solutions, creating their estimates afterward rather than on the spot. It's often only after working in the problem domain for a little while that you unearth the underlying development issues, which makes it much easier to create time estimates.
I've always found that developers are terrible at estimating time, but shockingly good at estimating how much code something will take. I've had very good success getting estimates in LOC then using historical data to translate that into actual time estimates.

You can generate your own historical data if you have been tracking your time for the last few projects, and you have a decent source control system. Pluck the delta LOC number out of each release (there are tools to do this if it's not built in to your VCS), plot against man-hours, and see if you get something generally repeatable. We found that we were pretty consistent for similar types of problems - low-level OS/library work was different than UI or algorithm work.
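The plot-against-man-hours step above amounts to a simple linear fit. A minimal sketch with invented numbers, using ordinary least squares (one reasonable choice among several):

```python
# Historical (delta LOC, man-hours) pairs per release - invented data,
# standing in for what you would pluck from your VCS and timesheets.
history = [(1200, 80), (300, 25), (2500, 170), (800, 60), (1600, 110)]

n = len(history)
sx = sum(loc for loc, _ in history)
sy = sum(hrs for _, hrs in history)
sxx = sum(loc * loc for loc, _ in history)
sxy = sum(loc * hrs for loc, hrs in history)

# Ordinary least squares: hours ~ slope * LOC + intercept
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def hours_for(loc_estimate):
    """Translate a fresh LOC estimate into a man-hours estimate."""
    return slope * loc_estimate + intercept

print(f"{hours_for(2000):.0f} man-hours for an estimated 2000 LOC")
```

As the comment suggests, it is worth fitting separate lines for different kinds of work (OS/library vs UI vs algorithm), since the hours-per-LOC rate differs between them.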

I've been estimating web design and development projects for over ten years now. It never gets easier, but the estimates do become more accurate over time. When working with web developers on estimates, we almost always have to double their estimates before putting them in front of the potential client. It's just the nature of developers (myself being one of them).

Take a look at a blog post I wrote up on this subject a few years ago on how to accurately estimate a web development project:
http://www.myintervals.com/blog/2008/09/05/how-to-accurately-estimate-a-web-design-project/

@Mark, I like the lines-of-code estimate idea. I think that is probably going to be better than some nebulous 'feel for the challenge'. But honestly, as is mentioned above, exploration is the key word here, and I think that good developers are good explorers. Quite often they are put upon for arbitrary and ignorant reasons - god forbid painting were done this way. Why not focus on the IP generated - whether the problem is solved well, for re-use - and relax a little with the time and motion study? Especially if you buy into the idea that technical development could actually - *shock horror* - be a creative endeavour and not the answer to a math paper.
A big problem is that developers estimate according to the manager/client expectations. If the manager/client says something like "do your estimation, you are the expert but I'd say that in X months this should be done", this X number heavily affects the estimation. Maybe subconsciously but the developer will try to adjust to that number.

See this paper (not mine! :-) ): "Anchoring and Adjustment in Software Estimation" by J. Aranda and S. Easterbrook, in ESEC/FSE-13, Proceedings of the 10th European software engineering conference held jointly with 13th ACM SIGSOFT.

I've always gone with it being done when it's ready. Trying to cripple a project from the outset with artificial dates pulled out of thin air - based on a process which is itself a discovery of what is actually wanted - is utterly pointless.

Software is a process, not a product.

I like to try to illustrate the problem of estimating for development by asking the question "how long does it take to solve a problem?"

I find it's possible to produce estimates by thoughtful analysis of the requirements and (as BryanGislason suggests above) by research--indeed, I think this is the only method that even has a chance--but such estimates necessarily suffer from the fact that the only parts of a problem whose duration *might* be reliably estimated are those parts whose solutions are already known.

If there's any part of the project that is unknown or unforeseen (or unforeseeable), the estimates go out the window and you're back to asking "how long does it take to solve a problem?"

An essential issue here is in establishing the kind of deadline that is being generated from the estimate. Most deadlines are related to a business plan, or are part of a motivational tool used by management. Only a very few deadlines are truly based on estimates --- usually those are from the early stages of planning the business.

While managers are often dim-witted, they are made to look much more so when neither they nor their subordinates understand the purpose of a given estimate.

In reality, it is not that hard to make an estimate for how long a project is going to take, and it is even easier to offer updates that adjust the estimate. Well, unless you have legacy code with which to interface. Well, it is still possible to make accurate estimates, but no one seems happy with 'never'...

@Chris Kuszmaul

Your language shows a lot of the problem:

"their subordinates"

The single main reason a "manager" (for want of a better term) tries to get an estimate of work is not to plan the project or manage the timelines. It's to have a weapon to threaten development with - the notion that "you're a failure if you can't hit this deadline". If it were accurate, it could be used for managing customer expectations and timelines, though it's rarely gathered with that motivation.

Which is why arbitrarily short deadlines are pulled out of thin air, to manipulate and abuse development teams.

I've yet to meet anyone in project and/or management who hasn't been a developer who actually understands that a lot of creating software is finding out what the client/project actually wants, which affects how it goes.

It's typically driven by weak and inept management (which is by far the norm, unfortunately) who don't understand how technology is created - i.e. a process, not a product akin to a car.

I've always focused on releasing software when it's done - not when some suit put a flag in a calendar.

Sorry, I guess I should have said 'individual contributor'
Web development in particular can be inherently complex in terms of the constantly changing tools, technologies, and platform constraints that factor in to the planning, design, testing, and implementation of modern applications.

While experience certainly plays a role in the accuracy of time estimation, to a certain extent estimating is just educated guessing. Put another way, time estimates are approximations based on past experiences solving problems with a similar set of criteria.
In some cases, where the project deliverables delve into uncharted territory and the developer has little to no previous experience, a smart developer will use the opportunity to learn and build their knowledge set. A very clever developer will not usually provide a time estimate on the spot if pressed. Rather, he/she will go away and perform rudimentary research, or build a series of quick and dirty test mock-ups, to better gauge the time/resource requirements before formally providing a quote.

There is a lot of pressure to estimate low. Some of it is direct and obvious, such as when managers push for early release dates. Other times the pressure is subtle, such as when managers stress how important the project is and how much the company needs the product.

We engage in various estimating techniques to try to arrive at accurate numbers, yet the pressure to estimate low is always present. That's why ranges are better than single numbers. Give a low and a high estimate, with the goal of landing in the middle.

Also, always track actual results against the estimates. Once you've accumulated enough data to gauge yourself, your estimates will improve.

One thing I like to see in estimate discussions is some discussion of the uncertainty. In science, a point measurement gives you zero information unless you have some idea of the uncertainty of the measurement. Similarly with estimates. Whenever estimating, I look for at least a qualitative L/M/H rating, with some explanation of why. This goes a long way towards understanding the level of confidence in the original number.
I'm starting to write pseudocode inside a hierarchical to-do list before I start working, it's called Getting Things Gnome. I think it has a few bugs but it's really a useful app. PS: I think Brian Tracy has some interesting ideas about to-do lists.
Wonderful discussion, thanks Ashley and others.

I've been a developer, a mid-level manager (yeah, pointy hair and all), and an entrepreneur, but my heart is in development.

We need schedules.

We need them so our colleagues in other parts of our businesses can do their jobs right. It hurts when an executive demos something totally broken to the trade press, or when a sales person bets the company by promising a ridiculous delivery date.

We need them to do our own jobs right too. I know I've spent time polishing software modules waiting for other people (sometimes contractors) to do their parts, when it would have been more fun and efficient to keep moving ahead.

My electronics prof used to say, "Any clod can make a gadget for a dollar. It takes an engineer to make them for a quarter." In software, where our costs are simply our paychecks, that means "Any fool can get something done in a month. It takes an engineer to get it done in a week."

We're in an optimistic trade. I've seen dozens of schedules with an end date that was defined as "the earliest date for which there is a non-zero probability of being done." That's a painful, weekend-eating, scheduling method. It also has a huge problem during the software end game. "How long to fix that systems bug?" "I dunno." "You have to tell me!" "I dunno." "OK, I'll put down two days." "Whatever."

But we live on optimism: after all we're inventing things, and sometimes there's a big payoff, like getting into orbit or going public.

Scheduling needs to be more than "when is that thing going to be done?" "I dunno, four weeks?" "How about two?" It needs to be probabilistic. It doesn't matter how long each item takes, any more than it matters how tall each person is. What matters is that when we handle all the tasks, they average out to something predictable.

Most work can average out. Of course, there's always a black swan: an abnormal thing that doesn't average out. For those things there needs to be risk management. Ashley quoted De Marco saying that risk management is scheduling for grownups. De Marco is right, but it's a BAD way of saying it. It implies that most of us are children, and only mommy and daddy can handle risk management.

Risk is real. A friend worked on a project where all the prototypes were lost when a freight airplane went off the end of a runway and caught fire. The customer cancelled the project and my friend was laid off. But that's not the kind of risk I'm talking about. I'm talking about "this interface isn't documented right" risk, and "that algorithm might not be fast enough" risk.

Of course we can handle risk management. Keep track of your own risky stuff. If you use JIRA, create risk tickets and review them weekly. Tackle risky development work sooner. Talk about the risky stuff in scrums. Let your colleagues (executives, sales people, even customers sometimes) know the risks.

That way we can get rid of the "only for grownups" scheduling methodology that plagues our business.

PS. Read The Checklist Manifesto by Atul Gawande, especially the chapter on how the construction industry handles risks. If you're up for a screed by a mathematician on risks, read The Black Swan by Nassim Taleb.
@ JoeBackward Thanks for your reply - worthy of a blog post in its own right! I have always assumed - hoped - that De Marco is suggesting not that project management is reserved for a few people, but that as an industry we need to mature, and that we _can_ grow up. Everyone needs to understand risk and variation, everyone needs a way to gauge how their decisions impact the economics of the business, etc.

I think a lot of the knowledge of how to do this already exists - other commenters here have also thrown in some great references and ideas. But getting everyone on board and working together harmoniously is a difficult problem. My optimism is not for entering orbit; it's for the more modest goal of improving the quality of all our working lives.

PS. As it happens, I already have both those books on my to-read list - but I'll push them higher up after your recommendation :-)

My "getting into orbit" remark comes from the historical factoid that the Program Evaluation and Review Technique (a/k/a Critical Path Method scheduling, the technique built into MS Project) was invented for the Mercury space program.  Scheduling is, in fact, rocket science, even when we apply it to excellent earth-bound goals like making work more fun.
We typically work on new stuff all the time, so estimating is difficult. I often find estimating time for new technology is as difficult as estimating math homework: x=2y+5 is easy, then comes x^2=2y+5-z and all of a sudden you are scrambling to figure it out.

Government contracts make it worse. They want custom software sold like it's off the shelf. To CYA, you have to overestimate. I wish government would conform to agile methods.

I envision a future where a large contract comes out and people bid to be the primary developers. The project is then broken down into 2 week / 1 month cycles. Each cycle you report progress. If the government doesn't like it they can switch rather than holding million/billion dollar contracts and refusing to pay at the last minute....

Good article!

Good article. Keep in mind that the guy saying 'my client is breathing down my neck, I need to know if it's done in three weeks' is usually making shit up as well.
I have always estimated at what I believe is the 70% probability on what I can deliver. There are some folks with whom I have worked that would take my estimates and cut them in half. These folks could not deliver that same project at all, but had the audacity to believe that I would work 60-80 hours per week to meet an unrealistic schedule. They failed to realize that at least in my line of work each project took at least 2-3 weeks of research into standards, means, and methods before I could write the essential parts of code needed to make the project a reality. I worked normally about 50-60 hours every week because I would spend 5-7 hours coding, 2-3 hours debugging and 2 hours or more on support information.

None of this includes reporting, timesheets or other "required" support paper work to help others understand what I was doing with my time.

Of course this was not Web development, but development of a much more technical nature, so I am not sure how it matches your time/experience/needs. Suffice it to say that when the customer received a modified quote I would tell them that those were not the numbers I gave to management. I may never work for any of those companies, but I have been honest to the ones I did work for, and they always got the very best I could do.

If you give me bad specs, poor information, and then abuse me, you cannot possibly expect good work. GIGO still applies.

By far the most common reason for overruns, in my opinion, is the practice of Project Managers pressuring programmers to reduce estimates. This induces panic, which degrades quality, which extends the work. Just why some Project Managers think this is a good thing is a deep mystery. I think there must be a special qualification for becoming such a Project Manager: "Must be a moron".
@ Adrian - Re pressure: You may be interested in another of Tom De Marco's books, Slack. He describes a thought experiment where you can apply pressure to a development team using a lever. The question is "how to use the lever?", and he says the current folklore is to pull it all they way down- as if managers believe that productivity will level off, and not substantially degrade. Many other great ideas in this book too.

http://www.goodreads.com/book/show/123715.Slack
http://www.amazon.com/Slack-Getting-Burnout-Busywork-Efficiency/dp/0767907698/

Thanks for the investment in the article. I estimated it would take about 30 seconds to re-use and forward to my boss and I was right. If I'd had to write it myself...
One of the worst aspects is the current adoption of Agile. Now, may I qualify this. In true Agile, the team decides what can be achieved in the period of the next Scrum, and also decides how, and thus quality, with a view to the project as a whole. In Corrupt Agile, there is a Project Manager making this decision, often pressuring programmers to take short cuts, workarounds and bodges, just to achieve some completely artificial deadline. This almost always means the work has to be done again, and again. I think Agile, applied correctly, recognises and structures the flexible approach necessary in many projects, but applied wrongly, it is pure poison.
@ Bill - Glad you found it useful! Forwarding and reusing is positively encouraged, I intend to make this part as repeatable as possible :-) The value your boss gets from it is still a research activity though, so if you get chance to report back, I'd be interested to hear about the result (as I'm sure others would too).
There's a topic missing, which is teaching non-technical managers (or managers of any activity which produces a process) about estimates, their sources of error, and gauging the potential size of the error bars - in general, having realistic expectations. In particular, contrasting these estimation errors with the simultaneous desire to fix all three of resources, schedule and content.
Most people cannot estimate time, and this stems, at least partially, from optimism.

Fifteen (15) years ago, while working on a small project for SW upgrades, I learned to double my own estimates of time. Of course, my caution in estimating wound up being twice the time required, but it is still true that one needs to double one's estimate, particularly for longer projects.

The problem with estimation stems from several issues:

- Optimism (this is likely the biggest problem)
- Multiple concurrent projects
- The subsequent loss to multi-tasking
- Project scope and creep
- Good design takes longer up front but can save time on the back-end

After 35 years in software development, I learned a few things about estimating a project.

Rule #1 - The first number you give a client is THE NUMBER that they will remember, regardless of how specifications, staffing, budgets, etc. might change. If you say 3 weeks, THAT is the number they will remember and hound you with, even a year down the road.

Rule #2 - No project specification is ever complete. The client will ALWAYS have change orders. "Can't you just move it over to the left a couple of spaces? Why is that so hard?"

Rule #3 - No project can ever allow for the unexpected. "Gee, it took 6 weeks to find the bug in that commercial package?" "Yes, I know our server center burnt down, but we still expect the project to be completed on time!"

Rule #4 - The client will always assign his lowest level, most incompetent employees to the project.

Software development is not manufacturing, and no matter how much we want to change it to assembly-line programming... it's not going to happen. I understand people want to commoditize development - non-technical people hate technical people earning good salaries for work they can't measure in a spreadsheet - but that is what software development is: it's creative thinking and analysis, not put part A to B and tighten. Please read Peopleware - the best book ever at explaining that the way to gain efficient development, along with more accurate estimates and higher success rates, is to build great teams and let them do what they do best.
Another pressure to give a low estimate is that the project *might* take less time and you *might* lose the customer if your estimate scares the customer away...
Schedule guesses aren't normally distributed. It's more like a poisson or exponential distribution. The chance of being done earlier than the mean estimate is quite small, whereas the chance of going over is significant. If you figure a project will take about 10,000 lines of code, the chance of it taking much less than 7 or 8,000 is negligible. The chance of it being bigger is substantial. That's why we never finish early.
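That asymmetry is easy to demonstrate numerically. A sketch using a lognormal model (one common choice for right-skewed durations; the parameters here are invented):

```python
import random
import statistics

random.seed(7)

# Simulated task durations from a right-skewed (lognormal) distribution.
# Parameters are invented for illustration only.
durations = sorted(random.lognormvariate(2.0, 0.8) for _ in range(10000))

median = statistics.median(durations)
mean = statistics.mean(durations)
p95 = durations[int(0.95 * len(durations))]

# The mean sits well above the median, and the tail is long: a handful
# of big overruns dwarf the possible underruns, so schedules slip even
# when most individual tasks land near the "typical" value.
print(f"median: {median:.1f}  mean: {mean:.1f}  95th pct: {p95:.1f}")
```

With a symmetric (normal) model the mean and median would coincide and overruns would be as likely as underruns; the lognormal's gap between them is exactly the "we never finish early" effect.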

One reason we don't estimate correctly is that we ask the wrong questions. You can ask an engineer how long to write a feature, and he may say 3 days. But you forgot to ask him, "How long to write the feature, while attending staff meetings, going to the dentist, and working on that important fix to last year's code?" Fred Brooks noted way back in 1969 that developers only spent about 19% of their day developing. I've improved my scheduling accuracy by asking how long to write a feature, and then derating the estimate for non-productive time.

Another point here, which is obviously being missed by the masses of suits who want to find new condescending ways to blame developers, is that the question isn't the title of the blog, it's .........

"Why isn't software perfectly defined before it's built"

.......obviously it can't be, because most of development is exploratory: prototyping, trying something, then leaving it once the customer or environment moves the goalposts again.

The real subject of this blog post is how risk management and the burden of staying flexible (which would mean the suits having to work) get shifted onto developers in the form of guilt and pressure.

It's sad and Dickensian and far too common.

Jeremy is right on... it's never the fault of management - who don't have a clue what they want (just when they want it) - that a project is behind, etc. I remember, a few years back, an enhancement estimation meeting with a project manager who was trying to talk us down on our hours of work per task (the company was adopting the Agile process). Well, it eventually came out that all the project managers had already given their estimates for enhancements to our products to executive management - long before meeting with the worker bees to see when the work could get done - and were under great pressure to sell us on their estimates. Yeah, it was the developers' fault that every project was 3 months behind by April... and pigs can fly too... wanna buy a bridge from me in Brooklyn?
Oh, and by the way, developers *can* estimate schedules accurately. If they're given the resources.

Think about accountants. Every decent-sized company has a staff of accountants who estimate and measure financial performance. They usually can get within a couple of percent at a horizon of 1 year. How do they do it?

Well, first off the company hires specialized professionals and dedicates them to estimating and measuring financial performance. They aren't people who normally do a different job (like writing code), and only estimate schedules as a side-line. Accountants normally form an independent department under the CFO or Director of Finance, who reports directly to the CEO. They don't work for a chain of managers with an interest in making the numbers look a particular way. The accountants have the power to regulate spending to meet the estimates. And an external organization audits their performance.

If an organization hired specialist schedule estimators, and if the schedulers were independent, and if the schedulers could regulate resources, and if the accuracy of scheduling were reviewed by an external organization, I just betcha we could get within a few percent on software schedules too. What do you think?

@Kurt

Accounting and finance are very mathematical (naturally) and are built on formulas that are by nature dynamic to their environment - numbers in, numbers out.

Software people get the equivalent of :

"How long is it going to take to build this abstract thing that we're still trying to figure out?"

"Hmmm, I can't answer that"

"Oh, then you are useless at estimating, this is all your fault".

"It's round and made of metal, like a rubbish bin"

"Err OK, how about 3 months"

........ 2 months later ......

"OK, the client has confirmed now, it's a Saturn 5 rocket, you have 4 weeks left"

-------

To that effect, every response to the usual demands for timelines should be a demand for information.

You want good dates, get good information. Yes, you need experience and skill in developing software, but all good decisions in business are made on good information (talk to any metrics or marketing person).

Trying to do the same in software, with no good information makes no sense.

Software people don't need to get (that much) better at estimating just yet; business needs to understand what it's asking for ........ OR accept that the first part of a project is really figuring that project out, and at that point nobody knows how long it's going to take.

When we were building vBulletin 3 we had no idea when it was going to be "done" as 1/3 of the project was making sure we knew what we wanted and how to do it ...... as we got moving we had a much better idea of the end date, as we had a much better idea of scope.

What kept quality up and stopped the scuppering of the project was not setting an arbitrary date at the beginning that would have forced substandard software and taken on a huge amount of technical debt (which would have been blamed on the engineers ......). The XenForo guys did just the same ......... as I do on my projects: take the customer on a journey, don't limit everyone from day one with information you don't have.

Many studies have been done as to why schedules are not correct and projects fail.
The major issues are: has the lead done this kind of project before, and has the team done this kind of project before? If yes for either the lead or the team, then you can at least have some hope of being within the old norm of doubling the estimate, and some hope that the project will succeed. If both, then accuracy and success are pretty obtainable. If neither, you have nothing more than a wild-ass guess and very little likelihood of success.

Add to the above that more and more projects are not managed by the engineering staff (not really), and any hope of reuse, design, architecture, etc. is lost, as management will never agree to the time it would take.

So even if you have a reasonably competent team, they will just be flung from one death march to another. If the tech lead is mediocre (and with management going for the least expensive, you can guess), then you just move from disaster to disaster.

Management is NOT interested in good engineering, just features to sell and the attractiveness of the web site.

A big part of the problem is being able to communicate to the dev team which features/user stories are allowed to be "minimal / cheap", and which ones have the highest uncertainty (and thus deserve the least refinement of the design until that uncertainty is reduced by new data - customer input, market input, etc.).

I have run into too many times where a team member gets intrigued by a "problem" that isn't really a problem that needs to be solved (or maybe, it doesn't need to be solved right now).

The best remedy I have found for that is to insist on a development plan that is reviewed for priority and mutually accepted by all team members. This allows all members to keep on-track and focused on the "high-value development". The caveats are a) the plan must have vagueness equal to the requirement uncertainty, b) all parties must understand that the plan is not set in stone. New data can and will change it. When new data comes along, the choice is stick to the old plan, or change. The relative merits of each decision can be weighed and decided upon.

@ Ralph Moses

Rule #1 - The first number you give a client is THE NUMBER that they will remember.

.......I can certainly attest to this. Our project changed hands twice; the first time a 3 month estimate became a 12 month delivery (and even then it was a pile of ****); the second time a new team estimated 2 months which eventually became 7. Details aside - and there are plenty - we were more skeptical the second time around, but still the original estimates (3 and 2 months respectively) were the stick with which we beat the developers' heads (for better or worse).

Rule #2 - No project specification is ever complete. The client will ALWAYS have change orders.

......yes, guilty as charged. However, there is a balance to strike between allowing the developer to use their creativity and knowledge to implement 'better' solutions, and letting them deviate on a whim from the initial scope. Some changes arise from these 'better' solutions which sometimes challenge the original idea. Having said that, the "move it a bit over there" condition is one we've had to grow out of.

Rule #4 - The client will always assign his lowest level, most incompetent employees to the project.

........quite harsh perhaps, as I must be that employee. The client, as such, is typically non-technical and so delegates [ideally] to the most technically able person suitable for the assignment. It is difficult to find someone who can move seamlessly between dev and management whilst maintaining delicate control over the project and keeping everyone (I hesitate to say "happy") at ease.

If we've learned anything it is this: not all companies and developers are the same, and many are not [currently] equipped to deliver the solution the client wants. Some will provide estimates which are unrealistic, based on some or all of a) inexperience, b) ambition, c) finger-in-the-air-to-please-potential-client. A good developer/team will provide an estimate and be able/willing to break it down to explain it (they will, of course, have built it up the other way round).

Take an estimate, certainly double it, perhaps double again, then re-visit your overall strategy based on this new figure and take a close look at the people with whom you're entrusting your future business.

The key to a legitimate estimate is to work from specific requirements. This is especially true when working with off-shore developers. Once everything is spelled out, the workflow falls in place.
I agree...
In the morning it's like "heh... done in 15 minutes!"
and then you start, and it's noon.

I think developers should use a SYSTEM for estimating.
I came up with a system of my own, which is explained here:
http://www.nurne.com/2011/02/todo-doing-done.html

I use it all the time, for everything I do, and it works perfectly.
It drives you to keep DOING, shows you what's still TODO, and gives you motivation by listing the DONE tasks.

To really learn from this, I use the *timestamps* for each task:
"create and set favicon [ 13:10 - 13:45 = 00:35 ]"
The next time you need to create and set a favicon, you can say it'll take 30-40 minutes.

Thank you!
Nur
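A sketch of how logs in Nur's timestamp format could be mined for future estimates. The log lines, task names, and parsing pattern here are illustrative assumptions, not part of the system described in the linked post:

```python
import re
from collections import defaultdict

# Hypothetical log lines in the commenter's "[ start - end = duration ]" style.
log = [
    "create and set favicon [ 13:10 - 13:45 = 00:35 ]",
    "create and set favicon [ 09:00 - 09:28 = 00:28 ]",
    "style login widget     [ 10:15 - 11:40 = 01:25 ]",
]

# Group the recorded durations (in minutes) by task description.
durations = defaultdict(list)
for line in log:
    match = re.search(r"^(.*?)\s*\[.*=\s*(\d{2}):(\d{2})\s*\]", line)
    if match:
        task = match.group(1)
        minutes = int(match.group(2)) * 60 + int(match.group(3))
        durations[task].append(minutes)

# Next time a similar task comes up, quote the historical average.
for task, mins in durations.items():
    print(f"{task}: avg {sum(mins) / len(mins):.0f} min over {len(mins)} samples")
```

The point is the commenter's: recorded actuals, not gut feel, become the basis for the next estimate of a repeated task.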

Actually, software estimates are wrong for the same reason most estimates are wrong.
http://deathrayresearch.tumblr.com/post/4503505772/the-pathology-of-estimates
Not every project requires something completely new and previously unknown which can't be estimated. Quite the opposite, in fact. Most complex projects can be divided into smaller tasks, which can then be individually estimated and added up. For example, I know that creating an average JavaScript widget/interaction can take one day, styling the widget half a day, integration with other widgets one day, etc. Now if I can estimate that a highly complex web application could contain one hundred (or whatever) such widgets, then I can arrive at a reasonable estimate.

In my experience I divide complex projects into areas of work, and then subdivide more until I reach a set of 1-2 day tasks.
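The decompose-and-sum approach above can be sketched as follows. The numbers echo the comment but are otherwise hypothetical; the spread figure assumes per-task errors are independent, in which case they partially cancel and the total is relatively more predictable than any single task:

```python
import math

# Hypothetical per-widget estimates echoing the comment:
# one day to build, half a day to style, one day to integrate.
per_widget_days = {"build": 1.0, "style": 0.5, "integrate": 1.0}
widget_count = 100

total = widget_count * sum(per_widget_days.values())

# Assumed 50% relative spread per widget; for n independent tasks the
# relative spread of the sum shrinks by a factor of sqrt(n).
relative_spread = 0.5 / math.sqrt(widget_count)
print(f"estimate: {total:.0f} days ± {relative_spread:.0%}")  # estimate: 250 days ± 5%
```

The cancellation argument is also why summing many small estimates tends to beat one big gut-feel number, provided the task list itself is complete.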

hey folks,
this is one of the things I have to face daily. How do I give my manager an estimate for finding a new method that I don't yet know myself?
The rule of thumb "take the estimate of a developer, double it and add a bit" is probably the cause of consistently having Parkinson's law in action on every single project. I wish project managers would stop thinking this way.
Can't agree with you more!
It's absolutely true. I'm a developer and I can't estimate even simple tasks, because I want to do each task perfectly, not quickly.
Here's a thought:

If one were to reduce the steps required for the analysis and development of a specific task to algorithmic steps performed by a Turing machine, then the problem of estimating how long it would take (and even if it completes at all), for the general case of any task, is reduced to the Halting Problem.

This is, therefore, unsolvable, and any deterministic estimation strategy one may employ is just a heuristic that may work better or worse under specific domain constraints, but offers no real insight if applied to the general case, or to a new, random task that one has never done before and has to solve without relying on precedent.
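The reduction this commenter describes can be sketched with the classic diagonal argument. Everything here (`halts`, `make_contrarian`) is a hypothetical illustration, not a real API: given any claimed perfect completion-predictor, we can construct a task it must mis-predict.

```python
def make_contrarian(halts):
    """Given any claimed halting-predictor, build a task that defies it."""
    def contrarian():
        if halts(contrarian):
            while True:       # predictor said "finishes", so never finish
                pass
        return "done"         # predictor said "runs forever", so finish at once
    return contrarian

# Whatever verdict a candidate predictor gives about its own contrarian
# task, reality is the opposite, so no perfect predictor can exist.
for verdict in (True, False):
    task = make_contrarian(lambda f, v=verdict: v)
    print(f"predictor says finishes={verdict}; by construction, "
          f"the task actually finishes={not verdict}")
```

This is only the general-case impossibility, of course; as the commenter notes, domain-specific heuristics can still work well in practice.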

Hello Ash, I wrote an article (in Italian) inspired by your post. I mentioned you. I will check your posts for other inspiration.
Thanks

Here the article:
http://i3factory.com/it/stimare-il-tempo-di-sviluppo-software-come-per-app-pe...

My "getting into orbit" remark comes from the historical factoid that the Program Evaluation and Review Technique (a/k/a Critical Path Method scheduling, the technique built into MS Project) was invented for the Mercury space program. Scheduling is, in fact, rocket science, even when we apply it to excellent earth-bound goals like making work more fun.
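For the curious, the core of the Critical Path technique this commenter mentions is just the longest path through the task dependency graph. A minimal sketch, with hypothetical tasks and durations:

```python
from functools import lru_cache

# Hypothetical task graph: (duration in days, prerequisite tasks).
tasks = {
    "design":    (3, []),
    "backend":   (5, ["design"]),
    "frontend":  (4, ["design"]),
    "integrate": (2, ["backend", "frontend"]),
    "deploy":    (1, ["integrate"]),
}

@lru_cache(maxsize=None)
def earliest_finish(name):
    # A task finishes after its own duration plus the latest prerequisite.
    duration, deps = tasks[name]
    return duration + max((earliest_finish(dep) for dep in deps), default=0)

# Project length = earliest finish of the final task; "backend" (not
# "frontend") sits on the critical path because it's the longer branch.
print(earliest_finish("deploy"))  # 3 + 5 + 2 + 1 = 11
```

Full PERT/CPM tooling adds slack calculation and probabilistic durations on top, but the longest-path recursion is the heart of it.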

Back in the day, I was at a few companies where we made estimates using best, expected and worst cases. This allows a developer to account for risk. Additionally, there are now statistical measures of software quality that can help with your fudge factor. Bottom line: PMs REALLY pass the buck onto developers here, and just want the developer to pull this complex analysis out of his ear, because it's way easier to blame the developer than take the blame themselves if the estimates are wrong.
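The best/expected/worst-case approach this commenter describes is usually formalised as the PERT three-point formula, E = (O + 4M + P) / 6 with spread (P - O) / 6. A minimal sketch (the feature and its numbers are made up):

```python
def pert(optimistic, most_likely, pessimistic):
    """Classic three-point (PERT) estimate: weighted mean and spread."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical feature: best case 2 days, most likely 4, worst 12.
expected, std_dev = pert(2, 4, 12)
print(f"{expected:.1f} days ± {std_dev:.1f}")  # 5.0 days ± 1.7
```

Note how the long pessimistic tail pulls the expected value above the most-likely guess, which is exactly the risk the single-number estimate hides.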
This is the dumbest thing I've ever heard. You actually take a developer's estimate and double it? Do you assume your developer is an idiot? Developers should know how to estimate their own projects, and THEY are the most qualified people to add any padding, not YOU, their manager. I would find a new job if I found out my manager was taking my estimates and doubling them for no reason. A developer coming in late is almost always due to excessive pressure from management, scope creep, and incomplete or just wrong requirements, not because developers' estimates need to be arbitrarily doubled because they are incapable of estimating accurately. My estimates, to date, have always been accurate within the range specified, and I continually reassess and adjust as knowledge is gained. Knowledge is power, and maybe you should gain some instead of being an idiot. Yes, there are developers who don't know how to estimate, but arbitrarily doubling isn't going to really help you get done if they are stupid... you'll be lucky to get done at all if they're stupid.
Derek - thanks for your thoughtful reply. Nobody has taken so much effort to call me an idiot before, so you must really care about this. May I draw your attention to the paragraph immediately following the well-known "rule of thumb", where I explain some of the costs of this strategy, which is why I don't do it myself. While I didn't state it in the article, two other facts may clarify this: (1) I'm a developer myself, so I understand estimation from a technical perspective, and (2) lacking any empirical data, the most accuracy I ever ask of myself is "hours, days or weeks?", because I don't believe that individual task estimates are valuable, or only rarely so. It's also worth pointing out that padding estimates should _not_ be the responsibility of individual developers. Padding estimates (assuming estimates are being used to set a delivery date) is trading speed of delivery for confidence of delivery by a certain date, eg trading 50% confidence of delivery in 2 weeks for 80% confidence in 3 weeks. This is a project-level concern, not a task-level concern. If estimates are individually padded, and those estimates are used to track progress, anything that causes individual task times to increase puts the chance of meeting the overall project date at serious risk. As I mentioned, this is the reason Critical Chain Project Management adds the padding for the entire project at the end. I don't see how this can easily be applied to software, though, which is why I prefer to collect data and make statistical predictions instead.
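Project-level confidence of the kind described here can be sketched as a Monte Carlo simulation: sample each task's duration, sum, and read off the percentile you need. The tasks and the triangular distribution below are hypothetical stand-ins for real historical data:

```python
import random

random.seed(7)

# Hypothetical tasks: (optimistic, most likely, pessimistic) days each.
tasks = [(1, 2, 5), (2, 3, 8), (1, 4, 10), (3, 5, 9)]

def simulate_project():
    # A triangular distribution is a simple stand-in for per-task uncertainty.
    return sum(random.triangular(low, high, mode) for low, mode, high in tasks)

totals = sorted(simulate_project() for _ in range(10_000))

# Pad the whole project, not each task: quote the date you can hit
# with the confidence you need.
for pct in (50, 80, 95):
    print(f"{pct}% confident delivery within {totals[len(totals) * pct // 100]:.1f} days")
```

The gap between the 50% and 95% dates is exactly the project-level buffer being discussed; hiding that buffer inside individual task estimates makes it invisible and unmanageable.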
Very insightful!!! May I translate your article to Traditional Chinese and share it for educational purposes? Thank you.
Hi Willard. Yes, by all means you may translate the article. If you let me know where it is published, I will update this post with a link. Thanks!