It's so true.
I'm a software engineer, and one of the things I have to do is provide SLOC (Source Lines of Code) estimates for system enhancements. It's largely bullshit before you get into the high-level design phase, but we give it our best estimate anyway. And then the managers put an uncertainty factor on it (it usually starts at 2.0, then decreases over time).
At the end of development, if our final SLOC total is dead on, or, say, 10% above or below what we estimated at the start, do you know what they ask us? "Why were you so far off from your original estimate of *original figure multiplied by 2.0*?"
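To make the mismatch concrete, here's a quick sketch of the arithmetic. All the numbers are made up for illustration:

```python
# Hypothetical numbers, purely for illustration.
engineer_estimate = 10_000   # SLOC we estimate after our own worst-case analysis
uncertainty_factor = 2.0     # management's initial buffer for "gotchas"
padded_estimate = engineer_estimate * uncertainty_factor  # 20,000 SLOC

final_sloc = 10_400          # actual total: 4% over our estimate

# How we score it: against our own estimate.
print(f"vs. engineer estimate: {final_sloc / engineer_estimate - 1:+.0%}")  # +4%

# How management scores it: against the padded figure.
print(f"vs. padded estimate:   {final_sloc / padded_estimate - 1:+.0%}")    # -48%
```

Same project, same outcome, but one comparison says we nailed it and the other says we missed by half.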
As engineers we're taught to consider the "worst-case scenario," so that's already reflected in our estimates. The lines-of-code total only blows up when we accidentally overlook something major in the initial estimate. Generally, if we identify everything we have to address, our estimate ends up higher than the final total, or close to it, since we tend to overestimate a bit. So the 2.0 factor is a buffer for "gotchas," but management treats it as the "actual" estimate we made.