I like the idea of practicing, in the musical or athletic sense, at professional skills to rapidly improve my performance and reduce my error rate. When I was a music major, I spent hours practicing ahead of rehearsals, lessons, and performances. Until recently, I was unable to conceive of how I might do the same for leadership.
Estimating the time and effort of a software project is error-prone, even when one considers how error-prone the activity is. Reasoning about a clearly defined functional scope is a classic "unreasonably difficult even when accounting for how difficult it is" problem. Everyone wants to know how many days a project will take, but few try to understand the myriad, interconnected ways the number of days can grow surprisingly large. (More on that later.)
What I figured out is that practicing at estimates is as “repeatable and atomic” (see below) as scales are for music. The material one needs as input for estimation practice is available on the marketing websites of the countless startup, technology, and software product companies out in the world. Any number of feature or pricing pages are interesting enough to ask myself, “how would I start building the functionality or feature advertised here?” Then, do a 15–60 minute exercise of breaking it down, thinking about scenarios, discovering risks and dependencies, laying out a plan, and even sketching the high-level technical design.
This is even easier if I start from products I already know well from daily use. Most developers know GitHub's features pretty well. Starting from their Pull Requests marketing page is a familiar starting point. This lets one work on the mechanics of thinking holistically about the functional scope. Then, I can get down to details about how to build a (hypothetical) wholly new capability or feature into a (hypothetical) product.
This insight allowed me to move past an “I guess estimating is a struggle bus that could limit my growth” mindset. Now I feel like “I can get as good as I want at this by doing the reps”, much like practicing music or free throws.
The recipe for practicing at estimating
- Think up some kind of project I might do in the future, or
- Find a software product page on the web and use that as an imaginary requirements document
- Spend 20 minutes going through the motions of how I like to estimate projects (break down scopes, think through happy/edge cases)
- Spend 10 minutes breaking down the work, identifying the risks, etc. – use an outline, sticky notes, a whiteboard, whatever works
- Reflect on what I came up with, how I’d tackle it differently next time
- Rinse and repeat. Take a different approach to the same project. Estimate a different project. Try the exercise with a few of my favorite colleagues/collaborators and see how it comes out differently.
The most important thing to keep in mind: formality is not required when practicing. It’s about low stakes and fast iteration.
Practice makes estimating less imperfect
Practice can make software estimates more useful. Not necessarily more prescient. But I can definitely produce better outcomes by practicing the process of planning and estimating software.
Practice works because it is atomic and repeatable. I’m using database terminology a little oddly here.
Atomic, in the sense of database transactions, means I can "return to square one" at my discretion without imposing on anyone else. To carry the metaphor, practice sessions are more atomic than rehearsals or performances. I can throw away the results of my estimation practice session without ruining anyone’s project/roadmap/schedule.
Repeatable means I can estimate similar projects over and over again until I’m satisfied I’ve reached the level of skill mastery necessary. To carry the metaphor, I practice a passage or shoot free throws until I’m confident I can perform it correctly the vast majority of the time.
The great thing about practicing at estimating fake projects is it will make me better at estimating real projects. Like they say: practice makes perfect.
Equally great, I’m not accountable for fake projects. I don’t have to worry about breaking things down incoherently or overlooking a key bit of functionality. Hopefully, I find it upon review and reach a state of coherent, complete estimates. But the risk of one individual practice estimate ruining someone’s working week is non-existent.
The web is rife with practice material. Grab a short product description, pitch, etc. Marketing and feature pages on a favorite app’s website work great. Previous projects or imagined software work too. Maybe jot down prior assumptions, e.g., I’ve already built authentication or a mobile app or payments.
With a “practice” functional scope and some constraints in mind, it’s time to start practicing. Ask where the seams are, how to split things up, what are the risks, could this part be omitted, do we have prior art for this, etc. Make a list or board or mind-map or whatever. Organize it by concept or how I would tackle the project. Look for gaps in how I would explain the plan or pitch it to my team. These all work well for me, but you may estimate differently. That’s fine, and the point! Try different approaches and see what is effective, efficient, and generative.
I can practice at taking the idea for a software project, feature, enhancement, etc. and turning it into a plan for how to build it. When I practice at this, I improve at finding risks, breaking down problems, writing down ideas, coming up with novel approaches, considering how to apply technology to solve problems, deciding which parts of the problem to focus on, etc. All of this yields better plans for building the project, which leads to better execution. Somewhere in the middle of those plans comes estimation, which I’m getting more reps at, so hopefully I’m getting better at it.
Planning and estimating software forces me to turn over stones I might otherwise overlook. It gives me an opportunity to tackle kinds of software I may not otherwise build, or at least to learn at a high level how such software gets built. For example, I have no idea how to build a game, but practicing estimating a game project sends me down a discovery process that will certainly teach me something.
Rinse and repeat. Do this over and over until you are as good as you want to be at estimating software. Maybe do this with your team. It’s a little harder, but teaching everyone to do it will raise all ships.
A crucial element of practice is immediate feedback: “how’d I do?” Practicing at estimation gives me one kind of immediate feedback, revealing more of the puzzle as I go. On the other hand, I (still) don’t get the crucial feedback that makes software estimation so difficult in the first place: measurement of accuracy and precision. Software estimation is hard because many of the factors that determine how long building software takes are invisible and unpredictable.
Reflections on practicing at estimating software projects
So, I found it possible to quickly gain experience at the “guessing at numbers” part of estimation. This bit is amenable to iteration – I can work through a stack of software project ideas, call upon my experience, and guess at what the tasks are and how long they might take. I did several of these over the course of a couple of weeks. I found it effective enough that I no longer felt like I was “working from behind” when called upon to estimate the effort and time for a project at work.
Lately, I’m using Jacob Kaplan-Moss’ approach to estimation. List out the tasks, score them by effort and risk/variance. Add up the numbers to get a sum. Decide that number is fine, or investigate the critical tasks (large effort or high variance) to get a better idea of how to break them into less risky tasks.
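That score-and-sum pass is mechanical enough to sketch in code. Here’s a minimal illustration of the idea – the task names, effort numbers (days), and risk multipliers are all made up for the example, not part of Kaplan-Moss’ actual method:

```python
# Hypothetical sketch of the score-and-sum estimation pass.
# Risk multipliers and thresholds below are illustrative assumptions.
RISK_MULTIPLIER = {"low": 1.1, "medium": 1.5, "high": 2.0}

tasks = [
    # (task name, effort in days, risk/variance)
    ("design data model", 2, "low"),
    ("build API endpoints", 5, "medium"),
    ("integrate payments", 3, "high"),
]

def estimate(tasks):
    total = 0.0
    critical = []
    for name, effort, risk in tasks:
        total += effort * RISK_MULTIPLIER[risk]
        # Large effort or high variance flags a task worth breaking down further.
        if effort >= 5 or risk == "high":
            critical.append(name)
    return total, critical

total, critical = estimate(tasks)
print(round(total, 1))  # 15.7
print(critical)         # ['build API endpoints', 'integrate payments']
```

The useful output isn’t just the sum; it’s the list of critical tasks telling me where to go investigate and decompose before trusting the number.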
Practice makes challenging activities something you can rinse and repeat.
Key caution: this kind of iterative improvement is not the same kind of activity as Platonic practice activities like shooting free throws or methodically learning a piece of music. To use a coarse analogy, practicing at estimating software is like practicing free throws, but I can’t see or hear the ball after I release it. There’s no feedback loop: I can’t know if the shot went in, bounced off the backboard, or missed entirely. In other words, everything that happens after the initial estimate is the wicked problem that defies both practice and systemic, industry-wide improvement.
On the other hand, this form of the activity does look like basketball or musical practice because it’s in a safe bubble. My practice at estimating doesn’t put real projects with real people working on them at risk. If I try something unusual, no one has to put up with it later. This affords experimentation and trying multiple approaches to the same problem. I find this is the key insight – for some non-coding leadership/management activities, it’s possible to practice at some part of the activity and gain confidence in running that part. Possibly not the whole thing, but at least a slice of the process can be improved through iteration and experimentation without relying on live people and live work.
In short…
Estimating software projects is a relatively easy, learnable planning/brainstorming creative task coupled with wicked expectations of the ability to predict the future in terms of outcomes, unknowns, risks, and things that just don’t go my way. I can practice at the first part.
No software estimate is perfect. The world is full of surprises. Focus on the easy part: decomposition and discovery. Tackle all the execution stuff downstream separately.
- This thinking was inspired by an Andy Matuschak Patreon post on practicing at sight reading on piano ↩
- Double bass performance, orchestral, first year. I switched because it was far easier for me to spend six hours in front of a computer than in a practice room. Caveat raptor. ↩
- Even multiplying estimates by two is considered a pretty-okay practice, but is often insufficient for a reasonable, let alone good, answer. ↩
- Which I have done very little practice at and, accordingly, am not very good at. ↩
- Whether a marketing page is a better product spec than what you receive for normal projects in your org is a matter of variance. ↩
- Caveat: this isn’t intense concert pianist/pro athlete level practice. If you do end up producing >99% perfect estimates, please let all of us know your secret! ↩
- It may prove that what I learn is how little I know about actually building game software! ↩
- It’s perhaps not that bad. Perhaps it’s more like the difference between shooting free throws and scoring field goals in a basketball game. I have no hope at the latter if I can’t perform the former at a high level. ↩
- Sorry ‘bout that! ↩