The Myth of "It Should Be Simple"

I've spent a lot of time over the past few years reading forum posts and submissions to tell_fs@microsoft.com. One thing that consistently catches my attention is the assertion that whatever is being requested "should be simple" to do. I considered providing examples from the past month or so but there were just too many.

I think what people really mean is "I hope it's simple to do because then maybe Microsoft will do it." At least I hope that's what they mean, because if they really believe it's simple they are setting themselves up to be disappointed. The fact is, I've never seen a suggestion anywhere that I'd judge as "simple" to do.

Why? The answer to that question is simple: we develop software. Every idea or feature we consider has to be looked at with respect to standard phases of software development: design, engineering, testing and support. (Various software methodologies may use different terms but the concepts are the same.) Below I've listed the questions we ask when considering any new idea or feature.

Design: What exactly is the feature? How should it work? I see a lot of single-sentence suggestions like "it should be simple for ATC to issue speed restrictions to aircraft." There is a whole host of issues lurking behind that request. When should speed restrictions be used? Always? Only during heavy periods? How should speeds be calculated? Based on aircraft type? Model? A look-up table? How will it work with 3rd-party aircraft that haven't been developed yet? What if users can't maintain the assigned speed? What if they ignore the instruction? Should ATC cancel their clearance? Try to adjust the flow for other aircraft? And finally, what value does this feature bring to the different customers that use the product?

Engineering: Somebody's got to code this feature. How long will it take? Does it impact other systems? That's a question that often delivers a quick death to feature requests, because an idea that seems simple reaches deep into other systems, requiring changes there. This gets to the question of risk. If you have one system that works perfectly well, would changing it to accommodate a new feature risk breaking something? What about content? Does the feature require new assets such as art or audio? What about UI? How will the user interact with the feature? Does it need a menu option? A new dialog? For every single feature we develop, we have to boil the answers to all these questions down to the notion of cost (in person-days, cash outlays and risk) and weigh it against the value computed during the design phase.
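To make that cost-versus-value weighing concrete, here is a back-of-the-envelope sketch. Every name and number in it is invented for illustration; real estimates would come out of the design and engineering reviews described above.

```python
# Hypothetical sketch of weighing a feature's cost against its value.
# All figures are made up; this is an illustration, not our actual process.

def feature_score(value, person_days, cash_outlay, risk_factor,
                  day_rate=1000):
    """Return estimated value minus total cost; positive means the
    feature may be worth pursuing.

    risk_factor scales the cost upward to account for the chance of
    breaking existing, working systems.
    """
    cost = (person_days * day_rate + cash_outlay) * (1 + risk_factor)
    return value - cost

# "ATC speed restrictions" example: large engineering effort, deep
# reach into other systems (high risk), moderate estimated value.
score = feature_score(value=50_000, person_days=60,
                      cash_outlay=5_000, risk_factor=0.5)
print(score)  # negative: as estimated, cost outweighs value
```

The point of the sketch is only that risk multiplies cost: a feature that "reaches deep into other systems" gets more expensive even when the raw coding effort stays the same.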

Testing: Everything's got to be tested. We use a rigorous testing methodology at Microsoft. That means we don't just mess around with the software waiting for it to break. It means that every feature or feature area has to have a test plan and test cases. The test plan outlines the expected results that we believe will yield a quality experience. What is the user experience for this feature? What are the parameters and their allowable ranges? What happens if the user tries to use the feature outside its designed range? The test team then needs to create test cases that target these scenarios and yield a clear pass/fail result. Creating a test case takes time for design, implementation and execution. Plus, the more you test the more bugs you find; the more bugs you find the more you need to touch the code to fix them; the more you do that, the greater the risk of breaking something else.
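To make the pass/fail idea concrete, here is a minimal sketch of what two test cases for the hypothetical speed-restriction feature might look like. The function under test and its allowable range are invented for illustration.

```python
# Minimal sketch of pass/fail test cases for a hypothetical feature.
# The function and its limits are invented; real test cases come from
# the test plan's parameters and allowable ranges.

def assign_speed_restriction(aircraft_type, requested_knots):
    """Toy implementation: accept speeds only within an allowable range."""
    min_knots, max_knots = 160, 250   # invented allowable range
    if not min_knots <= requested_knots <= max_knots:
        raise ValueError("speed outside designed range")
    return requested_knots

def test_in_range_is_accepted():
    # A value inside the designed range must be assigned as-is.
    assert assign_speed_restriction("B737", 210) == 210

def test_out_of_range_is_rejected():
    # Outside the designed range must fail clearly, not silently.
    try:
        assign_speed_restriction("B737", 90)
    except ValueError:
        pass  # expected: a clear, unambiguous fail result
    else:
        raise AssertionError("out-of-range speed was accepted")

test_in_range_is_accepted()
test_out_of_range_is_rejected()
print("all test cases passed")
```

Each case targets one scenario from the test plan (in range, out of range) and yields an unambiguous pass or fail, which is what lets a test team track quality across hundreds of features.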

Support: Supporting a new feature encompasses both educating users about it and answering their questions when they get stuck. How much documentation does the feature need? (Plus the cost of any new document is multiplied by the number of languages we need to translate it into.) Is it likely to generate support calls? (If so, go back to the design to see if the feature is well understood.) What about future versions? When will the feature likely be re-written or dropped due to major changes in the product?

I remember watching a Frontline episode before the 2004 US Presidential election that compared the two candidates' lives and careers. It featured interviews with many of the candidates' friends and colleagues. John Kerry's former chief counsel, Jack Blum, described him as someone who would "tell you that the problem is much more complicated than you think it is." Software development is like that. The approach didn't serve Kerry very well, but we don't have the choice of expressing things in simple, black-and-white terms. So keep those suggestions coming but, if you don't mind, let us decide if they're really simple.