The Fundamental Tradeoffs
Updated: January 5, 2004
Jesper M. Johansson
Security Program Manager
Welcome to the Security Management column on Microsoft TechNet. This is a new column dedicated to the administration of system and network security. While the column does not have a set schedule, you can expect a new installment every month or two. The goal is to show you how to improve the security of your network and to help you understand the kinds of security problems that are commonly encountered. It is also about how you can improve your security posture while maintaining the high level of availability that your users and management have come to expect. Information security (infosec), of which network security is a fundamental part, is a process. It is not a finite state, and it is not something you can turn on and be done with. Infosec is a precious baby that needs a lot of tender loving care, cuddling, and the occasional wiping down. Part of the focus of this column series is the ongoing work you need to perform to improve the security of your network. In this first article, I will outline the fundamental tradeoffs that need to be considered to better protect networks.
The Fundamental Tradeoff
Before I came to Microsoft I spent 12 years administering networks of varying sizes in a part- or full-time capacity. Throughout those 12 years, one thing became increasingly obvious to me: nobody will ever call you to let you know how well the network is working. Never in my 12 years of network administration did I get a phone call to let me know the e-mail was working, that users could print without glitches, and that files were available without problems. The phone calls I received were always at 0500 on Saturday morning, with the caller screaming about the network being down. That taught me two things:
The people who called me at 0500 were usually the ones that broke the network in the first place.
Information technology is working properly only when users can stop thinking about how or why it works.
While there is no sustained learning in the first observation, the second is an example of what I call the "principle of transparency." Users, unlike many administrators, are not interested in technology for technology's sake. In fact, they are not interested in technology at all. Users just want the technology to work, so they can get their jobs done, without having to think about why or how. The ultimate challenge for information technology is to be invisible – completely transparent to the user. Every time users have to think about the technology, it is because something is not working the way it should or because they cannot access resources they want. When managers have to think about technology, it is often because they need to spend more money on it. Fundamentally, the network administrator's job is to make himself or herself invisible.
So, how does that relate to security? The problem is that while network administration is about ensuring that users can get to everything they need, security is about restricting access to things. A colleague of mine used to quip "Got an access denied? Good, the security is working." That means that security administration is fundamentally opposed to network administration – they are, in fact, conflicting goals. Hence, we have an elemental tradeoff that we need to consider.
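The quip can be made concrete with a minimal sketch. The resource and user names below are hypothetical, invented purely for illustration: a deny-by-default access check is maximally safe, and every missing grant surfaces as an "access denied" that the user experiences as the network not working.

```python
# A deny-by-default access check: anything not explicitly granted is
# refused. Resource and user names are made up for illustration.
ACL = {
    "payroll.xls": {"alice"},
    "readme.txt": {"alice", "bob"},
}

def can_read(user, resource):
    # Secure by default: unknown resources and unlisted users are denied.
    # Every denial here is also a potential help-desk call -- the
    # usability cost of the security posture.
    return user in ACL.get(resource, set())

print(can_read("alice", "payroll.xls"))  # True: explicitly granted
print(can_read("bob", "payroll.xls"))    # False: "the security is working"
```

The design choice is the point: the default case (`set()`) denies, so forgetting to grant access fails safe – and generates the phone calls the article describes.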
Transparency can take many forms. The technology should be easy to use. However, technology acceptance research, by Professor Fred Davis in Management Information Systems, has shown that technology also needs to be useful – that it needs to have some kind of compelling functionality – to be accepted by users. I will group all of these concepts into the term "usable." Essentially, the tradeoff is between security and usability. The most secure system is one that is disconnected and locked in a safe.
This has implications for all software technologies. When you install an application onto any operating system, you enable additional functionality, which may make the system less secure because it increases the system's attack surface. In a later column we will discuss the environmental aspects of security hardening and look at how you analyze the usage scenario to optimally harden a system.
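One rough, measurable proxy for attack surface is the number of services listening on a machine: every application that opens a port gives an attacker something new to talk to. The helper below is an illustrative sketch, not a Windows or Microsoft tool, and real attack-surface analysis also covers privileges, file parsers, RPC interfaces, and more.

```python
import socket

def reachable_ports(ports, host="127.0.0.1", timeout=0.2):
    """Return the subset of `ports` that accept a TCP connection.

    Each reachable port is a listening service -- one crude unit of
    attack surface. Illustrative only.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Starting one more "application" grows the surface:
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # the OS picks a free port
server.listen(1)
port = server.getsockname()[1]

print(reachable_ports([port]))  # the new service is now reachable
server.close()
```

Running a scan like this before and after installing an application shows the transparency/security tension directly: each new reachable port is functionality for users and exposure for attackers at the same time.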
We can make any technology more secure, but by doing so we will probably make it less usable. So, how do we make it more secure and more usable? This is where the third axis of the tradeoff comes into play. Any good engineer is familiar with the principle of "good, fast, and cheap." You get to pick any two.
Recently, I was visiting a customer to help them design a network architecture for security. During the discussion, it became clear that people were struggling with the tradeoff between security and usability. By making the network more secure in some ways, they would have to make it less usable in other ways. After about 15 minutes of this discussion I went up to the whiteboard and wrote:

Secure. Usable. Cheap.
Then I turned to the CIO and told him he gets to pick any two of those. He thought about it for a few seconds and then said "OK. I'll pick secure and usable." All of a sudden, everyone knew what they had to work with, and the discussion turned toward what resources they needed to expend to make the system both secure and usable.
This fundamental tradeoff between security, usability, and cost is extremely important to recognize. Yes, it is possible to have both security and usability, but there is a cost, in terms of money, in terms of time, and in terms of personnel. It is possible to make something both cost-efficient and usable, and making something secure and cost-efficient is not very hard. However, making something both secure and usable takes a lot of effort and thinking. Security takes planning, and it takes resources.
Making system or network administrators manage security is counterproductive, because those roles would then have conflicting incentives. As a system or network administrator, your job is to make systems work – to make the technology function without users having to think about it, to make the technology transparent. As a security administrator, your job is exactly the opposite. Only someone with a split personality could be successful at both. Trying to please both masters is a bit like acting out Dr. Jekyll and Mr. Hyde: what will get you a good performance review in one area is exactly what will cost you points in the other. This can be an issue today, because many who manage infosec are network or system administrators who are also part-time security administrators. Ideally, a security administrator should be someone who understands system and network administration, but whose job it is to think about security first and usability second. This person would need to work closely with the network/system administrator, and obviously the two roles must be staffed by people who can work together. However, conflict is a necessity at the intersection of security and usability. Only by having two people with different objectives will you be able to find the optimal point on the continuum between security and usability for your environment.
There are actually several ways to address this tradeoff. Each vendor's technology is used in many different organizations. If we use "effort" as a proxy for the "cheap" axis of the tradeoff, we can see that the amount of effort the vendor expends in making their technology usable as well as secure will offset the amount of effort customers have to expend on the same task. The equation is effectively:

Customer effort = Total effort required – Vendor effort
The relationship is not directly 1-to-1 because everyone differs in efficiency. In other words, not everything the vendor does to make the product more secure and usable will actually benefit the customer. However, some portion of the effort that a vendor expends on making the product more secure and usable will benefit customers.
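That caveat can be folded into the effort relationship as a simple discount factor. The numbers and the 0.8 default below are illustrative assumptions, not figures from this article:

```python
def customer_effort(total_effort, vendor_effort, efficiency=0.8):
    """Effort left for the customer once the vendor has done its part.

    `efficiency` (between 0 and 1) models the not-quite-1-to-1
    relationship: only part of the vendor's usability and security
    work benefits any particular customer. Units are arbitrary.
    """
    return max(0.0, total_effort - efficiency * vendor_effort)

print(customer_effort(100, 50))       # 60.0 -- vendor work offsets part of the cost
print(customer_effort(100, 50, 1.0))  # 50.0 -- the ideal 1-to-1 case
```

The teeter-totter effect described below falls straight out of the formula: for a fixed total, every unit of vendor effort removes up to one unit of customer effort.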
To see an example of this, one needs to look no further than IPSec in Windows 2000 and higher. IPSec is arguably one of the most useful security technologies available in Windows and many non-Windows operating systems. For example, IPSec was one of the fundamental protection mechanisms used in Microsoft's successful entry in eWeek's OpenHack IV competition in 2002. (For more information on how the Microsoft entry in OpenHack IV was protected, see http://msdn.microsoft.com/library/en-us/dnnetsec/html/openhack.asp.) It is incredibly versatile. It is also the poster child for user unfriendliness. Many people never get past the clunky user interface, and those who do usually run into one of the truisms about IPSec: it is a lot better at blocking traffic than it is at allowing traffic. There are few analysis tools to help figure out why traffic is not making it through. In the most recent release of Windows, Windows Server 2003, the network monitor was enhanced to parse IPSec traffic, greatly decreasing the troubleshooting effort customers need to invest to understand IPSec. The more effort Microsoft expends on making IPSec usable, the less deployment effort customers have to expend, thus decreasing the cost of making networks both secure and usable. What we have is a teeter-totter effect between vendor cost and customer cost.
What this really means is that you very often get what you pay for. A product that costs more should also be more secure and usable than a product that costs less. There are of course other factors that come into play here, but these tradeoffs hold in general.
Security administrators face some interesting tradeoffs. Fundamentally, the choice is between a system that is secure and usable, one that is secure and cheap, or one that is cheap and usable. We cannot have everything. The best practice is not to make the same person responsible for both security and system administration. The goals of those two tasks are too often in conflict for one person to succeed at both.