Security Watch: The Challenge of Information Security Management, Part 1
Jesper M. Johansson
Can We Get Rid of the Technology?
What We Really Protect
The Defender's Dilemma
Keeping the Lights Off
When You Neglect Information Security
Accountability at the Government Level
Inappropriate Acceptance of Risk
For a while now, I have been thinking about the broader challenge of Information Security (InfoSec) management in an organization. It seems the older I get, the farther away I get from technology—or, rather, the more I realize technology is fundamentally not the solution to our problems. Technology, in so many ways, is the problem. In fact, InfoSec management is almost exclusively about preventing the kinds of problems we get into with technology.
I originally thought it would be interesting to write a book on this topic, but I eventually decided that a series of essays would be more appropriate. And here's where these essays will live. Over the next year or so, I will use the Security Watch column to occasionally visit the topic of Information Security Management. In this first installment of the series, I look at the fundamental principles of InfoSec management.
Can We Get Rid of the Technology?
Technology, for good and for bad, is inevitable today. Technology empowers organizations to do more, faster and more efficiently. It also enables criminals to do the same thing. And as much as many of us in the technology fields would like to believe that technology exists for technology's own sake, it does not. Technology exists to store, process, and transmit data—data that can be turned into information. Data and the information we derive from it are our truly valuable resources.
Shouldn't we, as security professionals, then call ourselves "data security" professionals? Are we not more accurately engaged in protecting data? In many ways, this is true, but data is just a raw material. Certainly it has value on its own, but it is putting the data together and creating information from it that really provides value. Therefore, we work at a layer of abstraction slightly higher than data. I will use the term "information" (except when I really wish to speak about raw materials) because that is what we eventually turn the data into.
Many technology professionals work at layers of abstraction below data—the bits, bytes, and electron layers. We like technology for technology's own sake. It is very often why we went into the technology field in the first place. Technology is more comfortable for us than, say, people. However, as Information Security professionals, this is dangerous because we must have a far broader scope than just technology. While Information Security professionals must be experts in one or more technology domains, we must look at the whole ecosystem of technology, information, people, and processes—the four pillars of information security. We must, by necessity, focus on all four pillars. The truth is, however, that many people ignore one or more.
Several years ago, Microsoft embarked on a Trustworthy Computing campaign, proclaiming that security is people, process, and technology. Many were skeptical, saying that Microsoft was trying to shift focus from poor security technologies to factors it could not control. That was an unfair analysis, though, considering the security of Microsoft solutions, particularly in Windows XP SP2 and more recent operating systems, is actually quite good. Windows Vista and Windows Server 2008, and particularly Windows 7 and Windows Server 2008 R2, are exemplary in most respects when it comes to security and are head and shoulders above any competing solution.
Another, kinder, take was that the People, Process, and Technology mantra was simply stating that InfoSec professionals need to consider people and process, as well.
However, the people-process-technology triad failed to take into account the basic purpose of all we do. The process is how people use technology to turn data into information so that they can make informed decisions.
What We Really Protect
Why, then, do I say that information, and the data it is derived from, is our only valuable resource? Clearly, a more technologically advanced company with better processes is more capable of achieving its objectives than one without. I think we can take that as a given. But from a security perspective, process and technology are problematic. Technology, by definition, adds complexity, and complexity breeds insecurity. Components of a system must interact, and the number of possible interactions grows roughly with the square of the number of components. Each of those interactions creates new avenues to transmit data, new mechanisms to process it, and new storage locations. All of those elements are potential points of weakness, and together they reduce our ability to visualize the technology and the problems it may pose. The less complex a technology, the easier it is to understand and secure. In fact, the inability to easily visualize and understand the scope of technologies in use in an organization is one of the key causes of information security problems. As InfoSec professionals, we should consider reducing the amount of technology being used, not increasing it.
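To make the complexity point concrete: among n components there are n(n-1)/2 possible pairwise interactions, so doubling the component count roughly quadruples the number of interaction points to secure. A quick sketch (component counts are hypothetical):

```python
# Pairwise interactions among n components: n * (n - 1) / 2.
# Each interaction is a potential data path that must be understood and secured.
def interaction_count(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 20, 40):
    print(n, "components ->", interaction_count(n), "possible interactions")
```

Doubling from 20 to 40 components takes the interaction count from 190 to 780, which is why trimming technology pays off disproportionately.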
Processes are also complex, nebulous, and, in most cases, not sensitive in and of themselves. A process may provide competitive advantage, but a process cannot be copied per se. It is the information documenting the process that can be copied. A process is ephemeral—it disappears. Only by turning it into data can one copy it. This is similar to music. Music exists only temporally as we listen to it. To enable people to listen to music over and over again, Thomas Edison invented the phonograph (an ingenious piece of technology that permitted us to record the process of creating music as data so that people could listen to it again and again).
Now, more than one hundred years later, people of elastic morals and questionable virtue steal the data representation to feed the process of playing music. The music industry, in a failed attempt at thwarting this trend, turned to security technologies to obfuscate the process of re-creating music. The result is primarily to make it more difficult for its legitimate customers to enjoy their music. This technology, known as Digital Rights Management (DRM), is practically useless in combating theft, although it is very successful at combating customer satisfaction. Looking at this rationally, you realize that the problem is really about securing information, not processes. In some cases, as with music, the data essentially cannot be protected, and its owners must simply accept that fact and move on.
This leads to a number of interesting observations. For example, can't you just obfuscate the process? Can't you make the process of creating value from data a secret? Can't you create a secret algorithm, as it were? Well, you can, but it would usually do no good. Claude Shannon put this very bluntly as "the enemy knows the system," a pronouncement commonly known as Shannon's Maxim.
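Shannon's Maxim is why modern cryptography puts all of its secrecy in the key and none in the algorithm. A toy sketch of that principle (a deliberately insecure XOR cipher, for illustration only; the key value is hypothetical):

```python
# Toy illustration of Shannon's Maxim: the algorithm (repeating-key XOR)
# is fully public; the only secret is the key. This is NOT real
# cryptography -- use a vetted library for actual encryption.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"k3y"                      # the only secret in the system
message = b"informed decisions"
ciphertext = xor_cipher(message, secret_key)

# The enemy knows the system: the same public algorithm decrypts,
# but without the key the ciphertext is (in principle) useless.
assert xor_cipher(ciphertext, secret_key) == message
```

The point is structural, not cryptographic: if your protection survives full publication of the algorithm, you are relying on something you can actually keep secret.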
What does this mean? The algorithm itself typically cannot remain secret. The process will become known and the competitive edge comes from the difficulty of implementing the process. Information Security must, once again, focus on protecting information as the core valuable asset. Securing processes is only interesting insofar as it provides protection for the information.
The Defender's Dilemma
This leads me to the "defender's dilemma." Your job is to protect all your data, including where it is exposed in all interactions between your components, under the assumption that the bad guys know the system. The bad guys, on the other hand, need to find only one way to compromise your system. They do not need multiple copies of your data. One copy will suffice, regardless of how they get it. Just as one copy of an MP3 file is enough to fuel the entire criminal ecosystem, so is a single copy of your data enough for your adversaries. Furthermore, theft using any one approach cannot be undone. There is no undo switch in cyberspace.
The defender, therefore, must protect all possible points. You must provide adequate protection for data at rest and in flight, on all devices, no matter who owns the devices. The more places information is stored, processed, and transmitted, the more difficult your job becomes. Complexity is the enemy of security!
Note the use of the term "possible." In many cases, such as with DRM technology, it is impossible to provide protection for technical reasons. In such cases, the defenders must find a way to live with a "lossy" system or prevent all access to the data. In general, data that is distributed to would-be adversaries can never be protected. If data needs to be protected, the only way to do so is to never distribute it.
Because of our natural aversion to complexity, the InfoSec function is often perceived as a roadblock. Those of us in the InfoSec field often, and largely correctly, view our jobs as preventing access to things.
The InfoSec group is where you go if you want your project to be killed. The evil security guys will always try to stop you from doing what you want. That is often true, and it is unfortunate, but not because the evil security guys shouldn't try to minimize complexity and stop bad ideas. It is unfortunate because it highlights a fundamental disconnect between the business and the security group.
Keeping the Lights Off
If you have ever studied for an information security exam, you've undoubtedly learned about the CIA triad—Confidentiality, Integrity, and Availability. It provides a way to describe the objectives in protecting information. We must provide confidentiality of the information against those who should not have access to it, integrity to ensure that the information is accurate, and availability for those who need access to the information. Consequently, some security groups consider much of their job to be about ensuring that the lights stay on, that availability is assured. Others assume the exact opposite approach and consider their jobs to be keeping the lights off—preventing anyone, including those with legitimate needs, from accessing the information.
I would argue that, if you have an appropriate partnership with the business, the security group can ignore the availability part of the triad. The business has other people who are more expert in availability and service level agreements. If security simply partners with them and makes sure that an adequate trade-off between confidentiality/integrity and availability is reached, the security function can consider availability to be of secondary importance. Security then becomes, to some extent, about ensuring the lights are off and stay off, while taking business needs into account. The business, left to its own devices, will ensure that the lights stay on. Security just needs to help the business ensure that only the correct lights stay on and the others stay off. Security, in a sense, is then a reasonable whitelisting function.
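The "correct lights on" idea is, in essence, a default-deny allow-list: access is refused unless the business has explicitly said that light should be on. A minimal sketch (the role and resource names are hypothetical):

```python
# Default-deny whitelisting: a "light" is on only if explicitly listed.
# (role, resource) pairs below are illustrative, not a real policy.
ALLOWED = {
    ("analyst", "sales-reports"),
    ("analyst", "customer-counts"),
    ("dba", "customer-database"),
}

def is_allowed(role: str, resource: str) -> bool:
    # Anything not explicitly granted stays off.
    return (role, resource) in ALLOWED

print(is_allowed("analyst", "sales-reports"))      # allowed
print(is_allowed("analyst", "customer-database"))  # denied by default
```

The design choice worth noting is the default: the business enumerates what must be on, and security never has to enumerate everything that must be off.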
This is one of the things I find most fascinating when working with other security people. We walk into a meeting with the business executives. The executives say, "We need you to help secure our product." The security guys say, "OK, tell me about the product." The business folks say, "It's a widget," at which point the security guys immediately start telling them how to secure widgets.
What is missing here? Sure, the security guys tried to learn what the product was. But what is the business objective? What is the business trying to achieve with this widget? How valuable is it to the business? How strategic is it? How important is it? How much risk is the business willing to accept to get it done? Do the business folks even want to build it?
The security group so rarely knows the business. Yet the security folks pretend to understand so they can tell the business how to do things. It is not our job as InfoSec professionals to tell the rest of the organization how to run a business. It is merely our job to inform the business as to the correct set of lights to turn on, and which ones must stay off, in accordance with the business' tolerance for risk and its needs. We support and advise the business on how to achieve its objectives with an acceptable level of risk—but the objectives are still owned by the business, not by the InfoSec group.
When You Neglect Information Security
I have finally arrived at risk. InfoSec is really information risk management. We manage the risk to our information assets. Or, at least, we are supposed to. But in many cases, we simply are not allowed to. Security is a really hard sell because the upside is so unclear. What, after all, is the benefit of security? What is it that constitutes success? Is it merely "we did not get hacked this year"? You can never make that statement and be assured it is correct—you may have been hacked, but failed to notice. In fact, failing to invest enough money and time in InfoSec is a really good way to ensure you won't notice when you are hacked.
There are many potential consequences of neglecting InfoSec and, in some cases, an appalling lack of consequences. In some areas, neglect can cause you to lose your job or result in criminal charges. Failing to protect the privacy of children, for instance, is criminal in the United States under the Children's Online Privacy Protection Act (COPPA), and in other countries, such as Canada and Australia, under national statutes.
For a business that is based on customer trust, and where customers find it obvious that there is a risk, security is a cost of doing business. In a world where people are already nervous, and switching costs are negligible, a single breach can doom a business. A single major and very public breach in the world of online banking, for example, would probably be enough to set online banking back years.
This, however, does not seem to hold in other industries. Take the TJX Companies or Heartland, the credit card processor, for example. Even after appalling levels of neglect for security led to the theft of the credit cards of virtually every American who has one of these cards, the companies are still in business. The TJX breach was made public in early 2007. That same year, the company's CEO, Bernard Cammarata, enjoyed a salary of $911,539 plus about $1.6 million in stock and other compensation. The president of the company, Carol Meyrowitz, earned a posh $7.5 million while overseeing the organization that enabled the largest credit card theft in history and then failed to notice when it happened—a figure that pales in comparison to the cost incurred by her customers and their credit card companies. The exact cost of the Heartland breach is unclear as of this writing. However, I have no doubt its executive management will be handsomely rewarded for the valiant way it dealt with a crisis that was entirely avoidable had it simply implemented the most basic information security measures. Neglect and rationalization are still virtues in far too many companies. Clearly, we have a long way to go when it comes to accountability for InfoSec.
Accountability at the Government Level
On the topic of accountability, I recommend that you read the report from the Center for Strategic & International Studies (CSIS) entitled "Securing Cyberspace for the 44th Presidency." The report, which was published in December 2008, was written by U.S. Representatives James R. Langevin and Michael T. McCaul, along with Microsoft VP of Trustworthy Computing, Scott Charney, and Lieutenant General Harry Raduege, USAF (Ret.). The objective was to lay out a strategy for the incoming Obama administration around cybersecurity. More interesting, however, is the critical assessment of how cybersecurity was neglected under the previous administration; the report states that "cybersecurity is now a major national security problem for the United States."
Interestingly, the report advocates a position long opposed by the software industry, with Microsoft at the forefront of the opposition: the use of procurement rules to drive a desired direction in information technology products—specifically software. The report does not mince words when detailing how important this is. It specifically points out that cybersecurity is a "battle we are losing." Furthermore, it says "weak cybersecurity dilutes our investment in innovation while subsidizing the research and development efforts of foreign competitors." It is no stretch to take that same statement and apply it to almost any organization's InfoSec efforts.
Figure 1 Do you really have to sign your credit cards?
Inappropriate Acceptance of Risk
Many of the failures come from an inappropriate acceptance of risk. Human beings tend to underestimate risks and overestimate the benefits. We particularly underestimate risks in areas we find difficult to understand, such as cyberspace. We can very easily visualize physical risks. Most people lock their cars, even though the potential downside of a car theft is a $500 insurance deductible and being inconvenienced for a few days. We lock the doors to our houses, even in places where burglaries are very rare. However, we throw bank statements in the garbage, literally handing criminals all the information they need to steal everything we own. We sign the backs of our credit cards and store them alongside our checkbooks. Put these ingredients together and a criminal has everything he needs to empty your checking account. It is for that very reason that my credit cards look like the card shown in Figure 1.
One of the most critical aspects of InfoSec management is to have an accurate perception of risk. This is where the InfoSec group comes in. It is up to this core advisory group to help the business understand the risks it is accepting, ensuring the business understands the risks and values them properly. We have traditionally been far too simplistic about how we value risk. In my May 2008 installment of the Security Watch column, entitled "Principles of Quantum Security," I introduced a revised version of the Annual Loss Expectancy (ALE) equation used to assess risk (see Figure 2).
Figure 2 The revised Annual Loss Expectancy (ALE) equation
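For reference, the classic (unrevised) form of the equation is ALE = SLE × ARO, where the Single Loss Expectancy is the asset value times the exposure factor and the ARO is the annualized rate of occurrence. A minimal sketch of that classic form, with hypothetical numbers (the revised equation in Figure 2 refines this):

```python
# Classic Annual Loss Expectancy: ALE = SLE * ARO,
# where SLE = asset_value * exposure_factor.
def ale(asset_value: float, exposure_factor: float, annual_rate: float) -> float:
    sle = asset_value * exposure_factor  # single loss expectancy per incident
    return sle * annual_rate             # expected annual loss

# Hypothetical: a $1M asset, 25% of its value lost per incident,
# and an expected 0.5 incidents per year.
print(ale(1_000_000, 0.25, 0.5))  # 125000.0
```

If an annual countermeasure costs less than the reduction in ALE it buys, it is, in this simplistic model, worth deploying; the column's revised version exists precisely because these inputs are rarely so tidy.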
However, this is not just about valuing risk. Being a trusted advisor goes beyond that. In a future piece in this series on InfoSec Management, I will discuss risk in far more depth.
At this point, we are getting closer to the true purpose of InfoSec. I argue that there are four central aspects, all rooted in the overriding principle that should guide InfoSec:
Keep Risk at an Acceptable Level The job of the InfoSec professionals, first and foremost, is to keep risk at an acceptable level. This includes the things I've discussed here—making sure the lights stay on, turning off the right lights, and generally ensuring that information is available to those who need it and not to those who do not. The core problem with keeping risk at an acceptable level is establishing what is meant by "acceptable." It turns out that acceptable is influenced by many factors, which I will discuss in a later article.
Enable the Business First and foremost, the InfoSec group is a member of the business. Your job is to enable the business, not to stop it. InfoSec professionals who view their role as being to protect the business from itself rarely experience much long-term success. They also will not be very busy as much of the business will simply avoid them and proceed without input from InfoSec, often making even more inappropriate risk management trade-offs. Once again, we are here to protect the business and help it achieve its objectives, while managing risk.
Advise on Risk InfoSec has an operational role, managing network devices, security software, and processes (such as patch management and incident response). This operational side is where InfoSec is often most visible. The part that can have the broadest impact, however, is the advisory capacity. As an advisor, InfoSec should act as an internal consulting resource, lending a security perspective to projects far and wide. This can be a very fulfilling role for InfoSec professionals who enjoy creating direct value for the company.
Establish Risk Management Policies and Processes Finally, InfoSec manages the overall information risk posture through policies and processes. That means that the InfoSec group must assess the risk to the business, establish policies, and define processes. The assessment of risk consists largely of an analysis of how the organization is managing risk and what its preferred posture should be. Based on that philosophy, InfoSec establishes a set of security policies to codify the organization's risk posture and the principles of managing them. These policies are then implemented in a set of processes to assist the organization with compliance.
In this article, I have looked at a basic blueprint for InfoSec management. The key here is to understand that InfoSec is a core part of the business and must be a trusted advisor to the rest of the business. Rather than being in conflict with the business, the InfoSec group must establish a relationship with other parts of the business to help the whole business achieve its objectives with an acceptable level of risk. Only then can InfoSec be effective.
But this is just the start of the conversation. Watch for future installments of the Security Watch column when I continue this series on Information Security Management.
Jesper Johansson is Principal Security Architect for a well-known Fortune 200 company, working on risk-based security vision and security strategy. He is also a contributing editor to TechNet Magazine. His work consists of ensuring security in some of the largest, most distributed systems in the world. He holds a Ph.D. in Management Information Systems, has more than 20 years of experience in security, and is an MVP in Enterprise Security. His latest book is the Windows Server 2008 Security Resource Kit (Microsoft Press, 2008).