Planning

Figure 4 illustrates the primary activities that occur during the Planning Phase. While the other teams are developing images, project plans, and the like, the Security feature team focuses on the existing production environment from a security perspective; the team determines whether changes should be made in the computer images to reduce the risk of new or different security threats to the organization and, if so, how best to incorporate them.

Figure 4. Activities during the Planning Phase

On This Page

Roles and Responsibilities
Desktop Security Planning Considerations
Planning Computer Roles for Security Policies
Planning System Security Settings
Planning User Account Control
Planning Windows Firewall
Planning Network Access Protection
Planning Intrusion Detection
Planning Data Encryption
Planning BitLocker Drive Encryption
Planning Recovery
Planning EFS
Planning RMS
Restricting the Use of Removable Storage Devices
Planning Internet Explorer Security
Planning Application Security
Planning Windows Defender
Non-Microsoft Security Applications
Choosing an Update Methodology
Windows Server Update Services
Infrastructure and Deployment Security
Milestone: Migration Plan Accepted

Roles and Responsibilities

All six role clusters from the MSF Team Model play a role in the Planning Phase of the initiative. Table 1 lists those roles and focus areas relative to the deployment process in the Planning Phase.

For more information about MSF Team Model role clusters, see the MSF home page at http://www.microsoft.com/technet/itsolutions/msf.

Table 1. Roles and Responsibilities During the Planning Phase

Product Management

  • Business requirements analysis

  • Risk analysis

  • Communications plan

Program Management

  • Project plan and project schedule

  • Budget

Development

  • Technology evaluations

  • Vulnerability analysis

  • Logical and physical design

  • Development plan and schedule

  • Establishment of the lab

Test

  • Testing requirements definition

  • Test plan and schedule

  • Application compatibility testing with Windows and restrictive security settings

User Experience

  • Usage scenarios

  • User requirements

  • Localization/accessibility requirements

  • User documentation

  • Training plans

  • Schedules

Release Management

  • Operations requirements

  • Pilot and deployment plan/schedule

  • Network discovery

  • Application and hardware inventory

  • Communication with information technology (IT) Operations and the Security feature team

Desktop Security Planning Considerations

During the Planning Phase, the Security feature team analyzes desktop security risks and determines the most cost-effective way to mitigate significant vulnerabilities. Many high-level and low-level decisions must be made about how security will be applied to the new computer images.

After the Security feature team has made these decisions, it must then consider security requirements that vary from the default Windows security settings and make the necessary changes to the client image security settings or the Group Policy objects. Then, it must consider application security and choose an update methodology that IT Operations will maintain.

Client Security in the Solution Accelerator for BDD 2007

Windows provides hundreds of security settings that Security feature team members can review and adjust to meet the organization’s needs. To simplify this decision-making process, choose a starting point, and then modify those baseline settings as necessary. Microsoft provides the following baselines:

  • Default Configuration. In this grouping, the Windows image is essentially unchanged. It is configured with the same features and security settings that are provided when Windows is installed from the original media.

  • Enterprise Client (EC). In this grouping, security policies are applied that are more restrictive than the default Windows configuration; these policies are targeted at a typical corporate enterprise computer. This option focuses on securing the computer but allows more user functionality than the higher-security options. Within the EC settings, the Windows XP Security Guide and Windows Vista Security Guide provide both laptop and desktop configurations. These settings best suit most BDD 2007 users.

  • Specialized Security – Limited Functionality (SSLF). In this grouping, security policies are applied that are the most restrictive of the three options; these policies are targeted at computers that have stringent security requirements. This option focuses on securing the computer and favors security over user functionality. Using this scenario requires significant compromises: while security is increased, engineering time increases and usability decreases. Within the SSLF settings, the Windows XP Security Guide and the Windows Vista Security Guide provide both laptop and desktop configurations.

By default, BDD 2007 enables the default configuration. For information about obtaining the files to apply EC or SSLF Group Policy settings in an Active Directory environment, refer to the Windows XP Security Guide at http://www.microsoft.com/technet/security/prodtech/winclnt/secwinxp/xpsgch05.mspx or the Windows Vista Security Guide at http://go.microsoft.com/?linkid=5637271.

The Security feature team has responsibility for analyzing the security requirements for the organization and recommending to the project management team which options should be applied. The final decisions will be included in the functional specification document.

Risk analysis includes an ongoing component, too. During the Planning Phase and throughout the entire project, the Security feature team participates in all review meetings and performs an ongoing threat analysis and awareness program to ensure that appropriate team roles are aware of existing or potential security issues in their areas of responsibility.

Planning Computer Roles for Security Policies

Different client computer roles may require different security settings. For example, developers may require more privileges than Standard users. Specialty implementations such as kiosk computers may require heavily restricted computer settings. For detailed information about computer roles, refer to the Windows Vista Security Guide at http://go.microsoft.com/?linkid=5637271 and the Windows XP Security Guide at http://www.microsoft.com/technet/security/prodtech/winclnt/secwinxp/xpsgch05.mspx.

Planning System Security Settings

The security of client computers is determined by the collection of thousands of different settings that directly or indirectly affect the security of the operating system and applications. Determining these configuration settings can be a daunting task. Instead of considering each security setting individually, start with one of the previously described options included with BDD 2007.
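The baseline-first approach described above can be sketched as a simple merge: start from one of the published baselines and layer only the organization-specific deltas on top. A minimal sketch, assuming illustrative setting names (these are placeholders, not actual Group Policy setting names):

```python
# Sketch: derive an organization's effective security settings by starting
# from a published baseline and applying only the approved deltas.
# Setting names are illustrative placeholders.

ENTERPRISE_CLIENT_BASELINE = {
    "MinimumPasswordLength": 8,
    "AuditLogonEvents": "Success, Failure",
    "GuestAccountStatus": "Disabled",
}

def effective_settings(baseline, org_overrides):
    """Return the baseline with organization-specific overrides applied."""
    merged = dict(baseline)          # never mutate the shared baseline
    merged.update(org_overrides)     # organization decisions win
    return merged

settings = effective_settings(
    ENTERPRISE_CLIENT_BASELINE,
    {"MinimumPasswordLength": 12},   # this organization requires longer passwords
)
```

Documenting the overrides separately from the baseline keeps the functional specification short: only the deltas need review and justification.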

None of these options may exactly meet the organization’s security requirements, however. To fine-tune these settings, consider the security requirements that might be unique for the organization in the following categories:

  • Active Directory

  • User accounts

  • Group memberships and limited users

  • Password settings

  • File permissions

  • Registry permissions

  • Service permissions

  • Event log and auditing settings

  • User rights settings

  • Security options

The sections that follow provide conceptual information about each of these categories. If, while reviewing these sections, Security feature team members identify exceptional security requirements that the organization may have in a category, they should investigate the relevant settings in more detail. For detailed information about specific security settings, refer to the Windows XP Security Guide at http://go.microsoft.com/fwlink/?LinkId=14839 or the Windows Vista Security Guide at http://go.microsoft.com/?linkid=5637271.

Note   In addition to the considerations described in this section, consider confidentiality when redeploying previously used computers. Using tools such as Cipher.exe, which is included with Windows XP SP2 and Windows Vista, Security feature team members can permanently delete all data on a hard disk. For instructions on using Cipher.exe, read “Appendix A. Using Cipher.exe to Wipe a Used Hard Disk Clean.”

Active Directory

The first decision to consider is whether the computers being deployed will be part of an Active Directory domain or whether an alternative approach will be used. If the computers will be joining an Active Directory domain—that is, either a Microsoft Windows 2000 domain or a Microsoft Windows Server 2003 domain—the Security feature team can use GPOs and Active Directory to centrally define, distribute, and manage the security settings for the Windows computers.

Being able to take advantage of GPOs for security can dramatically reduce ongoing security costs. Even if teams deploy desktops with hardened settings that meet the organization’s security requirements, those settings can degrade over time. For example, a new administrator troubleshooting an application problem might add a Standard user to the local Administrators group. Without GPOs or another form of security configuration management, the user will continue to have elevated privileges indefinitely and will be more vulnerable to attacks from worms and viruses. In an environment that uses GPOs, restricted groups can be used to ensure that users do not receive elevated permissions, thereby enforcing the security settings determined during the security risk management process throughout the lifetime of the computer.

User Accounts

By default, the Windows operating system includes several user accounts. The most significant of these are the Guest and Administrator accounts. By default, the Guest account is disabled, and it should always remain that way. Windows might use other accounts for optional components, such as Microsoft Internet Information Services (IIS). In environments that use Active Directory, user accounts should always be stored within Active Directory. No additional local user accounts are usually necessary, and adding accounts increases security risks.

For detailed information on planning user accounts for both Windows XP and Windows Vista, see “Managing Authorization and Access Control” in the Windows XP Professional Resource Kit at http://www.microsoft.com/technet/prodtechnol/winxppro/reskit/c17621675.mspx.

Group Memberships and Limited Users

Most user privileges are assigned to groups, and the Security feature team makes users members of groups to grant them permissions. Windows has several built-in groups, but the most significant are:

  • Administrators

  • Backup Operators

  • Guests

  • Power Users

  • Remote Desktop Users

  • Users

Many of the most critical desktop security decisions involve deciding which users will be made members of these groups. The single most critical decision is whether to add end users to the Administrators, Power Users, or Users group. The Administrators group gives users complete control over their local computers; as a result, they will never have permissions-related problems when installing applications, running software, or configuring their computers. (In Windows Vista, Admin Approval Mode, which is enabled by default, still prompts administrators for confirmation before allowing administrative changes.) Administrators group membership increases security risks, however, because even Admin Approval Mode may not prevent viruses, worms, spyware, and other types of malware from infecting the computers. Additionally, users might install software, such as games, that can reduce their productivity or decrease the reliability of their computers.

In contrast, adding end users to only the Users group reduces security risks by providing an extra layer of protection against malware. Additionally, it is much more difficult for users in this group to install unwanted applications that might reduce their productivity or affect the reliability of their computers. The disadvantage of limiting user privileges is increased engineering time: the IT department must spend more time testing applications to ensure that they run properly with restricted permissions. This is particularly challenging with legacy applications, which were typically designed to be run by members of the Administrators group. The Windows Vista UAC feature significantly reduces this cost by virtualizing sections of the file system and registry to enable uncertified applications that require administrative privileges to run successfully.

For detailed information on planning Group memberships for both Windows XP and Windows Vista, see “Managing Authorization and Access Control” in the Windows XP Professional Resource Kit at http://www.microsoft.com/technet/prodtechnol/winxppro/reskit/c17621675.mspx.

In Active Directory environments, local groups are also used to assign privileges to Active Directory groups. For example, the Active Directory Domain Administrators group is automatically added to the local Administrators group when a computer running Windows joins a domain. Carefully grant domain groups, such as support center operators and backup operators, permissions to local computers by adding them to the appropriate local groups. To help enforce these group memberships, use the Restricted Group policy in Active Directory environments. For more information about restricted groups, read the Microsoft Knowledge Base article, “Description of Group Policy Restricted Groups,” at http://support.microsoft.com/Default.aspx?kbid=279301. Also, consider using a tool such as Microsoft Systems Management Server (SMS) or Microsoft Operations Manager (MOM) to monitor and detect group membership changes anywhere in the organization.
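The drift a Restricted Groups policy guards against can also be checked directly: compare each computer's actual local group membership against the approved list. A minimal sketch, assuming hypothetical group and account names:

```python
# Sketch: detect local group membership drift against the approved list
# that a Restricted Groups policy would enforce. All names are illustrative.

def membership_drift(approved, actual):
    """Return (unexpected, missing) members for a local group."""
    approved, actual = set(approved), set(actual)
    return sorted(actual - approved), sorted(approved - actual)

# Example: a troubleshooting admin added a standard user to Administrators.
unexpected, missing = membership_drift(
    approved=["CORP\\Domain Admins", "Administrator"],
    actual=["CORP\\Domain Admins", "Administrator", "CORP\\jsmith"],
)
```

A monitoring tool such as SMS or MOM would perform the collection step; the comparison itself is as simple as the set arithmetic above.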

Note   The Application Compatibility Toolkit (ACT) is a valuable resource for preparing applications for protected user accounts to use. To download the ACT, see Windows Application Compatibility at http://www.microsoft.com/windowsserver2003/compatible/appcompat.mspx. For more detailed information about running applications within protected user accounts, refer to the Application Compatibility Feature Team Guide.

Password Settings

Although Windows and the Active Directory infrastructure support flexible authentication methods, such as smart cards and biometrics, passwords are the default authentication mechanism. All user accounts must have passwords, including local user and domain accounts. Although the default Windows Server 2003 Active Directory password settings provide an excellent compromise between security and usability, Security feature team members may have to modify one or more of the password requirement properties:

  • Password length. Longer passwords are harder for attackers to guess or crack but are easier to mistype and harder for users to remember. Security risks decrease with longer passwords, but account management costs increase.

  • Password complexity. Requiring complex passwords prevents users from using passwords such as first names, which are easily cracked with password dictionaries. However, complex passwords are much harder for users to remember.

  • Frequency of password changes. It can take months for an attacker to successfully guess a user’s password. Changing passwords regularly significantly reduces the risk of an attacker successfully guessing a password. In addition, if an attacker does guess a password, changing the password can prevent him or her from reentering the system. However, password changes tend to annoy users, and users are more likely to forget passwords if they regularly change them, which increases the number of help desk calls. Even worse, frequently changing passwords can cause users to write down passwords, which increases the chance that an attacker can more easily compromise the password. Best practices call for changing passwords, but weigh the reduced security risks against the costs to user productivity.

In Windows Server 2003 Active Directory environments, the default password settings are sufficient for most environments. To reduce the risk of the local Administrator account password being compromised, change the name of the account, and change the password on all computers regularly. Non-Microsoft tools, such as Foghorn Security’s Local Account Password Manager (LAPM), can simplify the management of local passwords.
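The complexity requirement discussed above can be approximated in a few lines: a minimum length, the password must not contain the account name, and at least three of four character categories must appear. This is a simplified sketch of the rule, not the exact Windows algorithm:

```python
# Sketch: approximate the Windows "password must meet complexity
# requirements" rule. Simplified for illustration; not the exact algorithm.

def meets_complexity(password, account_name, min_length=8):
    """Check minimum length, absence of the account name, and use of
    at least three of four character categories."""
    if len(password) < min_length:
        return False
    if account_name and account_name.lower() in password.lower():
        return False
    categories = [
        any(c.isupper() for c in password),   # uppercase letters
        any(c.islower() for c in password),   # lowercase letters
        any(c.isdigit() for c in password),   # digits
        any(not c.isalnum() for c in password),  # symbols
    ]
    return sum(categories) >= 3
```

Walking through examples like this with the help desk clarifies why some intuitive passwords (for example, one built around the user's own name) will be rejected.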

File Permissions

File and folder permissions enable users to restrict access to content stored on NTFS file system volumes. Security feature team members can grant access to open, edit, or delete files and folders. Files and folders also have the concept of ownership: the user who creates a file or folder is the owner of that object and, by default, has the ability to specify other users’ level of access to the file or folder.

File and folder permissions are the most important way to protect confidential data and system integrity. Users often store confidential documents on their computers, so ensure that the default file permissions assigned to new documents (most likely located in the My Documents folder) are restrictive enough so that other users cannot access them across the network. Also, use file and folder permissions to prevent users from modifying important system and application files. By granting a user only Read access to system files, Security feature team members effectively prevent him or her from applying unapproved updates or installing many types of malicious software.

It is important to understand that file and folder permissions have a significant weakness: They are only effective when the operating system is running. Particularly with portable computers, assess the risk of an attacker's gaining physical access to a computer. With physical access, an attacker can directly access a computer’s hard disk and bypass file permissions, regardless of whether the attacker has valid user credentials. To protect against this type of offline attack, use physical security, BitLocker Drive Encryption, and the EFS. For more information about BitLocker Drive Encryption, refer to “BitLocker Drive Encryption” earlier in this document.

Registry Permissions

The registry is primarily used to store configuration information for the operating system and applications. Although the registry is rarely used to store confidential data, restricting registry permissions is important for protecting the integrity of the system. Attackers who modify specific registry keys might be able to capture user passwords, gain elevated privileges, or render a computer inoperable.

The registry is also the only way to configure several important client security settings. For example, by default, the DeadGWDetectDefault registry entry (located within the HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters subkey) is set to 1 for Windows clients. This setting enables clients to automatically switch to a working router if the default router fails. However, there is a theoretical risk that a very sophisticated hacker could abuse this feature to redirect a computer’s network traffic. Therefore, consider changing this value to 0.
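Rather than editing each image's registry by hand, the team can generate a reviewable .reg fragment for settings such as the one above. A sketch, using the key path from the preceding paragraph (the generator itself is illustrative):

```python
# Sketch: emit a Windows Registration Entries (.reg) fragment that sets
# DeadGWDetectDefault to 0, suitable for review before import into an image.

def reg_fragment(key, dword_values):
    """Build the text of a .reg file setting REG_DWORD values under one key."""
    lines = ["Windows Registry Editor Version 5.00", "", f"[{key}]"]
    for name, value in dword_values.items():
        lines.append(f'"{name}"=dword:{value:08x}')  # dwords are 8 hex digits
    return "\n".join(lines) + "\n"

fragment = reg_fragment(
    r"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters",
    {"DeadGWDetectDefault": 0},
)
```

Keeping such fragments in source control gives the Security feature team an auditable record of every registry deviation from the baseline.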

For a complete discussion of registry settings, refer to the Microsoft TechNet document, “Threats and Countermeasures,” at http://www.microsoft.com/technet/security/topics/serversecurity/tcg/tcgch00.mspx, the Windows XP Security Guide at http://go.microsoft.com/fwlink/?linkid=14840, and the Windows Vista Security Guide at http://go.microsoft.com/?linkid=5637271.

Service Permissions

Services run in the background to perform operating system tasks. Services are one of the key differences between Windows XP and Windows Vista. In Windows XP, services often use the Local System account and have unlimited access to system resources. Unfortunately, attackers have historically targeted services to gain elevated privileges on a computer. To reduce that risk, configure services to log on with restricted user accounts, as shown in Figure 5. However, this requires extensive testing to ensure the configured service account has sufficient permissions to enable the service to function correctly.

Figure 5. Configuring service permissions

In Windows Vista, services are configured with minimal privileges by default, and Windows Service Hardening provides an additional layer of protection to reduce the risk that a service will be exploited. Therefore, manually restricting service permissions is more important for Windows XP computers than it is for Windows Vista computers.

The Security feature team can also use Group Policy settings to control which users and groups have permission to start, stop, and pause services. Most environments need not edit the default service permissions.

Event Log and Auditing Settings

Operations groups often use security auditing and event logs to help maintain security after deployment. When enabled, security auditing adds events to the local event log whenever a user successfully or unsuccessfully attempts to perform the following types of actions:

  • Add user accounts

  • Change security settings

  • Modify user privileges

Theoretically, the Security feature team can use this information to track successful and unsuccessful attacks. In practice, the vast majority of security events are not related to security attacks, which reduces the usefulness of the audit logs. Organizations that require auditing can build on the security auditing built into Windows to fulfill this requirement. However, the most practical way to identify and trace attacks is to use non-Microsoft intrusion-detection software. Most environments need not edit the default event log and auditing settings.
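One way to make the audit log more useful is to filter exported records down to the account-management events listed above before review. A minimal sketch; the event IDs and record shape are illustrative placeholders, not a fixed schema:

```python
# Sketch: reduce audit noise by keeping only the account-management events
# the team cares about. Event IDs and record fields are illustrative.

INTERESTING_EVENT_IDS = {624, 632, 636}   # placeholder account-change IDs

def security_relevant(records):
    """Keep only records whose event ID is on the watch list."""
    return [r for r in records if r["event_id"] in INTERESTING_EVENT_IDS]

records = [
    {"event_id": 528, "detail": "successful logon"},
    {"event_id": 624, "detail": "user account created"},
]
```

Even with filtering, this remains a supplement to, not a replacement for, dedicated intrusion-detection software.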

User Rights Settings

User rights control which actions specific users and groups can perform on a computer. The following are examples of frequently configured user rights:

  • Allow Log On Locally

  • Debug Programs

  • Profile System Performance

  • Shut Down the System

By default, different groups (such as the local Administrators and Users groups) have different user rights. For example, Administrators, Backup Operators, and Users all have the Shut Down The System user right. However, only members of the Administrators group have the Debug Programs user right, because it can be exploited to gain elevated privileges.

If the Security feature team grants users limited privileges by not adding them to the Administrators group, the team may have to grant additional user rights to enable specific applications to run or to enable users to perform specific tasks. For example, developers typically require the Debug Programs user right. The team will identify additional user rights requirements when testing applications on the desktop platform. After deployment, users may complain of problems using nonstandard applications, and administrators may have to adjust user rights on a user-by-user basis to enable these applications to run properly.

Security Options

Security options are a wide range of settings that do not fit into other categories. For example, the following are security options in Windows:

  • Accounts: Administrator account status. Use this security option to disable the local Administrator account. Doing so increases security by reducing the risk that an attacker will abuse the account. However, it makes it more difficult to repair a computer that cannot connect to an Active Directory domain, because the only way to access the computer using the local Administrator account will be to launch Recovery Console.

  • Accounts: Rename Administrator account. Use this security setting to change the name of the default account from Administrator to something different. Doing so makes it more difficult for attackers to use password-cracking techniques to authenticate, because they have to guess both the user name and the password.

  • Shutdown: Clear virtual memory pagefile. The pagefile often contains confidential information because it includes a copy of parts of the computer’s memory. Erasing the pagefile during shutdown reduces the risk of an attacker's gaining access to confidential information by accessing the contents of the pagefile directly. However, doing so slows shutdown and startup.

  • Interactive logon: Do not display last user name. Although displaying the user name of the last logged-on user can save users a few moments while logging on, it can also reveal user names to attackers, which makes password-guessing attacks much more effective.

Planning User Account Control

During UAC planning for Windows Vista computers (Windows XP lacks this feature), the Security feature team collaborates with the Application Compatibility feature team. The Security feature team specifies UAC settings, and then the Application Compatibility feature team determines the best way to run each application in the restricted environment.

For detailed information about planning UAC, read Chapter 2, “Defending Against Malware,” of the Windows Vista Security Guide at http://go.microsoft.com/?linkid=5637271.

Planning Windows Firewall

Windows Firewall offers significant protection from worms, viruses, and other malware that attacks across a network. However, because Windows Firewall can filter all network communications, it has the potential to significantly disrupt legitimate applications. With the secure-by-default settings in Windows XP SP2 and Windows Vista, Windows Firewall will almost certainly cause one or more of any given organization’s useful applications to fail.

To prevent useful applications from failing while meeting the organization’s requirements for packet filtering and auditing, consider the following categories of Windows Firewall configuration settings:

  • Firewall profiles

  • Firewall exceptions

  • Firewall logging

The sections that follow provide conceptual information about each of these categories. If, while reviewing these sections, the Security feature team identifies exceptional security requirements that the organization may have in a category, the team should investigate the relevant settings in more detail. For detailed information about specific security settings, see the Windows XP Security Guide at http://go.microsoft.com/fwlink/?LinkId=14839 and the Windows Vista Security Guide at http://go.microsoft.com/?linkid=5637271.

Firewall Profiles

By default, Windows Firewall includes two profiles:

  • Domain. Applies when a computer is connected to its corporate domain

  • Standard. Applies when a computer is not connected to its corporate domain

By having two profiles, Security feature team members can configure different policies and exceptions depending on whether the computer is connected to the domain. This is extremely useful for portable computers. For example, when a portable computer is connected to the domain, enable an exception that allows network management traffic through. To reduce the risk of this traffic being used in an attack when the computer is connected to a home network or wireless hotspot, simply do not add the exception to the Standard profile.

Specify exceptions and firewall settings on a per-profile basis. Per-profile settings that can be configured include:

  • Blocking or allowing inbound connections by default. By default, inbound connections are blocked for both profiles.

  • Blocking or allowing outbound connections by default (in Windows Vista only). By default, outbound connections are allowed for both profiles.

  • Allow local Administrators to create exceptions (in Windows Vista only).

  • Allow local Administrators to create computer connection security rules (in Windows Vista only).

  • Notify user when inbound connections are blocked.

  • Allow unicast response to multicast or broadcast requests.

  • Whether to log dropped packets and successful connections, and the location of the log file.

  • IPSec settings, including key exchange, data protection, and authentication method (in Windows Vista only).
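The per-profile defaults above can be modeled so that a planned configuration is easy to diff against what Windows ships with. A sketch, with field names that are illustrative rather than actual Group Policy setting names:

```python
# Sketch: model the per-profile firewall defaults so planned configurations
# can be diffed against them. Field names are illustrative.

from dataclasses import dataclass

@dataclass
class FirewallProfile:
    name: str
    inbound_blocked: bool = True      # default for both profiles
    outbound_allowed: bool = True     # outbound filtering: Windows Vista only
    log_dropped_packets: bool = False

domain = FirewallProfile("Domain")

def nondefault_settings(profile):
    """Report which settings deviate from the shipped defaults."""
    defaults = FirewallProfile(profile.name)
    fields = ("inbound_blocked", "outbound_allowed", "log_dropped_packets")
    return {
        f: getattr(profile, f)
        for f in fields
        if getattr(profile, f) != getattr(defaults, f)
    }
```

Recording only the non-default values per profile keeps the functional specification focused on the deliberate decisions, such as exceptions added to the Domain profile but not the Standard profile.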

Firewall Exceptions

Firewall exceptions allow traffic through the firewall. Typically, exceptions are made for applications that must be able to accept inbound connections. Because outbound connections are allowed by default, applications that send traffic (which includes most modern applications) do not require a firewall exception.

The Application Compatibility feature team should determine the most restrictive exceptions that enable its applications to run properly. Typically, the team requests that a specific executable file be granted an exception so that it can accept inbound connections. Make note of this requirement as an acknowledged potential security risk: every application that is granted a firewall exception can potentially be used to compromise a computer.

To further limit the risk posed by application exceptions, use the following countermeasures:

  • Use UAC to run applications, especially those that require exceptions, with Standard user privileges. Doing so limits the damage caused in a successful compromise and may prevent attackers from making permanent, computer-wide changes.

  • Enable Windows Defender, and keep the signatures up-to-date.

  • Enable antivirus software, and keep the signatures up-to-date.

In addition, consider using port exceptions rather than program exceptions. Ports are numbers included with packets that uniquely identify the application that is sending or receiving the network communication. For example, Web servers receive traffic labeled with port 80, while Remote Desktop receives traffic labeled with port 3389.

When the Security feature team creates a program exception, that exception allows traffic on any port the program chooses to use to listen for incoming traffic. For example, an application might listen for incoming connections on three separate ports, but the way the program is used may require accepting connections on only one of them. Allowing only the single required port reduces security risks by keeping the unused ports blocked against network attacks.

Note, however, that port exceptions require additional engineering time compared to program exceptions. The Application Compatibility feature team must determine which port numbers an application uses. These numbers may be listed in the application’s documentation, or the team may need to use a protocol analyzer such as Network Monitor to determine the port numbers in use.

For more information about Network Monitor, refer to the Microsoft TechNet article, “HOW TO: Capture WAN Traffic with Network Monitor in Windows,” at http://support.microsoft.com/default.aspx?scid=kb;en-us;301989&sd=tech.
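The two exception styles can be compared side by side by generating the corresponding netsh commands. The netsh firewall context shown is the Windows XP SP2 syntax; verify the exact parameter names against the netsh documentation before use:

```python
# Sketch: generate netsh firewall commands for the two exception styles.
# Syntax shown is the Windows XP SP2 "netsh firewall" context; verify
# parameters before deployment.

def port_exception(protocol, port, name):
    """Open a single port: the narrower, lower-risk exception."""
    return f'netsh firewall add portopening protocol={protocol} port={port} name="{name}"'

def program_exception(path, name):
    """Allow a program to listen on any port it chooses: broader risk."""
    return f'netsh firewall add allowedprogram program="{path}" name="{name}" mode=ENABLE'

rdp_rule = port_exception("TCP", 3389, "Remote Desktop")
app_rule = program_exception(r"C:\Program Files\App\app.exe", "Line-of-Business App")
```

Generating the commands from a reviewed list of exceptions, rather than typing them ad hoc, keeps the deployed firewall configuration consistent with the functional specification.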

Firewall Logging

Windows Firewall is capable of logging successful connections and blocked connection attempts. While it seems obvious that any security-aware organization would collect and analyze this data, most organizations should only enable firewall logging during testing and troubleshooting.

While logging successful connections would provide a record of a successful network compromise, it would log at least hundreds of thousands of legitimate connections, too. Finding a compromise in such a large log file is, in practice, impossible. In addition, the time required to aggregate, archive, and analyze those log files is never the best use of a security team’s time.

Similarly, blocked connections contain a great deal of traffic that can be ignored. Worms generate much of this traffic and may continuously attempt to establish network connections to computers with random or semi-random IP addresses. A firewall log provides identification of the source IP address of these computers, but the volume of attacks on the Internet makes them impossible to follow up on. Therefore, in most circumstances, do not enable firewall logging on the base platform.

Firewall logging can be useful for creating a honeypot, however. A honeypot computer is placed on the network specifically to attract attacks. If Security feature team members enable logging on a single computer, they may be able to analyze the computer’s log files to identify other computers on the network from which potential attacks originate. Team members could then use this information to remove the malicious software from the attacking computers.
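As a sketch of the kind of honeypot log analysis described above, the following counts dropped connections per source address in a Windows Firewall log (pfirewall.log). The field order is assumed to follow the default header; check the #Fields line in an actual log before relying on the positions:

```python
# Sketch: rank the source addresses of dropped connections in a Windows
# Firewall log to identify computers on the network that may be running
# malicious software. Field positions assume the default log layout:
# date time action protocol src-ip dst-ip src-port dst-port ...
from collections import Counter

def top_blocked_sources(log_text, n=3):
    hits = Counter()
    for line in log_text.splitlines():
        if line.startswith("#") or not line.strip():
            continue                      # skip header and blank lines
        fields = line.split()
        if len(fields) >= 5 and fields[2] == "DROP":
            hits[fields[4]] += 1          # count by source IP address
    return hits.most_common(n)

sample = """\
#Fields: date time action protocol src-ip dst-ip src-port dst-port
2006-01-10 09:12:01 DROP TCP 10.0.0.3 10.0.0.1 4312 445
2006-01-10 09:12:02 DROP TCP 10.0.0.3 10.0.0.1 4313 139
2006-01-10 09:15:40 DROP TCP 10.0.0.7 10.0.0.1 1044 445
2006-01-10 09:16:05 OPEN TCP 10.0.0.1 10.0.0.9 1045 80
"""

print(top_blocked_sources(sample))  # [('10.0.0.3', 2), ('10.0.0.7', 1)]
```

The addresses at the top of the list are candidates for malicious-software cleanup, as the honeypot scenario describes.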

Planning Network Access Protection

Network Access Protection (NAP) has significant potential for keeping internal networks free of malicious software and for improving the reliability of security updates and signature updates on portable computers. NAP depends on an infrastructure based on the Microsoft Windows Server version code-named “Longhorn,” which is not currently available. Plan to take advantage of NAP after Windows Server “Longhorn” is released.

For more information on NAP, refer to the Microsoft NAP site at http://www.microsoft.com/technet/itsolutions/network/nap/default.mspx.

Planning Intrusion Detection

If the Security feature team chooses to implement an intrusion detection system, team members should work closely with the software or services provider to plan the maintenance and monitoring of the system. For detailed information about intrusion detection, read The Security Monitoring and Attack Detection Planning Guide at http://www.microsoft.com/technet/security/topics/auditingandmonitoring/securitymonitoring/default.mspx.

Planning Data Encryption

Data encryption reduces the risk of confidential information being accessed by unauthorized users. Encryption complements file permissions and auditing by providing protection even if an attacker manages to bypass operating system security, such as by removing a hard disk from a computer. Windows supports three different data encryption technologies:

  • RMS. RMS encrypts documents and requires users to retrieve a decryption key from the RMS infrastructure before they can open the document. This process provides centralized control even if the document leaves the organization’s infrastructure, such as when mailing a document to a partner off the network. Applications must be certified to work with RMS, and certified applications can enforce granular permissions that control whether a user can copy, print, or edit documents. Team members can use RMS only with documents created with applications that support the technology, such as Microsoft Office 2003 and later.

  • EFS. The EFS is a file encryption technology built into Windows. Unlike RMS, the EFS does not require applications to support it. The EFS works transparently with NTFS volumes to encrypt and decrypt specified files and folders as the user accesses them. The EFS cannot encrypt an entire volume and cannot protect the operating system itself.

  • BitLocker Drive Encryption. In Windows Vista, BitLocker Drive Encryption encrypts an entire volume, providing very strong data security. Because all system files are encrypted, it would be very difficult for an attacker with physical access to a computer to bypass BitLocker Drive Encryption to read, edit, or create files. For single-user computers, BitLocker Drive Encryption alone may be able to meet security requirements. Computers that are shared among multiple users and require files be protected from other users should use both BitLocker Drive Encryption and the EFS, which can encrypt files on a per-user basis.

Table 2 shows the data security scenarios that each technology supports.

Table 2. Data encryption and security scenarios

Scenario                                       RMS    EFS    BitLocker
Remote document policy enforcement              X
Protect content in transit                      X
Protect content during collaboration            X
Local multi-user file and folder protection            X
Remote file and folder protection                      X
Untrusted network admin                                X
Portable computer protection                                  X
Branch office computers                                       X
Local single-user file and folder protection                  X

For more information about RMS, visit the Windows Rights Management Services Web site at http://www.microsoft.com/rms. For more information about the EFS, read The Encrypting File System at http://www.microsoft.com/technet/security/topics/cryptographyetc/efs.mspx. For more information about BitLocker Drive Encryption, read Windows BitLocker Drive Encryption Step-by-Step Guide at http://www.microsoft.com/technet/windowsvista/library/c61f2a12-8ae6-4957-b031-97b4d762cf31.mspx.
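For teams that script their planning checklists, the mapping in Table 2 can also be expressed as a simple lookup. This sketch copies the scenario names and X marks from the table and is illustrative only:

```python
# Sketch: the scenario-to-technology mapping from Table 2 as a lookup.
# Scenario names and assignments are taken directly from the table.
SCENARIOS = {
    "Remote document policy enforcement": "RMS",
    "Protect content in transit": "RMS",
    "Protect content during collaboration": "RMS",
    "Local multi-user file and folder protection": "EFS",
    "Remote file and folder protection": "EFS",
    "Untrusted network admin": "EFS",
    "Portable computer protection": "BitLocker",
    "Branch office computers": "BitLocker",
    "Local single-user file and folder protection": "BitLocker",
}

def technologies_for(scenarios):
    """Return the set of technologies needed to cover the given scenarios."""
    return {SCENARIOS[s] for s in scenarios}

# A portable computer shared by several users needs both technologies.
print(sorted(technologies_for(["Portable computer protection",
                               "Local multi-user file and folder protection"])))
```

The usage example mirrors the guidance above that shared computers may need both BitLocker Drive Encryption and the EFS.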

Planning BitLocker Drive Encryption

BitLocker Drive Encryption, a feature new to Windows Vista, requires significant planning, because:

  • The Security feature team must purchase hardware with compatible TPM hardware (when available) or provide USB flash drives for all computers.

  • Lost keys or failed hardware can result in an unrecoverable hard disk if a recovery key has not been archived. In Active Directory domains, recovery keys can be stored within Active Directory.

The sections that follow discuss planning requirements in more detail.

Determining BitLocker Drive Encryption Requirements

Before starting BitLocker Drive Encryption deployment, define the security needs and data protection requirements of the computers in the organization. BitLocker Drive Encryption is a security feature that provides system integrity and data protection against physical attacks.

Identifying BitLocker Drive Encryption Scenarios

BitLocker Drive Encryption technology can help the organization meet various security requirements. For example, the organization could use BitLocker Drive Encryption for the following purposes:

  • To protect data on lost or stolen laptops, ensuring system integrity against offline software-based attacks

  • To lock a system when someone has tampered with it

  • To protect critical branch office computers and kiosks in the absence of adequate physical security

  • To enable data deletion and equipment recycling

  • To protect the corporate network against attacks from lost or stolen computers

Selecting Computers for BitLocker Drive Encryption

After identifying the security technologies that must be implemented to meet the needs of the organization, identify the categories of users and computers that will use BitLocker technologies. For example, the BitLocker Drive Encryption deployment might be based on job function, location, organizational structure, or a combination of the three.

For example, the Security feature team may want to implement TPM security on all portable computers in the organization to prevent unauthorized users from breaking Windows file and system protection on lost or stolen computers. Although the data on a particular portable computer may not be sensitive, if that computer is configured to connect to the corporate network, it can be used to gain access to the network, where sensitive data does exist.

In addition, the Security feature team could enable TPM protection on desktop computers that contain sensitive data and are used in the field or in busy, unsecured environments such as kiosks or retail branch offices. These computers’ users include executives, researchers, and other mission-critical users who have highly sensitive data on their computers.

Documenting Policies for Information Protection

Designing a BitLocker Drive Encryption infrastructure also involves developing support procedures and establishing a system of checks and balances for administrative authority. Only by effectively addressing both the technical and the administrative issues related to the BitLocker Drive Encryption infrastructure can the Security feature team ensure that services provide the level of security that the organization requires.

In general, the Security feature team is responsible for setting and maintaining data protection policies and practices. However, because of the legal, financial, and tactical uses of data protection, representatives from outside the IT department, such as human resources, finance, legal, and marketing, might also be involved in establishing information security policies.

A data protection policy is a set of rules that indicates the applicability of a data protection service to a particular group of clients that have common security requirements. Data protection policy statements generally include the following types of information:

  • Legal issues, such as liability, that might arise if the data is either compromised or used for something other than its intended purpose

  • Key management requirements

  • Whether the recovery key can be exported or archived

  • Minimum length for the key pairs

  • Cryptographic algorithms, cryptographic service providers (CSPs), and key lengths

Creating a Hardware and Software Specification

Windows Vista BitLocker Drive Encryption and the dependent TPM technology require either a TPM-equipped motherboard and basic input/output system (BIOS) extension as part of the computer hardware standard or a separate USB flash drive. TPM is the preferred technology, because it provides the best balance between data protection and usability. USB flash drives can be easy to lose (which would prevent users from accessing their computers). Additionally, users must store the USB flash drive separately from the computer (to prevent a thief from stealing both). Storing the flash drive separately further increases the risk of loss.

If the Security feature team opts to use TPM technology, the organization’s TPM system requirements could include the following:

  • Hardware:

    • TPM v1.2. The TPM enables platform integrity measurement and reporting. It requires chipset support for a secure TPM interface.

    • Conventional/Extensible Firmware Interface (EFI) BIOS–TCG compliant. A compliant BIOS establishes a chain of trust for the pre-operating system boot process. The BIOS must support the TCG-specified Static Root of Trust Measurement (SRTM). BitLocker Drive Encryption supports both conventional and EFI BIOS.

  • Software:

    • Windows Vista

Creating a BitLocker Drive Encryption Deployment Plan

Windows Vista BitLocker Drive Encryption technology has enterprise-wide deployment and management capabilities that the Windows Management Instrumentation (WMI) interface exposes. Control BitLocker Drive Encryption through the Windows Security Center, which requires local administrative privileges. Several Group Policy settings are specific to TPM management. These Group Policy settings enable or disable TPM functionality for computers in the organization.
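As an illustration of scripted, enterprise-wide status checking, the following sketch parses the output of the manage-bde status command (BitLocker’s command-line tool, shipped as a script in Windows Vista and as an executable in later Windows versions). The exact output wording is an assumption; verify it against real systems before deploying:

```python
# Sketch: check BitLocker protection status by parsing "manage-bde -status"
# output. The "Protection Status" wording below is assumed from typical
# output and should be verified on real systems.
import subprocess

def parse_protection_status(status_output):
    """Return True if the status text reports protection as on."""
    for line in status_output.splitlines():
        if "Protection Status" in line:
            return "Protection On" in line
    return False

def volume_protected(drive="C:"):
    # Runs only on a Windows computer with the BitLocker tools installed.
    out = subprocess.run(["manage-bde", "-status", drive],
                         capture_output=True, text=True).stdout
    return parse_protection_status(out)

sample = """\
Volume C: [OSDisk]
    Conversion Status:    Fully Encrypted
    Protection Status:    Protection On
"""
print(parse_protection_status(sample))  # True
```

The same checks could be driven through the WMI interface mentioned above instead of the command-line tool.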

Planning Recovery

Recovery planning is critical when using any type of encryption, because a lost encryption key almost certainly means lost data. Specifically, consider the following topics:

  • Recovery scenarios

  • Migration scenarios

  • Backup and restore

  • Hardware failures and changes

  • Data lockouts

  • Virus attacks

  • Operating system changes

Planning EFS

Like BitLocker Drive Encryption, the EFS uses encryption and therefore requires planning to ensure data is not lost in the event that a File Encryption Key (FEK) is lost. The sections that follow provide high-level information about planning to use EFS on client computers.

Determining EFS Requirements

EFS technology can help the organization meet various security requirements. For example, use EFS to protect:

  • Mobile data (removable media, Web-based Distributed Authoring and Versioning [WebDAV]).

  • Data against untrusted users or administrators.

  • Remote files and folders.

  • Sensitive user data on shared computers.

  • Sensitive roaming documents on enterprise file shares.

  • Documents shared among small work groups.

For Windows Vista EFS features, refer to Windows Vista Security and Data Protection Improvements at http://www.microsoft.com/technet/windowsvista/evaluate/feat/secfeat.mspx.

Encrypting the My Documents folder (%USERPROFILE%\My Documents) is recommended for all portable computers. Doing so ensures that the personal folders in which most Microsoft Office documents are saved are encrypted by default.

Encrypt folders rather than individual files. Applications work on files in various ways; for example, some applications create temporary files in the same folder during editing and substitute them for the original when the edit is saved. Encrypting at the folder level ensures that these temporary files are encrypted as well, rather than being written to disk unprotected.

The Security feature team can also use EFS to protect sensitive roaming documents on enterprise file shares. For detailed information on EFS planning, refer to the EFS section in the Windows XP Professional Resource Kit at http://www.microsoft.com/resources/documentation/Windows/XP/all/reskit/en-us/prba_dwp_jzyf.asp.
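The folder-level recommendation above can be applied from a script by using cipher.exe, the command-line front end to EFS; /e encrypts, and /s applies the change to the given folder and everything beneath it. A minimal sketch (paths are illustrative, and the command must run under the account whose files are being encrypted):

```python
# Sketch: encrypt a user's My Documents folder with cipher.exe.
# "cipher /e /s:<folder>" marks the folder and its subfolders so that
# files placed in them are encrypted. Paths here are illustrative.
import os
import subprocess

def cipher_encrypt_command(folder):
    """Build the cipher.exe command line for encrypting a folder tree."""
    return ["cipher", "/e", f"/s:{folder}"]

def encrypt_my_documents():
    # Runs only on Windows, under the user whose files are encrypted.
    folder = os.path.join(os.environ["USERPROFILE"], "My Documents")
    subprocess.run(cipher_encrypt_command(folder), check=True)

print(cipher_encrypt_command(r"D:\Data"))  # ['cipher', '/e', '/s:D:\\Data']
```

Building the command as a list (rather than a shell string) avoids quoting problems with paths that contain spaces, such as My Documents.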

Creating an EFS Deployment Plan

All EFS users must have valid certificates for use with EFS. EFS can create and self-sign these certificates if no certification authorities (CAs) are available. Users can also request certificates from an enterprise CA. Domain environments can be configured so that EFS works just as it does in a stand-alone environment, creating self-signed certificates for users. Enterprise CAs can also be configured in the domain to create certificates for users. Windows Vista supports EFS certificates on smart cards to enhance key protection and security.

For detailed information on EFS deployment planning, refer to the EFS section in the Windows XP Professional Resource Kit at http://www.microsoft.com/resources/documentation/Windows/XP/all/reskit/en-us/prba_dwp_jzyf.asp.

Creating an EFS Recovery Plan

Encrypting a file always includes a risk that it cannot be read again. The owner of the private key, without which a file cannot be decrypted, might leave the organization without decrypting all of his or her files. Worse yet, he or she might intentionally or accidentally encrypt critical shared files so that no one else can use them. A user’s profile might be damaged or deleted, meaning that the user no longer has the private key needed to decrypt the file’s FEK. Because losing data is often disastrous, several methods of recovering data are available when encrypted files cannot be decrypted: data recovery agents, export and import of EFS recovery keys, and recovering backups.

For detailed information on EFS recovery planning, refer to the EFS section in the Windows XP Professional Resource Kit at http://www.microsoft.com/resources/documentation/Windows/XP/all/reskit/en-us/prba_dwp_jzyf.asp.

Planning RMS

Every client computer that will participate in the RMS system must be set up so that it is established as a trusted entity within the RMS system. Client computer setup consists of verifying the presence of the Windows Rights Management client component and activating the client computer. After a client computer is set up, the infrastructure is in place to permit users with RMS-enabled applications to publish and consume rights-protected information.

Each client computer must have the Windows Rights Management client component installed. This component is built into Windows Vista and is available from the Microsoft Update Catalog or the Microsoft Download Web site for Windows XP.

Organizations can use standard software deployment tools such as SMS to ensure that their client computers have the component installed. This component is required by RMS-enabled applications and is used for the client computer activation process. RMS also requires a server infrastructure. The server infrastructure tracks encryption keys and authorized users and distributes keys as required.

For detailed information about planning RMS, refer to the Microsoft RMS Web site at http://www.microsoft.com/windowsserver2003/technologies/rightsmgmt. For information about disaster recovery, read “Disaster Recovery for Windows Rights Management Services” at http://www.microsoft.com/windowsserver2003/techinfo/overview/rmsrecovery.mspx.

Restricting the Use of Removable Storage Devices

The capacity and convenience of portable removable storage devices such as USB flash memory continue to increase. From the point of view of information leaks, however, these improvements present a problem: they make it easier to copy personal and confidential information and walk away with it. To prevent users from installing such devices on Windows Vista, configure Group Policy settings to allow or deny installation of specific device IDs or device classes or to deny installation of removable devices. These Group Policy settings are located under Computer Configuration\Administrative Templates\Classic Administrative Templates (ADM)\System\Device Installation\Device Installation Restrictions.
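For reference, these policies are backed by registry values under the DeviceInstall\Restrictions policy key. The following sketch models the values an administrator might set; the value names follow the documented policy-to-registry mapping, but verify them against the Group Policy reference before scripting against them:

```python
# Sketch: registry values behind the Device Installation Restrictions
# policies. "DenyRemovableDevices" (DWORD, 1 = enabled) blocks
# installation of removable devices; specific hardware IDs to block are
# listed as numbered string values under a "DenyDeviceIDs" subkey.
# Value names should be verified against the policy documentation.

RESTRICTIONS_KEY = r"SOFTWARE\Policies\Microsoft\Windows\DeviceInstall\Restrictions"

def removable_device_policy(deny_removable, denied_ids=()):
    """Model the registry values for a device installation policy."""
    policy = {RESTRICTIONS_KEY: {"DenyRemovableDevices": int(deny_removable)}}
    if denied_ids:
        policy[RESTRICTIONS_KEY]["DenyDeviceIDs"] = 1
        # Hardware IDs become numbered values under the DenyDeviceIDs subkey.
        policy[RESTRICTIONS_KEY + r"\DenyDeviceIDs"] = {
            str(i + 1): hw_id for i, hw_id in enumerate(denied_ids)}
    return policy

policy = removable_device_policy(True, [r"USBSTOR\Disk"])
print(policy[RESTRICTIONS_KEY]["DenyRemovableDevices"])  # 1
```

In practice these values are set centrally through Group Policy rather than written directly to each computer’s registry.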

For more information, read the Microsoft TechNet document, “Step-By-Step Guide to Controlling Device Installation and Usage with Group Policy,” at http://www.microsoft.com/technet/windowsvista/library/9fe5bf05-a4a9-44e2-a0c3-b4b4eaaa37f3.mspx.

Planning Internet Explorer Security

Because Web browsers enable users to actively exchange important information and programs through the Internet and the intranet, consider the security requirements needed to protect users’ privacy and the contents of their exchanges. Internet Explorer provides a variety of features to help users ensure the privacy of their information and the safety of their work environment. Security feature team members can pre-configure these security and privacy options as part of custom browser packages. When pre-configuring these settings, Security feature team members have the option of locking them down, thus preventing users from changing them.

Implement the following options, depending on users’ security and privacy needs:

  • Privacy preferences. Define privacy preferences for disclosing personal information to Web sites. These privacy settings are based on the Platform for Privacy Preferences 1.0 (P3P1.0) specification, which provides a way to control how Web sites that users visit use their personal information. When navigating to Web sites, Internet Explorer determines whether to disclose the user’s personal information based on the user’s privacy preferences and the site’s privacy policy information. Privacy preferences also determine whether Web sites can store cookies on the user’s computer.

  • Security zones. Use Internet Explorer security zones to divide the Internet and intranet into four groups of trusted and untrusted areas and to designate the particular safe and unsafe areas to which specific Web content belongs. This Web content can be any item, from a Hypertext Markup Language (HTML) or graphics file to an ActiveX control, a Java™ applet, or an executable program.

  • Security levels. After establishing zones of trust, set browser security levels for each zone. Then, control settings for ActiveX controls, downloading and installation, scripting, cookie management, password authentication, cross-frame security, and Microsoft virtual machine (VM) capabilities based on the zone to which a site belongs.

  • Digital certificates. To verify the identity of individuals and organizations on the Web and to ensure content integrity, Internet Explorer uses industry-standard digital certificates and Microsoft Authenticode® 2.0 technology. Together with security zones, use certificates to control user access to online content based on the type, source, and location of the content. For example, use security zones in conjunction with certificates to give users full access to Web content on the organization’s intranet but limit access to content from restricted Internet sites.

  • Content ratings. Use the Internet Explorer Content Advisor to control the types of content that users can access on the Internet. Adjust the content rating settings to reflect the appropriate content in four areas: language, nudity, sex, and violence. Also, control access by specifying individual Web sites as approved or disapproved for user viewing.

  • Permission-based security for Microsoft VM. Internet Explorer provides permission-based security for Microsoft VM with comprehensive management of the permissions granted to Java applets and libraries. Enhanced administrative options include fine-grained control over the capabilities granted to Java code, such as access to scratch space, local files, and network connections. Use these options to give an application some additional capabilities without offering it unlimited access to every system capability.

Planning Application Security

Applications add capabilities to an operating system, but they can also add vulnerabilities. For example, many viruses have penetrated internal networks by exploiting weaknesses in e-mail clients and Web browsers. When designing a client platform to meet security requirements, consider the applications that will run on that platform and evaluate the security strengths and weaknesses of competing applications. Specifically, consider the following factors:

  • Update methodology. Just as the Security feature team must plan to distribute operating system updates as Microsoft releases them to mitigate the risk of newly discovered vulnerabilities, the team must also plan to distribute updates for every application deployed. Any application can have security vulnerabilities. Discuss with potential software vendors how they release updates when vulnerabilities are discovered and which deployment methods are the best. The Security feature team must consider how it will integrate the software vendor’s update schedule with the organization’s update process. Microsoft releases updates for some applications, such as Internet Explorer and Microsoft Office programs, by using the same mechanisms used to release operating system updates.

  • Secure development methodology. Although any application can contain software vulnerabilities, many non-Microsoft software companies lack processes for auditing code for software vulnerabilities. Depending on the organization’s development methodologies, it may even be possible for a single developer to intentionally inject malicious code into an application without its ever being reviewed by upper management. Although Microsoft has processes in place to limit the occurrence of unintentional software vulnerabilities and malicious acts, discuss development methodology with any potential software vendor to ensure its practices meet security requirements. For more information, see Writing Secure Code at http://msdn.microsoft.com/security/securecode.

  • Manageability. The Security feature team might have to modify an application’s default configuration to make the application meet security requirements. The team can do this as part of the regular application deployment process, but it must also consider ongoing manageability of the application. If a newly discovered vulnerability forces the team to change a configuration setting in the application, how will team members update it on every client computer that runs the application? How will team members prevent users from changing settings that might reduce the security of an application? Team members can manage some applications, such as Internet Explorer and the Microsoft Office System, by using Group Policy settings, an ideal way to maintain application security configurations.

Note   When maintaining multiple versions of an application, Security feature team members may have to use multiple techniques to manage the settings for that application. For example, the team can use GPOs to specify policy settings for Internet Explorer if clients are using Windows XP or Windows Vista. For clients running earlier versions of the Windows operating system with Internet Explorer, the team may have to use the Windows Internet Explorer Administration Kit (IEAK), available at http://www.microsoft.com/technet/prodtechnol/ie/ieak, to maintain settings.

Planning Windows Defender

Windows Defender is a security technology that helps protect Windows users from spyware and other potentially unwanted software by detecting and removing known spyware on users’ computers. This tool helps reduce the negative effects that spyware causes, including slow computer performance, annoying pop-up ads, unwanted changes to Internet settings, and unauthorized use of private information.

Continuous protection improves Internet browsing safety by guarding against more than 50 ways spyware can enter users’ computers. Participants in the worldwide Microsoft SpyNet community play a key role in determining which suspicious programs are classified as spyware. Microsoft researchers quickly develop methods to counteract these threats, and updates are automatically downloaded to users’ computers so that they stay up-to-date. Although submitting this information to the SpyNet community carries little risk, it is possible for private information to be revealed. Weigh this potential risk against the benefits of participating in SpyNet.

To protect the organization’s desktop computers from spyware and other unwanted software, use Group Policy to enable the Windows Defender service on computers throughout the organization. For more information about Windows Defender, visit http://www.microsoft.com/athome/security/spyware/software. To distribute updated Windows Defender signatures, use Microsoft Windows Server Update Services (WSUS), as described in “Choosing an Update Methodology” later in this guide.

Non-Microsoft Security Applications

Even environments with minimal security requirements must add non-Microsoft software to their client computers. Address the following security needs, which Windows clients may not meet on their own:

  • Virus protection. Viruses are the most frequent threat on networks running the Windows operating system, and antivirus software must be running on every client computer, regardless of security requirements. If the organization has no budget to allocate to virus protection, find a free antivirus solution. Most enterprises require a sophisticated antivirus solution that gives administrators centralized control over the antivirus configuration and that automatically updates antivirus signatures. Visit Microsoft Antivirus Partners at http://www.microsoft.com/security/partners/antivirus.asp for a list.

  • Network backups. Backing up the data on client computers is critical for security and disaster recovery but challenging. Enterprises typically purchase non-Microsoft network backup software. Then, they install backup agents on all client computers and configure critical data (such as My Documents and the system state) to be backed up across the network every night to a central backup server. The backup server typically stores the data on multiple hard disks for quick restorations and copies data to removable backup tapes that can be shipped off-site to protect the data in the event of a fire or other catastrophe.

Choosing an Update Methodology

Plan to update every network component that uses software. Of these network components, client computers will be the most challenging to update, because enterprises have them in large numbers, because they run a variety of applications that require updating, and because they are frequently disconnected from the network.

To keep systems up-to-date

  1. Assemble an update team.

  2. Inventory all software in the organization.

    The Application Compatibility feature team will have already created a software inventory that the Security feature team can use as a foundation. Additionally, some of this information, including operating system and installed applications, can be gathered in an automated fashion by using such tools as SMS. For information about using SMS for software updates, see the Microsoft TechNet document, “Patch Management Using Systems Management Server 2003,” at http://www.microsoft.com/technet/itsolutions/cits/mo/swdist/pmsms/2003/pusmscg3.mspx. The Security feature team can also inventory Microsoft software on a computer by using the Microsoft Software Inventory Analyzer (MSIA). The MSIA is available for download at no charge from http://www.microsoft.com/resources/sam/msia.mspx. For instructions on using MSIA, read “Appendix D. Using the Microsoft Software Inventory Analyzer.” For information on non-Microsoft software inventory tools, see Find a Tool at http://www.microsoft.com/resources/sam/aspx/findtool.aspx.

  3. Contact each software vendor and determine its process of notifying customers of software updates.

    Some vendors send out update notifications by e-mail, whereas others require checking a Web site regularly.

  4. Assign individuals to identify software updates on a regular basis.

    For example, someone on the team should be responsible for checking every software vendor’s Web site for new updates at least weekly.

  5. Create an updating process for evaluating, retrieving, testing, installing, auditing, and removing updates.
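The updating process in step 5 can be modeled as a simple lifecycle for tracking each update’s progress. This is an illustrative sketch, not a Microsoft-defined workflow; stage names simply mirror the process described in the text:

```python
# Sketch: the update lifecycle from step 5 as a simple state machine.
# Stage names mirror the text; the class itself is illustrative only.

STAGES = ["evaluate", "retrieve", "test", "install", "audit"]

class UpdateRecord:
    def __init__(self, update_id):
        self.update_id = update_id
        self.stage = 0          # index into STAGES
        self.removed = False

    @property
    def status(self):
        return "removed" if self.removed else STAGES[self.stage]

    def advance(self):
        """Move the update to the next stage of the process."""
        if self.removed:
            raise ValueError("removed updates cannot advance")
        if self.stage < len(STAGES) - 1:
            self.stage += 1
        return self.status

    def remove(self):
        """Back the update out (for example, it broke an application in testing)."""
        self.removed = True

u = UpdateRecord("MS06-001")
u.advance()            # evaluate -> retrieve
u.advance()            # retrieve -> test
print(u.status)        # test
```

Tracking each update through named stages makes it easier to audit which updates stalled in testing and which reached every client computer.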

The sections that follow discuss three common update deployment technologies: Microsoft Update, WSUS, and SMS. Additionally, consider investigating DesktopStandard’s PolicyMaker™ Software Update—a non-Microsoft update management tool—at http://www.desktopstandard.com/PolicyMakerSoftwareUpdate.aspx.

Microsoft Update

Microsoft Update is a free Microsoft service for keeping computers running the Windows operating system up-to-date with the latest Microsoft operating system and application updates. Microsoft Update consists of three components: the Microsoft Update Web site, the Automatic Updates client, and the Microsoft Update Catalog. Millions of people use the Microsoft Update Web site (http://windowsupdate.microsoft.com) each week as a way to keep their Windows-based systems current. When a user connects to the Microsoft Update site, Microsoft Update evaluates the user’s computer to check which software updates and updated device drivers should be applied to keep the system secure and reliable. Microsoft Update is not recommended for enterprises, however, because it requires individual users to manage their own updates, and it does not use bandwidth efficiently.

The Microsoft Update Web site includes a catalog of all software update installation packages for download. These software update installation packages can then be stored on a CD, distributed, and installed through other means, such as SMS or non-Microsoft software distribution tools, or they can be used when installing new computers.

Windows Server Update Services

WSUS is a simplified version of Microsoft Update that the organization can host on its private network for critical updates and security updates. WSUS connects to the Microsoft Update site; downloads critical updates, security updates, and service packs; and adds them to a list of updates that require administrative approval. WSUS then notifies administrators by e-mail that new updates are available. After an administrator has approved and prioritized these updates, WSUS automatically makes them available to any computer running Automatic Updates. Automatic Updates (when properly configured) then checks the WSUS server and automatically downloads and installs updates as configured by the administrators. As shown in Figure 6, WSUS can be distributed across multiple servers and locations to scale to enterprise needs. WSUS meets the needs of medium-sized organizations and many enterprises.
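Clients are pointed at a WSUS server through Group Policy, which ultimately writes Windows Update policy values to the registry. The following sketch builds those documented values (WUServer, WUStatusServer, and UseWUServer); the server URL is an example:

```python
# Sketch: the registry-backed policy values that point Automatic Updates
# at an internal WSUS server. WUServer and WUStatusServer name the
# intranet server; UseWUServer = 1 tells clients to use it. The URL is
# an example only.

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

def wsus_client_policy(server_url):
    """Model the policy values for directing clients to a WSUS server."""
    return {
        POLICY_KEY: {
            "WUServer": server_url,        # update download source
            "WUStatusServer": server_url,  # where clients report status
        },
        POLICY_KEY + r"\AU": {
            "UseWUServer": 1,              # 1 = use the intranet server
        },
    }

policy = wsus_client_policy("http://wsus.example.com")
print(policy[POLICY_KEY]["WUServer"])  # http://wsus.example.com
```

In a domain, set these through a GPO rather than editing each computer’s registry; the sketch simply shows what the policy resolves to on the client.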

Figure 6. Typical enterprise WSUS architecture

For instructions on configuring WSUS, read “Appendix E: Configuring Windows Server Update Services.” For more information about update management with WSUS, see the Microsoft TechNet document, “Patch Management Using Microsoft Software Update Services” at http://www.microsoft.com/technet/itsolutions/cits/mo/swdist/pmsus/pmsus251.mspx. (Software Update Services, or SUS, is the previous version of WSUS, but most of the content is still applicable.)

Systems Management Server

SMS provides a variety of tools to help deploy updates to an enterprise. With the software distribution feature of SMS, the Security feature team can automatically update all SMS client computers in the organization with a new update. The team can allow users to run the update installation whenever they like, or the team can schedule the update installation to run at a specific time. The team can also schedule it to run on SMS client computers at a time when users are not logged on.

For more information about SMS, visit the SMS home page at http://www.microsoft.com/smserver. For information about using SMS for update management, see the Microsoft document, “Patch Management Using Systems Management Server 2003,” at http://www.microsoft.com/downloads/details.aspx?FamilyId=E9EAB1BD-13E7-4E25-85C5-CE2D191C3D63&displaylang=en.

Infrastructure and Deployment Security

Although this guide focuses on deploying desktops with the best possible security configurations, the security of the deployment infrastructure itself must also be considered. The sections that follow do not attempt to provide comprehensive coverage for protecting the deployment infrastructure. Instead, they are intended to alert Security feature team members to possible security risks so that the team can mitigate them as part of the security risk management process. Where possible, the sections refer readers to more detailed documentation.

For detailed information about protecting the deployment environment, refer to the following appendices:

  • “Appendix F: Restricting File Permissions on Deployment Servers”

  • “Appendix G: Configuring Security for Domain User Accounts”

  • “Appendix H: Using MBSA to Audit the Security Configuration of Deployment Servers”

  • “Appendix I: Using Port Scanning Tools to Audit the Security Configuration of Deployment Servers”

Protecting Deployment Staging Areas

Staging areas where images are created, updated, and maintained are a significant potential point of attack. First, because computers in the staging area (including computers that have not yet been updated and would not meet the organization’s security requirements) are likely to run with varying degrees of security, there is an elevated risk that they will be compromised. In particular, protect these computers from worms and viruses by placing them on an isolated network segment.

Using a perimeter firewall alone is not sufficient: Computers in the deployment staging area must be on a separate network that production computers cannot reach. If computers on the internal network can route traffic to the deployment staging area, there is a high risk that deployment staging servers will be infected with a worm. Worms are common on internal networks, because portable computers may become infected while connected to untrusted networks.

Second, because these images form the basis for all new computers in the organization, a compromised image can have a widespread effect and a very high cost. Use the security risk management process (discussed earlier in this document) to evaluate this risk and assign resources to mitigate any vulnerabilities.

Third, staging areas might contain credentials (user names and passwords) used to automatically authenticate computers during the setup process. Protect these credentials to reduce the risk of an attacker abusing them. For more information about protecting servers that host images and other infrastructure components in the staging area, refer to the Windows Server 2003 Security Guide at http://www.microsoft.com/technet/security/prodtech/windowsserver2003/W2003HG/SGCH00.mspx.

The Security feature team cannot completely eliminate the risk of a security compromise. Therefore, the team must plan to identify and track attacks. Security auditing, built into all recent versions of the Windows operating system, is a useful tool for recording user actions. For more information, see “Auditing Security Events Best Practices” at http://www.microsoft.com/technet/prodtechnol/windowsserver2003/library/ServerHelp/5658fae8-985f-48cc-b1bf-bd47dc210916.mspx in the Windows Server 2003 Help and Support Center.

Additionally, consider a non-Microsoft intrusion-detection system to meet the organization’s security requirements. For information about how Microsoft uses MOM and non-Microsoft tools for auditing, read the white paper, “Incident Response: Managing Security at Microsoft,” at http://www.microsoft.com/technet/itsolutions/msit/security/msirsec.mspx.

Protecting Production Deployment Servers

Deployment servers must be protected as well. Like the servers in the staging environment, deployment servers typically store configuration files, which might include user credentials. Protect these servers with physical controls: only authorized personnel should have physical access to production or development deployment servers, and even then it is better to prevent any single user from working on a server alone and instead require two administrators to act together.

Reduce the attack surface of the server by limiting the services that are running. For example, although the File Server role is probably necessary, definitely do not install the Application Server role on a deployment server. For the roles that are installed, including the File Server role, identify security guides, such as the Windows Server 2003 Security Guide at http://www.microsoft.com/technet/security/prodtech/windowsserver2003/W2003HG/SGCH00.mspx, that will help team members harden the deployment server to meet security requirements.

If possible, disallow remote logon entirely. If that is not possible, restrict remote access to the deployment server to a small group of trusted staff. Use network filtering (such as that included in Windows Firewall) to restrict network connections to those originating from clients on LANs.
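As an illustration of such filtering, an inbound Windows Firewall rule on a Windows Vista or Windows Server 2008 deployment server could limit access to the deployment file share (TCP port 445) to the internal client subnet. The rule name and the 10.0.0.0/8 address range below are placeholders; substitute the organization’s actual internal address ranges:

```
rem Allow SMB access to the deployment share only from internal clients.
rem Windows Firewall with Advanced Security blocks other inbound traffic by default.
netsh advfirewall firewall add rule name="Deployment share (internal clients only)" dir=in action=allow protocol=TCP localport=445 remoteip=10.0.0.0/8
```

The same restriction can also be applied centrally through Group Policy rather than per server.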

Again, consider reducing the risk of a single administrator making malicious changes or installing malicious software in operating-system images by requiring collaboration. One easy way to require collaboration is to configure deployment servers to allow only a handful of special user accounts to change images. For each of those user accounts, have two different administrators each type half of the user account’s password. To successfully authenticate and change an image, two administrators must work together. In this way, the Security feature team can significantly reduce the risk of a single disgruntled employee compromising the servers.

To reduce the risk of password-cracking attacks, consider implementing multifactor authentication by requiring a password to be used in conjunction with a smart card or biometric authentication (such as a fingerprint scanner). The Security feature team need not necessarily deploy multifactor authentication to an entire enterprise; the team can require multifactor authentication only for the most critical computers on the network, such as staging and production deployment servers.

At a minimum, use the Microsoft Baseline Security Analyzer (MBSA) to audit the security configuration of deployment servers. For instructions on using MBSA, read “Appendix H: Using MBSA to Audit the Security Configuration of Deployment Servers.” The MBSA is available for download from http://www.microsoft.com/technet/security/tools/mbsahome.mspx. The Microsoft Office Visio® 2003 Connector for MBSA, available at http://www.microsoft.com/technet/security/tools/mbsavisio.mspx, can help visualize the results of the audit. To further assess points of entry for network attacks, use a port scanning tool to identify the types of inbound connections servers are listening for. For information on how to assess the results of a port scanning tool, read “Appendix I: Using Port Scanning Tools to Audit the Security Configuration of Deployment Servers.”
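The idea behind a TCP port scan can be sketched in a few lines: attempt a connection to each port of interest and record which ports accept. This is only an illustration of the technique, not a replacement for the dedicated scanning tools the appendix describes; the host and port used here are local stand-ins.

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                found.append(port)
    return found

# Demonstration: open a local listener on an ephemeral port, then scan for it.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

result = open_ports("127.0.0.1", [port])
listener.close()
```

A real audit compares the list of listening ports against the set of services the server is expected to run; anything unexpected warrants investigation.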

Protecting Windows PE and Client Deployment Scripts

Most organizations use Microsoft Windows Preinstallation Environment (Windows PE) during the client deployment process. Like any software, Windows PE can contain vulnerabilities. As a result, plan to keep Windows PE updated with the latest security updates, and thoroughly test each updated Windows PE image to verify that the updates have not affected its functionality. Additionally, consider using network security to protect Windows PE during the deployment process, when clients are connected to the network. Windows PE 2.0 and later include Windows Firewall to further reduce the risk of network attacks. If possible, start Windows PE only when clients are connected to a network segment that has extremely limited access.

Consider security when developing scripts that will run within Windows PE. Whenever possible, avoid adding user credentials in clear text directly to the script. If Security feature team members must store credentials in a script, they should store them with reversible encryption to prevent a casual observer from identifying the user name and password for an account. Additionally, team members must protect the scripts from unauthorized access by using file and share permissions. Finally, use code reviews for Windows PE scripts just as for a custom application. Code reviews can prevent a single staff member from adding malicious code to a script.

For more information about Windows PE, visit the Microsoft Volume Licensing site for Windows PE at http://www.microsoft.com/licensing/programs/sa/support/winpe.mspx.

Other Infrastructure Security Considerations

Other infrastructure security considerations include:

  • Reducing the risk of clients being exploited between the time they are deployed and the time they are up-to-date with the latest security updates.

  • Designing the network infrastructure to protect both newly deployed clients and the deployment infrastructure.

  • Protecting the integrity of images so that they do not accidentally or intentionally become infected with malicious software, backdoors, or weakened security.

  • Hardening servers that participate in the deployment infrastructure, such as Windows Deployment Services (Windows DS) servers. For information about planning security for RIS servers, read “Planning RIS Network Security” at http://www.microsoft.com/technet/prodtechnol/windowsserver2003/library/DepKit/f5de346f-2193-4eb3-9635-6d24ac005028.mspx in the Windows Server 2003 Help and Support Center.

  • Replicating operating-system images to different distribution points in the enterprise while minimizing the risk that one of the images will be maliciously modified or abused. For more information, read “Planning File Server Security” at http://www.microsoft.com/technet/prodtechnol/windowsserver2003/library/DepKit/ad7a535a-c663-4f12-b5b2-4c11ac28c0b0.mspx in the Windows Server 2003 Deployment Guide. If using Distributed File System (DFS) and File Replication Service (FRS), pay particular attention to the subsection titled “Planning DFS and FRS Security” at http://www.microsoft.com/technet/prodtechnol/windowsserver2003/library/DepKit/09f11266-8675-4dd6-8848-a5dd6c383967.mspx.

  • Auditing and monitoring deployment infrastructure servers to detect and trace changes that might affect security.

  • Protecting sensitive files that might include passwords, such as:

    • Unattend.xml. Could contain the local Administrator password and potentially credentials to join the domain. (Applies to Windows Vista only.)

    • Unattend.txt. Could contain the local Administrator password. (Applies to Windows XP only.)

    • Sysprep.inf. Could contain the local Administrator password and potentially credentials to join the domain.

    • Builds.xml. Could contain the local Administrator password.

    • CustomSettings.ini. May contain passwords if configured by an administrator.

  • Protecting profile and data security, especially when using the USMT.

  • Preventing an authorized user from making malicious changes to the deployment infrastructure or the desktop client platform.
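One practical control for the image-integrity and replication items above is to record a cryptographic hash of each image when it is published and verify that hash at every distribution point before the image is used. The following is a minimal sketch of the idea; the temporary file stands in for an operating-system image:

```python
import hashlib
import os
import tempfile

def file_sha256(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demonstration with a temporary file standing in for an image.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"sample image contents")
    image_path = f.name

recorded = file_sha256(image_path)              # hash recorded when the image is published
verified = file_sha256(image_path) == recorded  # re-check before deployment
os.remove(image_path)
```

Store the recorded hashes separately from the images themselves (for example, on a server with different administrators) so that an attacker who can modify an image cannot also update its hash.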

Milestone: Migration Plan Accepted

Milestones are synchronization points for the overall solution. For more information, see the Plan, Build, and Deploy Guide.

At this milestone, the Security feature team has chosen a strategy for policies and a security level for the organization. This milestone requires the deliverables listed in Table 3.

Table 3. Migration Plan Accepted Milestone Deliverables

| Deliverable ID | Description |
| --- | --- |
| Policy strategy | A determination and recommendation about whether to use centralized GPOs or local GPOs |
| Security level | A recommendation about how to configure security settings for client computers, including choosing a base security template |
| Windows Firewall configuration | A recommended configuration for Windows Firewall |
