Figure 2 shows the primary activities that occur during the Planning Phase. These activities and deliverables are crucial to the success of the testing process as well as to the validation of the BDD 2007 implementation in the test lab environment. The Test feature team plans its test activities and determines its strategies in part by examining the interaction among and operational effectiveness of the organization, its users, and internal IT processes as well as the business goals defined for the BDD 2007 implementation.
Figure 2. Activities during the Planning Phase
The Test feature team works closely with the various feature teams to provide objective validation of the BDD 2007 implementation and its components. The primary deliverable for the Test feature team during the Planning Phase is the test plan, which interacts with and affects three key elements of the BDD 2007 implementation:
The test plan influences the overall BDD 2007 project plan.
The test schedule, which is included in the test plan, affects the BDD 2007 project schedule.
The test lab requirements, which are included in the test plan, factor into the hardware and facilities plan; these requirements also influence the BDD 2007 budget.
The various components that make up the test plan are not prepared in isolation. When developing the test plan, the Test feature team must collaborate with other BDD 2007 implementation teams and use information gained from multiple sources, including:
Documentation provided with BDD 2007.
Development plan for the BDD 2007 project.
Functional specification for the BDD 2007 project.
The Test feature team focuses on testing how the process and technology components operate as a solution in the test lab environment. This approach considers the broad business context while testing the BDD 2007 implementation’s build process, technical components, features, and functionality before deployment into the production environment. The result of this approach is a golden, or fully tested, solution including images and software packages that can be deployed to the computers in the organization.
Roles and Responsibilities
All six role clusters from the MSF Team Model play a role in the Planning Phase of the initiative. Table 1 lists those roles and defines the focus areas for each role cluster.
For more information about MSF Role Clusters, see Microsoft Solutions Framework at http://www.microsoft.com/technet/itsolutions/msf/default.mspx.
Table 1. Roles and Responsibilities During the Planning Phase
Test Plan Considerations
When preparing the test plan, Test feature team members must take several aspects of the implementation process into consideration, including:
The test types required for the different stages of solution implementation and, specifically, the Test feature team's responsibilities in each test type.
The strategy used for testing in each test type.
The various team roles for the Test feature team members.
The required prerequisite activities for each team member.
Each aspect is covered in more detail below. Once again, this information is provided as a guide to be incorporated into the organization’s existing testing practices.
The preparation of complex technological solutions involves several types of tests. Each type has its own requirements and testing strategy. The solution becomes more refined as it progresses from test type to test type until it has been thoroughly tested and is ready for introduction into the production environment.
The recommended test types for BDD 2007 include:
Unit testing. The first test type focuses on the analysis of a single solution component. As they prepare to build the overall solution, team members from each feature team begin analyzing the components for which they are responsible. At this point, they often install these components in isolated environments to validate the components’ capabilities. This test type is often performed by individuals on a single computer.
Functional testing. After the individual teams become more familiar with the technological components for which they are responsible, they move on to functional testing—validation that products and components work as designed. Guidance for this test type is derived from the functional specifications of the overall project.
Integration testing. The next test type integrates each of the various components that make up the solution into one cohesive whole. This testing is performed in the integration or test lab by the Test feature team itself.
Staging testing. Staging testing is an optional test type providing a final validation for the procedures used to implement the solution. Although there may be a margin of error in the preceding test types, there must be no errors here. When a solution is introduced into the production environment, it must be completely error-free to ensure its longevity and long-term supportability.
Pilot and production testing. The final stage of testing often involves the same components used in production. When the teams implement the solution in an actual production environment, they begin with a pilot deployment—targeting a small, representative population of users in the production environment to perform a final validation of all of the implementation strategies and procedures before deploying the system to the entire production environment. This test type focuses more on training, communications, and support strategies than on actual technological strategies, though these are also validated. Pilot is linked to production, because if all goes well, the technologies and components introduced for the pilot program become the components used in full production.
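The graduation from one test type to the next can be pictured as a simple gate: a build advances to the next type only when the current type passes. The sketch below is illustrative only; the type names come from this guide, but the gating logic and all function names are assumptions, not part of BDD 2007.

```python
# Illustrative sketch of the five-type test graduation pipeline.
# The ordered type names come from the guide; everything else is hypothetical.

TEST_TYPES = [
    "unit",
    "functional",
    "integration",
    "staging",             # optional final validation
    "pilot/production",
]

def graduate(build, run_type):
    """Advance a build through each test type in order.

    run_type(build, test_type) -> True if all tests of that type pass.
    Returns the name of the first type that failed, or None if the
    build graduated through every type.
    """
    for test_type in TEST_TYPES:
        if not run_type(build, test_type):
            return test_type   # build must be fixed and re-enter the cycle
    return None                # fully graduated: ready for production
```

For example, `graduate("image-v1", lambda b, t: t != "integration")` stops at `"integration"`, signaling that the build must return to the feature teams before staging can begin.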
Figure 3 shows the different test types and illustrates the cyclical graduation process from type to type.
Figure 3. The five potential test types
The instruction sets—detailed step-by-step directions that outline how to perform an activity—for the preparation of each test type are derived from the information found in each feature team guide in BDD 2007. Each feature team must document these instructions and update them each time issues or errors are corrected. For this process to succeed, the different teams must run test cycles. At various stages in a test cycle, different sets of tests are executed for each of the five test types identified above. This iterative approach to testing provides the means to refine all processes and procedures before introducing them into the production network.
Testing with multiple test cycles ensures that issues found in test cycle N are resolved and regression-tested in test cycle N + 1. This process ensures a high-quality solution and is the justification for the multiple test types that the Test feature team must use.
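The cycle-N/cycle-N + 1 relationship can be sketched as a loop in which each cycle re-runs the previous cycle's issues as regression tests and stops once a cycle finds nothing new. This is a minimal sketch under assumed interfaces, not a BDD 2007 tool.

```python
def run_test_cycles(run_cycle, max_cycles=5):
    """Run iterative test cycles until the solution stabilizes.

    run_cycle(regressions) executes one full cycle, re-testing the
    issues carried over from the previous cycle, and returns the list
    of new issues found. Returns the number of cycles needed to reach
    a clean cycle, or None if the solution never stabilized.
    """
    regressions = []                       # issues from cycle N, re-tested in N + 1
    for n in range(1, max_cycles + 1):
        issues = run_cycle(regressions)
        if not issues:
            return n                       # stable: a cycle found no issues
        regressions = issues               # carry forward for regression testing
    return None                           # did not stabilize within max_cycles
```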
Each test type begins with the production of baseline images of the test computers. Instructions for the creation of these baseline computers are found in the various feature team guides in BDD 2007. As tests are performed and processes graduate from one test type to the next, these baseline images are updated with the findings of the previous type. These updates mean that baseline images improve with each test type. At the end of the complete test cycle, the solution reaches a stable state.
Pass or Fail Criteria
Before the first test execution cycle, the following criteria must be defined to ensure defect prevention and bug resolution:
All test cases must pass with expected results as outlined in the test case workbook.
A test case is considered to have passed if the actual result matches the expected result documented for the test case. An actual result that does not match the expected result must be treated as a failed test case, and a bug must be created with an assigned severity score and priority.
If a test case fails, the solution guidance is not necessarily defective. For example, misinterpretation of product documentation, incomplete documentation, or inaccurate documentation could cause failures. Each failure must be analyzed to discover its cause based on actual results, and the results must be described in project documentation as well as escalated to the correct feature team.
These criteria must be part of the testing plan. They can, of course, be supplemented with other criteria customized to meet the organization’s specific needs.
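The pass/fail criteria above reduce to a single comparison: a case passes only when its actual result matches the documented expected result; any mismatch produces a bug carrying a severity and a priority. A minimal sketch, with illustrative field names:

```python
def evaluate_test_case(case_id, expected, actual, severity, priority):
    """Apply the pass/fail criteria from the test plan.

    A test case passes only if the actual result matches the expected
    result documented in the test case workbook; otherwise the case
    fails and a bug record is created with the given severity and
    priority. Field names here are illustrative.
    """
    if actual == expected:
        return {"case": case_id, "status": "pass"}
    return {
        "case": case_id,
        "status": "fail",
        "bug": {
            "severity": severity,    # how damaging the defect is
            "priority": priority,    # how soon it must be fixed
            "expected": expected,
            "actual": actual,
        },
    }
```

Note that, as the criteria state, a failure does not automatically mean defective solution guidance; the bug record simply captures the mismatch so its cause can be analyzed and escalated.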
Identify Test Case Types
Finally, the test plan must include different types of test cases. Various tests are possible, but in the context of this solution, the following types are often the most meaningful:
Installation testing. These types of tests are used to verify that solution components are installed correctly. The solution components consist of all the code and tools provided with BDD 2007, along with the code and tools already used within the organization. This test case also covers elements such as performance and bandwidth utilization during installation.
Configuration testing. These tests are used to verify that all solution configuration options are present and correctly execute when invoked. For instance, during the deployment of a computer, post-build processes must operate properly. These processes include the application of additional software components as well as the restoration of a user’s personal settings through the Microsoft Windows User State Migration Tool (USMT). Because the solution consists of multiple configurations, ensure that each configuration works as expected.
Functionality testing. This testing provides basic verification of operating system and application functionality from the development and test perspective.
User acceptance testing. This testing ensures that the system as a whole performs as it should under typical user operation. User acceptance testing must also be performed on each application package that the Application Management feature team creates.
Security testing. This testing ensures that the security aspects of the solution are maintained throughout the testing cycle. Security tests include items such as the verification that passwords are not displayed in plain text in any component of the solution (or credentials that are displayed in plain text have single-purpose rights and permissions), logs do not capture credential information, and each team role in the solution, including users, has proper access rights to all the shared folders that make up the solution.
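When assembling the test case workbook, it can help to tag each case with one of these five types and to check that every type is represented in the plan. The following sketch assumes a simple list-of-dictionaries representation of the workbook; nothing here is a BDD 2007 interface.

```python
# The five test case types named in the test plan.
TEST_CASE_TYPES = {
    "installation",
    "configuration",
    "functionality",
    "user acceptance",
    "security",
}

def validate_case(case):
    """Reject a test case tagged with a type the plan does not define."""
    if case.get("type") not in TEST_CASE_TYPES:
        raise ValueError(f"unknown test case type: {case.get('type')!r}")
    return case

def uncovered_types(test_cases):
    """Return the required test case types with no cases yet in the plan."""
    covered = {c["type"] for c in test_cases}
    return TEST_CASE_TYPES - covered
```

A coverage check such as `uncovered_types` gives the test lead an early warning that, for example, no security test cases have been written yet.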
Preparing to Test the BDD 2007 Implementation
Team members are selected during the Envisioning Phase. During the Planning Phase, they must begin preparing for BDD 2007 testing. This testing involves the review of three key documentation sets. The review of these documentation sets also assists in the preparation of the test plan. The documentation sets are:
The BDD 2007 documentation. Specifically, this feature team guide, the Plan, Build, and Deploy Guide, and the Test Cases Workbook. Team members can review the other feature team guides as well, but these three guides are the most crucial.
The development plan for the BDD 2007 implementation must be reviewed in depth.
The functional specification that establishes a baseline for the BDD 2007 features and functionalities to be deployed into the production environment. The solution architect who belongs to the Development Role Cluster prepares this document in the Planning Phase. Members of other role clusters provide input in the preparation of this document.
The Test feature team relies on this documentation to:
Assess test requirements and define prerequisites for undertaking testing.
Estimate the time and resources required to test all the features being developed.
Provide feedback to developers about any specifications in the document that are unclear, ambiguous, or contradictory to prevent incorrect implementations resulting from misunderstanding.
Determine whether any specific solution features would be difficult or impossible to test.
In situations in which a particular feature cannot be tested, the Test feature team’s escalation process must include passing the issue to the core team. The core team then decides whether the feature must be redeveloped or retained and deployed with a release note informing users (IT Operations) that the feature could not be tested. Untested elements must be kept to a strict minimum.
At this stage, functional specification review and sign-off by the Test feature team confirm the team’s ability to test the solution. This sign-off is also a Planning Phase deliverable for the core team. (See the Plan, Build, and Deploy Guide for more information about functional specifications.)
Prepare the Test Plan
The Test feature team is responsible for developing the test plan for the BDD 2007 project. This test plan consists of several components. A sample test plan template is provided in the BDD 2007 documentation set and can be found in C:\Program Files\BDD 2007\Documentation\Job Aids\Test Plan.doc. The components most relevant to the Test feature team are described in the following sections.
The scope for the Test feature team’s activities is determined by the solution requirements and the functional specification. The test scope defines the range and type of testing activities used to validate BDD 2007 technology and processes. In the case of BDD 2007, technology denotes the scripts and tools that the Development feature team created, while processes refers to the methodologies, enabling technologies, and practices that the other BDD 2007 teams use. The test scope must also identify the test types included in the test plan.
Lab Requirements for Each Test Type
The testing goal is to obtain certification for a product to be deployed into the production environment. Certification involves a comprehensive verification of all of the components that make up the product—software packages and operating system images as well as all the components that support deployment of the solution—according to predefined test parameters. When the test lab mirrors the production environment, BDD 2007 systems and applications can be certified with confidence, because the lab test result represents what is to be expected in the production environment.
The Test feature team is responsible for designing the BDD 2007 test lab and ensuring that it accurately represents the production environment. The development plan and Infrastructure Remediation Feature Team Guide assist the team in determining the hardware, software, infrastructure, and facilities required in the test lab environment. Test lab requirements vary according to the types of tests performed.
For example, if the team is conducting feature testing only, it may consider limiting the amount and type of hardware requested for the lab. If, however, the team also plans to conduct stress testing, the lab’s hardware requirements may significantly increase. Virtual machines (VMs) are especially beneficial when testing and can be applied to most test types. For example, unit testing is often performed on the user’s own computer through the use of one or more VMs providing all the functionality required for the test. Functional testing can use the same mechanism or use a dedicated piece of hardware running multiple VMs. Similarly, the integration and staging test types can also rely on VMs for most of their tests. VMs can simulate servers, but deployed images must include at least one physical example of each computer type targeted for the production deployment. This combination ensures that deployed images include all the correct drivers. For more information about running VMs in Virtual Server 2005, see Microsoft Virtual Server 2005 R2 at http://www.microsoft.com/windowsserversystem/virtualserver/default.mspx.
Bug Rating, Reporting, and Tracking
The Test feature team develops the method for reporting and tracking bugs. The mechanism for bug reporting and tracking must include features that allow the Test feature team to assign bugs to the right person or team, prioritize bugs, assign severity numbers to them, reopen closed bugs, link bugs, generate different views of the same bug, and create reports. The Test feature team is also responsible for developing a process for bug triage and creating a schedule to track the status of all test bugs. For more information about bug tracking, see “Appendix: Finalize the Test Lab.”
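The tracking features listed above (assignment, prioritization, severity, linking, and reopening closed bugs) can be captured in a minimal record like the one below. The class, field names, and states are illustrative assumptions, not a BDD 2007 interface; in practice an existing bug tracking system would provide them.

```python
class Bug:
    """Minimal bug record with the tracking features the plan calls for.

    All names and states here are hypothetical, for illustration only.
    """

    def __init__(self, bug_id, title, severity, priority):
        self.bug_id = bug_id
        self.title = title
        self.severity = severity      # e.g. 1 (critical) .. 4 (trivial)
        self.priority = priority      # e.g. 1 (fix first) .. 3 (fix later)
        self.assignee = None          # person or feature team
        self.status = "open"
        self.links = []               # IDs of related bugs

    def assign(self, owner):
        """Route the bug to the right person or feature team."""
        self.assignee = owner

    def link(self, other_bug_id):
        """Record a relationship to another bug."""
        self.links.append(other_bug_id)

    def close(self):
        self.status = "closed"

    def reopen(self):
        # Closed bugs must be reopenable if a later regression cycle
        # shows the issue was not actually fixed.
        if self.status != "closed":
            raise ValueError("only closed bugs can be reopened")
        self.status = "open"
```

Whatever system the team adopts, the triage process then works against these fields: severity and priority drive the triage order, and the status transitions feed the reports that track all test bugs.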
Change control ensures that the core team is aware of and agrees to any changes to lab hardware or software. The Test feature team follows the change control process established by the lead team and used throughout the BDD 2007 project life cycle. The test lead must post the status of hardware and software in the lab as well as the testing schedules, so the various feature teams are aware of all lab activity. The test lead must also have procedures in place to restore the lab to its original state at the completion of the project.
The test schedule is affected to a large extent by the BDD 2007 development schedule. The Test feature team may carry out unit tests of solution modules as the teams release them, or it may conduct only complete system tests after the Computer Imaging System feature team has completed the entire build process document and has created complete deployment images.
Based on its experience, the BDD 2007 Test feature team at Microsoft recommends conducting unofficial unit testing while the build process document is in development. This approach gives the Test feature team enough time to create relevant test cases and become familiar with the solution during the early stages of the Stabilizing Phase. In addition, these unit tests can also be used for base functionality testing.
The test schedule must include, at minimum, the following tasks:
Test environment setup
Preparation of high-level test scenarios
Detailed test case preparation
Number and duration of testing cycles
Risks and Dependencies
The Test feature team typically looks for and assesses the following types of risks, then factors them into the test plan and test schedule:
Lab requirements, as based on the development plan, may exceed the Test feature team’s budget and time allocations.
Test cases may not be completed on time, because their preparation depends on timely input from the Development feature team and the functional specification.
The test lab build may not have been completed by the start of the Stabilizing Phase because of an inability to procure and install all the lab equipment.
The test lab may not properly reflect the production environment. For example, it may not include proper firewall configurations or even all the Group Policy objects (GPOs) found in production. The goal is to make the test lab as similar to production as possible.
Mitigation plans for each of these risks must also be part of the test plan. The above risks are examples of the most common risks, but depending on the organization, others may exist and require contingency plans.
The test plan must include a description of the tools the testers use. This tools list must include the various test scenarios, the test types, and the test cases to be used. The list must make reference to supporting tools, such as the bug tracking system, change control system, and any documentation system to be used, as well as the scheduling tools the team lead uses to control testing schedules.
Milestone: Test Plan Developed and Accepted
Milestones are synchronization points for the overall solution. For more information, see the Plan, Build, and Deploy Guide. At this milestone, shown in Table 2, a test plan is in place that includes a detailed understanding of network bandwidth and server capacity requirements.
Table 2. Deliverables
This document outlines the complete testing approach the Test feature team uses.