Creating a Test Plan

Defining Testing Scope and Objectives

In the scope and objectives section of the test plan, the testing team describes specifically what the testing is intended to accomplish. For example, the objective of one team was to migrate the Microsoft® Windows NT® Server 4.0 operating system environment to a Windows Server 2003 environment, component by component, keeping the access control lists (ACLs) and Exchange permissions intact. Another team's objective was to provide a means to measure network traffic and observe server performance during specific directory service tasks. You also need to define the scope of your testing by identifying what you will test and what you will not. For example, you might limit your testing of client computer hardware to the minimum supported configurations or to the standard configurations.

Defining Testing Methodology

Describe the general methodology that your team will use for testing. For example, your approach to testing schema changes might be to configure an isolated domain in the test lab where schema changes can be applied without affecting other lab tests. This section of the test plan might address the following:

• The domain architecture used for testing
• The tools and techniques used to conduct the tests or to measure results
• Automated techniques you plan to use during testing

Identifying Required Resources

Identify the resources required for the test lab:

Hardware. List the hardware that you need to conduct tests. For example, identify the standard configurations you plan to support for client computers. Include components such as video cards, modems, and external drives.

Software. List the software that you need to verify the compatibility of your applications with Windows Server 2003. For example, include Microsoft® Systems Management Server (SMS) or other server-based products that you need for testing.
For more information about software required for application compatibility testing, see "Planning and Testing for Application Deployment" in this book.

Databases. Include databases that you need to prepare for testing applications. Also include a description of the resources that you need to populate the databases, such as personnel and business data.

Personnel. Identify the number of testers you need and the skill level required. Include consultants and other support personnel, as necessary.

Training. Specify the Windows XP Professional or Windows Server 2003 training that your testers need to complete before testing their assigned applications or technologies.

Tools. Include all tools or scripts that you need to automate testing and to track test results. For example, if you do not have a second test lab that you can use for testing wide area network (WAN) links, include link simulators. For more information about developing a test result tracking system, see "Developing an Incident-Tracking System" later in this chapter.

Identifying the Features and Functions to Test

List all the features, or aspects of features, that need to be tested. This part of the test plan describes what to test, not how to test it. The following is an example of a feature test description:

• Test 1: Trust retention. Description: All trusts to and from a domain should be retained when the domain controllers are upgraded to Windows Server 2003. Use the Domain Tree Manager to view the trusts. If the trusts do not appear, the test failed.

Note that the description does not include instructions on how to perform the test. Include tests that verify or address:

• The functionality of each feature and service that you will implement.
• Interoperability with existing components and systems in the production environment, both during the phase-in period, when there is a mix of old functionality and new Windows Server 2003 functionality, and after the Windows Server 2003 environment has been rolled out.
• Hardware and driver compatibility for every type of client computer that will be running Windows XP Professional.
• Application compatibility for every application that will run on Windows XP Professional.
• Application compatibility for every server application that will run on Windows Server 2003.
• Optimization of configurations, such as those for standardized desktops on client computers.

In addition, list:

• Baselines (a range of measurements, derived from performance monitoring, that represents acceptable performance under typical conditions) for performance monitoring.
• Baselines and stress tests for capacity planning.
• Procedures for deployment and post-deployment administration, such as procedures for upgrading a client computer and for backing out of a faulty rollout process.
• Required tools and utilities.

Identifying Risk Factors

Describe the risk factors that could prevent the successful completion of all required tests. For example, you might find that the test lab is behind schedule, that required hardware or software is unavailable, or that testers are working on other projects or need additional training. After you have identified the risk factors, decide what you will do to avoid or mitigate each risk.

Establishing a Testing Schedule

Draft a preliminary schedule that includes each test listed in the test plan. The schedule can help you coordinate test lab use among subteams. Assign a team member, ideally the test lab manager if your team has one, to maintain and update the lab schedule. Having an up-to-date schedule is critical when unscheduled lab requests are submitted.
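The performance baseline listed above is defined as a range of measurements representing acceptable performance under typical conditions. One simple way to derive such a range from monitoring samples is sketched below; the CPU samples and the mean-plus-or-minus-two-standard-deviations band are illustrative assumptions, not a method this chapter prescribes.

```python
import statistics

def baseline(samples, k=2.0):
    """Return a (low, high) acceptable range: mean +/- k standard deviations."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)  # sample standard deviation
    return (mean - k * sd, mean + k * sd)

# Hypothetical CPU utilization (%) sampled under typical load.
cpu_samples = [38, 42, 40, 45, 39, 41, 43, 40]
low, high = baseline(cpu_samples)
print(f"acceptable range: {low:.1f}% - {high:.1f}%")
```

During later test passes, a measurement outside the recorded range flags a result worth investigating.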
Designing Test Cases

A test case is a detailed procedure that fully tests a feature or an aspect of a feature. Whereas the test plan describes what to test, a test case describes how to perform a particular test. You need to develop a test case for each test listed in the test plan. Figure 2.10 illustrates the point at which test case design occurs in the lab development and testing process.

Figure 2.10 Designing Test Cases

A test case includes:

• The purpose of the test.
• Special hardware requirements, such as a modem.
• Special software requirements, such as a tool.
• Specific setup or configuration requirements.
• A description of how to perform the test.
• The expected results or success criteria for the test.

Test cases should be written by a team member who understands the function or technology being tested, and each test case should be submitted for peer review. Organizations take a variety of approaches to documenting test cases; these range from developing detailed, recipe-like steps to writing general descriptions. In detailed test cases, the steps describe exactly how to perform the test. In descriptive test cases, the tester decides at the time of the test how to perform the test and what data to use. Most organizations prefer detailed test cases, because determining pass or fail is usually easier with this type of case. In addition, detailed test cases are reproducible and are easier to automate than descriptive test cases. This is particularly important if you plan to compare the results of tests over time, such as when you are optimizing configurations. Detailed test cases are, however, more time-consuming to develop and maintain. On the other hand, test cases that are open to interpretation are not repeatable and can require debugging, consuming time that would be better spent on testing. Table 2.1 provides an example of the first few steps of a detailed test case.
Table 2.1 Sample Detailed Test Case

Step | Procedure | Success Criteria | Outcome
1 | Log off the server, and return to the net logon screen. | None. |
2 | Click the domain list to open it. | The local server name does not appear in the list. |
3 | Click the domain list to open it. | The root domain appears in the list. |
4 | Log on to the server using an account with administrative credentials. | The account logs on to the server without errors. |

When planning your tests, remember that it is not feasible to test everything. Instead of trying to test every combination, prioritize your testing so that you perform the most important tests first: those that focus on the areas that present the greatest risk or have the greatest probability of a problem occurring. For example, you might choose to test the slowest client computer, the busiest server, or the least reliable network link. Then, if time allows, you can perform lower-priority tests.
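Because a detailed test case pairs each step with an explicit success criterion, it lends itself to automation. The sketch below expresses the first three steps of Table 2.1 as checkable predicates; the simulated lab state, server name, domain names, and check functions are all hypothetical stand-ins, not part of the chapter's procedure.

```python
# A detailed test case as data: each step pairs a procedure with a success
# criterion, so pass/fail is unambiguous and the run is reproducible.
# Step texts follow Table 2.1; the lambdas and state are assumptions.
steps = [
    ("Log off the server, and return to the net logon screen",
     lambda state: True),  # success criteria: none
    ("Click the domain list to open it",
     lambda state: state["local_server"] not in state["domains"]),
    ("Click the domain list to open it",
     lambda state: state["root_domain"] in state["domains"]),
]

# Hypothetical lab state after the upgrade.
state = {
    "local_server": "SRV01",
    "root_domain": "corp.example.com",
    "domains": ["corp.example.com", "child.corp.example.com"],
}

outcomes = []
for number, (procedure, passed) in enumerate(steps, start=1):
    outcome = "pass" if passed(state) else "fail"
    outcomes.append(outcome)
    print(f"Step {number}: {procedure} -> {outcome}")
```

Recording each step's result as it runs fills in the Outcome column of the table directly, which also makes results comparable across repeated test passes.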
Ramdas Add
Friday, 10 February 2012