Software Testing Documentation

First, define the minimum necessary software testing documentation that must be
created, and do not forget the KISS principle (an abbreviation of
"Keep It Simple, Stupid") as a rule for all testing processes.
Dictionary of Software Testing Documentation

Test Documentation. (IEEE) Documentation describing plans for, or results of, the testing of a system or component. Types include test case specification, test incident report, test log, test plan, test procedure, test report.

Software Testing documentation, or Test Deliverables, may consist of the following documents:
  • Master Test Plan (sometimes separate test planning documents are written: a Unit test plan, an Integration test plan, a System test plan, and an Acceptance test plan)
    Definition of Test Plan. A high-level document that defines a software
    testing project so that it can be properly measured and controlled. It
    defines the test strategy and organizes elements of the test life cycle,
    including resource requirements, project schedule, and test requirements.
  • Test case design (see a sample template on the articles page of this site)
    Definition of Test Case. A set of test inputs, executions, and
    expected results developed for a particular objective.
  • Test procedures
    Definition of Test Procedure. A document providing detailed
    instructions for the [manual] execution of one or more test cases. [BS7925-1]
    Often called a manual test script.
  • Test logs
    Definition of Test Log. A chronological record of all relevant
    details about the execution of a test. [IEEE]
  • Test data
    Definition of Test Data. The actual (set of) values used in the test
    or that are necessary to execute the test. [Daniel J. Mosley, 2002]
  • Test summary report
  • Automated test scripts
  • Incident reports
  • Incident log

As a minimum, however, you must have a test strategy, test cases, and a test summary report.

A sample of Master Software Test Plan document contents:

    1. Introduction
    1.1. Purpose
    1.2. Background
    1.3. Scope
    1.4. Project Identification
    2. Software Structure
    2.1. Software Risk Issues
    3. Test Requirements
    3.1. Features Not to Test
    3.2. Metrics
    4. Test Strategy
    4.1. Test Cycles
    4.2. Planning Risks and Contingencies
    5.1. Testing Types
    5.1.1. Functional Testing
    5.1.2. User Interface Testing
    5.1.3. Configuration Testing
    5.1.4. Installation Testing
    5.1.5. Volume Testing
    5.1.6. Performance Testing
    5.2. Tools
    6. Resources
    6.1. Staffing
    6.2. Training Needs
    7. Project Milestones
    8. Deliverables
    8.1. Test Assets
    8.2. Exit Criteria
    8.3. Test Logs and Defect Reporting
    9. References

A good test strategy is the most important of these documents and can in some cases replace all other test plan documents. A sample testing strategy, from James Bach's presentation "Test Strategy: What is it? What does it look like?":

The purpose of a test strategy is to clarify the major tasks and challenges of the test project. Our test strategy will consist of the following general test tasks:
    - Understand the decision algorithm and generate a parallel decision
      analyzer, using Perl or Excel, that will function as a reference oracle
      for high-volume testing of the app.
    - Create a means to generate and apply large numbers of decision scenarios
      to the product. This will be done either through the use of a GUI test
      automation system, if practical, or through a special test facility built
      into the product (if development is able to provide that), or through the
      direct generation of DecideRight scenario files that would be loaded into
      the product during test.
    - Review the documentation, and the design of the user interface and
      functionality, for sensitivity to user error that could result in a
      reasonable misunderstanding of decision parameters, analysis, or
      suggestions.
    - Test with decision scenarios that are near the limit of complexity
      allowed by the product. (We will investigate creating these scenarios
      automatically.)
    - Compare complex scenarios (automatically, if practical).
    - Test the product for the risk of silent failures or corruptions in the
      decision analysis.
    - Using requirements documentation, user documentation, or by exploring
      the product, create an outline of product elements and use it to guide
      user-level capability and reliability testing of the product.

The principal issues in executing this test strategy are as follows:
    - The difficulty of understanding and simulating the decision algorithm.
    - The risk of coincidental failure of both the simulation and the product.
    - The difficulty of automating decision tests.
    You can also write a separate document for the software testing strategy. A sample of Test Strategy document contents:
    1.	INTRODUCTION	
    1.1	PURPOSE	
    1.2	FUNCTIONAL OVERVIEW	
    1.3	CRITICAL SUCCESS FACTOR	
    1.4	TESTING SCOPE (TBD)	
               Inclusions	
               Exclusions	
    1.5	TEST COMPLETION CRITERIA	
    2.	TIMEFRAME	
    3.	RESOURCES	
    3.1	TESTING TEAM SETUP	
    3.2	HARDWARE REQUIREMENTS	
    3.3	SOFTWARE REQUIREMENTS	
    5.	APPLICATION TESTING RISKS PROFILE	
    6.	TEST APPROACH	
    6.1	STRATEGIES	
    6.2	GENERAL TEST OBJECTIVES:	
    6.3	APPLICATION FUNCTIONALITY	
    6.4	APPLICATION INTERFACES	
    6.5	TESTING TYPES	
    6.5.1	Stability	
    6.5.2	System	
    6.5.3	Regression	
    6.5.4	Installation	
    6.5.5	Recovery	
    6.5.6	Configuration	
    6.5.7	Security	
    7.	BUSINESS AREAS FOR SYSTEM TEST	
    8.	TEST PREPARATION	
    8.1	TEST CASE DEVELOPMENT	
    8.2	TEST DATA SETUP	
    8.3	TEST ENVIRONMENT	
    8.3.1	Database Restoration Strategies.	
    9.	TEST EXECUTION	
    9.1	TEST EXECUTION PLANNING	
    9.2	TEST EXECUTION DOCUMENTATION	
    9.3	PROBLEM REPORTING	
    10.	STATUS REPORTING	
    10.1	TEST EXECUTION PROCESS	
    10.2	PROBLEM STATUS	
    11.	HANDOVER FOR USER ACCEPTANCE TEST TEAM	
    12.	DELIVERABLES	
    13.	APPROVALS	
    14.	APPENDIXES	
    14.1	APPENDIX A (BUSINESS PROCESS RISK ASSESSMENT)	
    14.2	APPENDIX B (TEST DATA SETUP)	
    14.3	APPENDIX C (TEST CASE TEMPLATE)	
    14.4	APPENDIX D (PROBLEM TRACKING PROCESS)	
    

    Sample of Test Evaluation Report document contents:
    1. Objectives	
    2. Scope	
    3. References
    4. Introduction	
    5. Test Coverage	
    6. Code Coverage	
    7. Suggested Actions	
    8. Diagrams	
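The Test Coverage and Code Coverage sections of such a report usually come down to simple ratios. A minimal sketch of computing the test-coverage figure in Python (the requirement IDs and mapping below are hypothetical illustrations, not from any real project):

```python
# Sketch: computing the Test Coverage figure for a test evaluation
# report as the fraction of requirements exercised by at least one test.

def requirement_coverage(requirements, tested):
    """Return the fraction of requirements that appear in the tested set."""
    covered = [r for r in requirements if r in tested]
    return len(covered) / len(requirements)

# Hypothetical requirements and the subset traced to executed test cases.
requirements = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
tested = {"REQ-1", "REQ-3", "REQ-4"}

print(f"Test coverage: {requirement_coverage(requirements, tested):.0%}")
# -> Test coverage: 75%
```

Code coverage, by contrast, is normally measured by instrumentation tooling rather than computed by hand; the report would quote the tool's statement or branch percentages.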
    

    Sample of QA plan document contents: 
    
    1. INTRODUCTION	
    1.1. OVERVIEW OF PROJECT X
    1.2. PURPOSE OF THIS DOCUMENT	
    1.3. FORMAL REVIEWING	
    1.4. OBJECTIVES OF SYSTEM TEST	
    1.4.1. QUALITY ASSURANCE INVOLVEMENT	
    2. SCOPE AND OBJECTIVES	
    2.1. SCOPE OF TEST APPROACH - SYSTEM FUNCTIONS	
    2.1.1. INCLUSIONS	
    2.1.2. EXCLUSIONS	
    2.2. TESTING PROCESS	
    2.3. TESTING SCOPE	
    2.3.1. FUNCTIONAL TESTING	
    2.3.2. INTEGRATION TESTING	
    2.3.3. PERFORMANCE TESTING	
    2.3.4. REGRESSION TESTING	
    2.3.5. LOAD/STRESS TESTING	
    2.3.6. BUSINESS (USER) ACCEPTANCE TEST	
    2.4. BUILD TESTING	
    2.4.1. ENTRANCE CRITERIA	
    2.4.2. EXIT CRITERIA	
    3. TEST PHASES AND CYCLES	
    UNIT TESTING (CONDUCTED BY DEVELOPMENT): 
    INTEGRATION/FUNCTIONALITY TESTING: 
    REGRESSION TESTING: 
    NEGATIVE / POSITIVE TESTING: 
    AD HOC TESTING:
    PERFORMANCE TESTING: 
    3.1. ORGANIZATION OF SYSTEM TESTING CYCLES	
    3.2. SOFTWARE DELIVERY	
    3.3. FORMAL REVIEWING	
    4. SYSTEM TEST SCHEDULE	
    5. RESOURCES - TESTING TEAM	
    5.1. HUMAN	
    5.2. HARDWARE	
    HARDWARE COMPONENTS REQUIRED	
    5.3. SOFTWARE TEST ENVIRONMENT SOFTWARE	
    ERROR MEASUREMENT SYSTEM	
    6. ROLES AND RESPONSIBILITIES	
    6.1. MANAGEMENT TEAM	
    6.2. TESTING TEAM SETUP	
    6.3. BUSINESS TEAM	
    6.4. DEVELOPMENT TEAM	
    7. ERROR MANAGEMENT & CONFIGURATION MANAGEMENT	
    8. STATUS REPORTING	
    8.1. STATUS REPORTING	
    9. ISSUES, RISKS AND ASSUMPTIONS	
    9.1. ISSUES/RISKS	
    9.2. ASSUMPTIONS	
    10. FORMAL SIGNOFF	
    11. ERROR REVIEW	
    11.1. PURPOSE OF ERROR REVIEW TEAM.	
    11.2. ERROR REVIEW TEAM MEETING AGENDA.	
    11.3. CLASSIFICATION OF BUGS	
    11.4. PROCEDURE FOR MAINTENANCE OF ERROR MANAGEMENT SYSTEM.	
    11.5. QUALITY ASSURANCE MEASURES	
    (I) DATES.	
    (II) EFFORT.	
    (III) VOLUME.	
    (IV) QUALITY.	
    (V) TURNAROUND.	
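Measures such as volume, quality, and turnaround in section 11.5 are typically derived from the error management system's records. A small Python sketch under that assumption (the defect records and field names are hypothetical):

```python
from datetime import date

# Hypothetical defect records as they might be exported from an
# error management system.
defects = [
    {"id": 1, "severity": "high", "opened": date(2005, 3, 1), "closed": date(2005, 3, 4)},
    {"id": 2, "severity": "low",  "opened": date(2005, 3, 2), "closed": date(2005, 3, 3)},
    {"id": 3, "severity": "high", "opened": date(2005, 3, 5), "closed": date(2005, 3, 10)},
]

volume = len(defects)                                             # (iii) volume
high_severity = sum(d["severity"] == "high" for d in defects)     # (iv) quality proxy
turnaround = sum((d["closed"] - d["opened"]).days
                 for d in defects) / volume                       # (v) avg days to close

print(f"volume={volume}, high-severity={high_severity}, avg turnaround={turnaround:.1f} days")
```

The same records also supply the dates and effort figures, so one export can feed the whole measures section of the status report.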
    

     User Acceptance Test (UAT) Plan Table of Contents: 
    
    1.	INTRODUCTION	
    1.1	PURPOSE	
    1.2	FUNCTIONAL OVERVIEW	
    1.3	CRITICAL SUCCESS FACTORS	
    1.4	UAT SCOPE	
    1.5	TEST COMPLETION CRITERIA	
    2.	TIMEFRAME	
    3.	RESOURCES	
    3.1	TESTING TEAM	
    3.2	HARDWARE TESTING REQUIREMENTS	
    3.3	SOFTWARE TESTING REQUIREMENTS	
    4.	TEST APPROACH	
    4.1	TEST STRATEGY	
    4.2	GENERAL TEST OBJECTIVES:	
    4.3	BUSINESS AREAS FOR SYSTEM TEST	
    4.4	APPLICATION INTERFACES	
    5.	TEST PREPARATION	
    5.1	TEST CASE DEVELOPMENT	
    5.2	TEST DATA SETUP	
    5.3	TEST ENVIRONMENT	
    6.	UAT EXECUTION	
    6.1	PLANNING UAT EXECUTION	
    6.2	TEST EXECUTION DOCUMENTATION	
    6.3	ISSUE REPORTING	
    7.	HANDOVER FOR UAT ACCEPTANCE COMMITTEE	
    8.	ACCEPTANCE COMMITTEE	
    9.	DELIVERABLES	
    10.	APPROVALS	
    11.	APPENDIXES	
    11.1	APPENDIX A (TEST CASE TEMPLATE)	
    11.2	APPENDIX B (SEVERITY STRATEGY)	
    11.3	APPENDIX C (ISSUE LOG)	
    

    Software Risk management document Contents: 
    
    1. Introduction	
    2. Terminology.	
    3. Risk Sources	
    4. Understanding the Risk	
    5. Risk Management Process Flow
    5.1	Identifying the Risk	
    5.2	Analyze Risk	
    5.3	Risk Planning	
    5.4	Risk Tracking	
    5.5	Risk Controlling	
    5.6	Retiring Risks	
    6. Status Strategy:	
    7. Process Dependencies	
    8. Process Summary	
    9. Approvals	
    10. Appendixes	
    10.1 Appendix A 'Top 10 List template'.	
    
    Terminology.
    
    -	Risk is the possibility, not the certainty, of suffering a loss.
    The loss could be anything from diminished quality of an end product
    to increased cost, missed deadlines, or project failure. Because risk
    is a fundamental ingredient of opportunity, it is not inherently bad,
    but it is inherent in every project. Successful teams deal with risk
    by recognizing and minimizing uncertainty.
    -	Risk Statement -- Captures the nature of the risk
    -	Risk Probability -- Describes the likelihood that the risk will occur
    -	Risk Severity -- Defines the impact of the risk
    -	Risk Exposure -- Quantifies the overall threat
    -	Mitigation Plans -- Describe the effort to prevent or minimize the risk
    -	Contingency Plans and Triggers -- Describe what to do if the risk
    occurs and when to do it
    -	Risk Ownership -- Identifies who is responsible for monitoring the risk
    -	Risk Status -- Names the risk's current stage
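Risk Exposure is commonly quantified as probability times severity (a common convention, not stated in the source). A minimal Python sketch of ranking risks this way to build the "Top 10 List" of Appendix A (the risks shown are hypothetical examples):

```python
# Sketch: quantifying risk exposure as probability x severity and
# ranking risks for a "Top 10" list. Probability is 0..1; severity
# is a 1..10 impact scale. All example risks are hypothetical.
risks = [
    {"statement": "Key tester leaves mid-project",          "probability": 0.2, "severity": 8},
    {"statement": "Test environment delivered late",        "probability": 0.6, "severity": 5},
    {"statement": "Requirements change after test design",  "probability": 0.5, "severity": 7},
]

for r in risks:
    r["exposure"] = r["probability"] * r["severity"]

# Highest exposure first -- the basis of the Top 10 risk list.
for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{r["exposure"]:.1f}  {r["statement"]}')
```

Each ranked entry would then carry its mitigation plan, contingency trigger, owner, and status from the terminology above.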
    

     Software Load Test Plan Table of Contents 
    1.	Introduction	
    1.1	Purpose	
    1.2	Background	
    1.3	Scope	
    1.4	Project Identification	
    2.	Test Requirements	
    3.	Test Strategy	
    3.1	Test objective	
    3.2	Type of test and type of virtual user	
    3.3	Test Approaches and Strategies	
    3.3.1	System analysis	
    3.3.2	Define the detailed testing check list	
    3.3.3	Developing Test Scripts
    3.3.4	Creating test scenario	
    3.3.5	Monitoring Performance	
    3.3.6	Analyzing test result	
    3.4	Other Considerations	
    4.	Resources	
    4.1	Workers	
    4.2	System	
    5.	Project Milestones	
    6.	Deliverables	
    6.1	Test Configuration	
    6.2	Test Logs	
    6.3	Test Reports	
    

    See the Project Evaluation checklist.

    If you need more information about software testing documentation, please search stickyminds.com, the most comprehensive online resource for helping you produce better software.


    Bibliography:
    IEEE 829, Standard for Software Test Documentation
    IEEE 730, Standard for Software Quality Assurance Plans
    MIL-STD-498, Software Development and Documentation
    What is MIL-STD-498?
    MIL-STD-498 is the DoD's software development standard.
    It was developed with four primary objectives:
  • Merge DOD-STD-2167A, used for weapon systems, with DOD-STD-7935A, used for automated information systems, creating a single software development standard for DoD
  • Resolve issues raised in the use of these standards
  • Ensure compatibility with current DoD directives, instructions, and other standards
  • Provide a basis for U.S. implementation of ISO/IEC 12207, Software Life Cycle Processes
    The MIL-STD-498 package consists of the standard and 22 Data Item Descriptions (DIDs).
    You can download a free copy.


    © 2005 Alex Samurin http://www.geocities.com/xtremetesting/ and © eXtremeSoftwareTesting.com