Collegiate Sports Paging System

Test Plan

Version 1.0

 

Revision History

Date                Version   Description        Author
October 26, 1999    1.0       Initial version    Context Integration

Introduction

Purpose

This Test Plan document for the Collegiate Sports Paging System supports the following objectives:

  1. Identify existing project information and the software components that should be tested
  2. List the recommended Requirements for Test (high level)
  3. Recommend and describe the testing strategies to be employed
  4. Identify the required resources and provide an estimate of the test efforts
  5. List the deliverable elements of the test project

Background

The Collegiate Sports Paging System provides alphanumeric paging to subscribers when events occur within collegiate sports categories to which they subscribe. Subscribers can then connect to a personalized web site where they can view the stories for which they were paged, as well as other collegiate sports news.

The system consists of three major subsystems hosted on an Application Web Server and interacts with the existing WebNewsOnLine web site as well as the paging gateways. The subsystems are:

    1. Content Management
    2. Paging
    3. Reporting

The system architecture can be depicted as follows:

[System architecture diagram]

Scope

The Collegiate Sports Paging System will be unit tested and system tested. Unit tests will address functional quality, while system testing will address issues of scalability and performance.

The interaction of the subsystems will be tested as follows:

    1. Content Management to Paging
    2. Content Management to Reporting

The following systems interfaces will be tested:

    1. Collegiate Sports Paging System to existing WebNewsOnLine Web Server
    2. Collegiate Sports Paging System to paging gateways

The most critical testing will be load and performance testing. It will be addressed as follows (a minimal load-driver sketch follows this list):

  1. We will create a test scenario that will generate increasing numbers of pages up to 200,000.
  2. We will also create a test scenario that has new content arriving at the system at the rate of one item every 20 seconds.
  3. Lastly, we will simulate increasing concurrent subscriber loads up to 200,000.
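
For illustration, the sketch below shows one way such a ramp-up could be driven from a test harness. It is a minimal Python sketch under stated assumptions: the fetch_personalized_page stub and the scaled-down subscriber steps are hypothetical placeholders, and the real harness would issue requests against the Application Web Server rather than the stub.

```python
# Minimal load-ramp sketch (illustrative only). The target function and the
# ramp steps are assumptions; a real harness would call the Application
# Web Server instead of the stub below.
import time
from concurrent.futures import ThreadPoolExecutor

RAMP_STEPS = [200, 500, 1_000, 5_000]    # scaled-down stand-ins for the 200,000 target

def fetch_personalized_page(subscriber_id: int) -> float:
    """Stub for one subscriber request; replace with a real HTTP call."""
    start = time.perf_counter()
    time.sleep(0.001)                     # simulated server work
    return time.perf_counter() - start

def run_step(concurrent_subscribers: int) -> None:
    with ThreadPoolExecutor(max_workers=min(concurrent_subscribers, 200)) as pool:
        timings = list(pool.map(fetch_personalized_page, range(concurrent_subscribers)))
    print(f"{concurrent_subscribers:>7} subscribers: "
          f"avg {sum(timings) / len(timings):.4f}s, max {max(timings):.4f}s")

if __name__ == "__main__":
    for step in RAMP_STEPS:
        run_step(step)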

Project Identification

The table below identifies the documentation used in developing the test plan and its availability:

Document (and version / date)        Created or Available   Received or Reviewed   Author or Resource    Notes
Vision Document                      Yes                    Yes                    Context Integration
Supplemental Specification           Yes                    Yes                    Context Integration
Use Case Reports                     Yes                    Yes                    Context Integration
Project Plan                         Yes                    Yes                    Context Integration
Design Specifications                No
Prototype                            Yes                    Yes                    Context Integration
Project / Business Risk Assessment   Yes                    Yes                    Context Integration

Requirements for Test

The listing below identifies those items (use cases, functional requirements, and non-functional requirements) that are targets for testing. This list represents what will be tested.

Database Testing

    Verify that subscriber information can be entered and retrieved.

    Verify that content and categories can be inserted and displayed.

    Verify that advertiser profiles and account information can be entered and displayed.

    Verify that subscriber-specific usage information is tracked.
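
As an illustration of the first check above, a database test could be scripted along the lines of the following minimal Python sketch. The in-memory SQLite database and the subscriber table layout are stand-ins invented for the sketch; the project's actual schema and database engine are not specified here.

```python
# Illustrative database check: insert a subscriber record and read it back.
# The table layout and SQLite engine are stand-ins, not the project schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE subscriber (
                    id INTEGER PRIMARY KEY,
                    name TEXT NOT NULL,
                    pager_number TEXT NOT NULL,
                    categories TEXT)""")

conn.execute("INSERT INTO subscriber (name, pager_number, categories) VALUES (?, ?, ?)",
             ("Pat Example", "555-0100", "basketball,football"))
conn.commit()

row = conn.execute("SELECT name, pager_number, categories FROM subscriber "
                   "WHERE name = ?", ("Pat Example",)).fetchone()

# The test passes if the retrieved data matches exactly what was entered.
assert row == ("Pat Example", "555-0100", "basketball,football"), row
print("Subscriber entry/retrieval check passed:", row)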

Functional Testing

    Verify that subscribers see the information for which they have requested paging.

    Verify that pages go to subscribers when content arrives.

    Verify that automatic content insertion works.

    Verify that editor approval causes non-automatic content to be inserted.

    Verify that subscribers who have lapsed subscriptions do not receive pages.

    Verify that content marked as archived is not re-displayed to subscribers.

    Verify that obsolete content is deleted.

    Verify that advertiser reports are accurate.

    Verify that advertiser reports can be received in Word, Excel, or HTML.
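
The following minimal sketch illustrates how one of these functional checks (lapsed subscriptions receive no pages) could be expressed as an automated assertion. The should_page function is a hypothetical stand-in for the Paging subsystem's real decision logic.

```python
# Illustrative functional check: subscribers with lapsed subscriptions
# must not be paged. should_page() is a hypothetical stand-in for the
# Paging subsystem's real decision logic.
from datetime import date

def should_page(subscribed_categories, subscription_end: date,
                content_category: str, today: date) -> bool:
    if today > subscription_end:          # lapsed subscription: never page
        return False
    return content_category in subscribed_categories

today = date(1999, 10, 26)

# Active subscriber, matching category: a page is expected.
assert should_page({"basketball"}, date(2000, 6, 30), "basketball", today) is True

# Lapsed subscriber, matching category: no page may be sent.
assert should_page({"basketball"}, date(1999, 9, 1), "basketball", today) is False

print("Lapsed-subscription paging checks passed.")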

Business Cycle Testing

    None.

User Interface Testing

    Navigate through all use cases, verifying that each UI panel can be easily understood.

    Verify all online Help functions.

    Verify that all screens conform to the WebNewsOnLine standards.

Performance Profiling

    Verify response time of interface to Pager Gateway system.

    Verify response time of interface from existing WebNewsOnLine web server.

    Verify response time when connected using 56Kbps modem.

    Verify response time when connected locally (on the same LAN).
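
A response-time check of this kind could be scripted along the lines of the sketch below. The timed stub and the 2-second threshold are assumptions chosen for illustration; the actual targets and time allocations come from the Supplemental Specification.

```python
# Illustrative response-time measurement: time an operation over several
# trials and compare the results against an assumed threshold. The timed
# stub and the 2.0 s limit are placeholders, not project requirements.
import time
import statistics

THRESHOLD_SECONDS = 2.0     # assumed limit for illustration only
TRIALS = 10

def timed_operation() -> None:
    """Stub standing in for one pager-gateway or web-server round trip."""
    time.sleep(0.05)

samples = []
for _ in range(TRIALS):
    start = time.perf_counter()
    timed_operation()
    samples.append(time.perf_counter() - start)

print(f"mean {statistics.mean(samples):.3f}s, max {max(samples):.3f}s")
assert max(samples) <= THRESHOLD_SECONDS, "response time exceeded threshold"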

Load Testing

    Verify system response with 200 concurrent subscribers.

    Verify system response with 500 concurrent subscribers.

    Verify system response with 1,000 concurrent subscribers.

    Verify system response with 5,000 concurrent subscribers.

    Verify system response with 10,000 concurrent subscribers.

    Verify system response with 50,000 concurrent subscribers.

    Verify system response with 100,000 concurrent subscribers.

    Verify system response with 200,000 concurrent subscribers.

Stress Testing

    None.

Volume Testing

    Verify that pages are sent out within 5 minutes when a single content element arrives.

    Verify that pages are sent out within 5 minutes when content arrives every 20 seconds.
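
As an illustration of how the 5-minute window could be checked, the sketch below feeds simulated content items in at a fixed arrival interval and verifies the delivery delay for each. The dispatch_page stub and the compressed time scale are assumptions; a real run would observe the live paging pipeline.

```python
# Illustrative volume check: content items arrive at a fixed interval and
# each resulting page must go out within a deadline. Times are compressed
# (fractions of a second stand in for minutes) and dispatch_page() is a stub.
import time

ARRIVAL_INTERVAL = 0.2   # stands in for one content item every 20 seconds
DEADLINE = 3.0           # stands in for the 5-minute paging deadline
ITEMS = 5

def dispatch_page(item_id: int) -> None:
    """Stub for the Paging subsystem sending out pages for one item."""
    time.sleep(0.05)

for item_id in range(ITEMS):
    arrived = time.perf_counter()
    dispatch_page(item_id)
    delay = time.perf_counter() - arrived
    assert delay <= DEADLINE, f"item {item_id} paged after {delay:.2f}s"
    print(f"item {item_id}: paged {delay:.2f}s after arrival")
    time.sleep(ARRIVAL_INTERVAL)

print("All simulated items paged within the deadline.")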

Security and Access Control Testing

    Verify that non-subscribers cannot access subscriber-only information.

    Verify that non-editors cannot approve content.

    Verify that advertisers see only their own advertising content.
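
A minimal sketch of how these access rules could be asserted is shown below. The permission table and check_access function are hypothetical; the real tests would exercise the application's actual login and authorization paths.

```python
# Illustrative access-control check: each actor type may reach only the
# functions its role permits. The permission table is an assumption made
# for this sketch, not the system's actual authorization model.
PERMISSIONS = {
    "subscriber": {"view_subscriber_content"},
    "editor":     {"view_subscriber_content", "approve_content"},
    "advertiser": {"view_own_ad_reports"},
    "visitor":    set(),
}

def check_access(role: str, function_name: str) -> bool:
    return function_name in PERMISSIONS.get(role, set())

# Non-subscribers must not reach subscriber-only information.
assert not check_access("visitor", "view_subscriber_content")

# Non-editors must not be able to approve content.
assert not check_access("subscriber", "approve_content")
assert check_access("editor", "approve_content")

# Advertisers see only their own advertising reports.
assert check_access("advertiser", "view_own_ad_reports")
assert not check_access("advertiser", "view_subscriber_content")

print("Access-control checks passed.")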

Failover/Recovery Testing

    None.

Configuration Testing

    Verify operation using Netscape V4.x browser.

    Verify operation using Microsoft Internet Explorer V5.x.

Installation Testing

    None.

Test Strategy

Testing Types

Data and Database Integrity Testing
Test Objective: Ensure database access methods and processes function properly and without data corruption.
Technique:
  • Invoke each database access method and process, seeding each with valid and invalid data (or requests for data).
  • Inspect the database to ensure the data has been populated as intended and all database events occurred properly, or review the returned data to ensure that the correct data was retrieved (for the correct reasons).
Completion Criteria: All database access methods and processes function as designed and without any data corruption.
Special Considerations:
  • Processes should be invoked manually.
  • Small or minimally sized databases (a limited number of records) should be used to increase the visibility of any unacceptable events.
Function Testing
Test Objective: Ensure proper target-of-test functionality, including navigation, data entry, processing, and retrieval.
Technique: Execute each use case, use case flow, or function, using valid and invalid data, to verify the following:
  • The expected results occur when valid data is used.
  • The appropriate error / warning messages are displayed when invalid data is used.
  • Each business rule is properly applied.
Completion Criteria:
  • All planned tests have been executed.
  • All identified defects have been addressed.
Special Considerations: None.
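
One way to organize the valid / invalid data cases described in the technique above is a small table-driven test, sketched below. The validation function and the sample rules are placeholders chosen only to show the pattern, not documented business rules of the system.

```python
# Illustrative table-driven function test: each row pairs an input with the
# expected outcome (accepted, or a specific error message). The validation
# rule shown is a placeholder, not a documented business rule.
def validate_pager_number(value: str):
    """Return (ok, message) for a candidate pager number."""
    if not value:
        return False, "pager number is required"
    if not value.replace("-", "").isdigit():
        return False, "pager number must contain only digits"
    return True, ""

CASES = [
    ("555-0100", True,  ""),                                      # valid data
    ("",         False, "pager number is required"),              # invalid: empty
    ("abc",      False, "pager number must contain only digits"), # invalid: non-numeric
]

for value, expected_ok, expected_msg in CASES:
    ok, msg = validate_pager_number(value)
    assert (ok, msg) == (expected_ok, expected_msg), (value, ok, msg)

print("All valid / invalid data cases behaved as expected.")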
User Interface Testing
Test Objective: Verify the following:
  • Navigation through the target-of-test properly reflects business functions and requirements, including window-to-window and field-to-field navigation and use of access methods (tab keys, mouse movements, accelerator keys).
  • Web objects and characteristics, such as menus, size, position, state, and focus conform to standards.
Technique: Create / modify tests for each window to verify proper navigation and object states for each application window and its objects.
Completion Criteria: Each window is successfully verified to remain consistent with the benchmark version or the applicable standard.
Special Considerations: Not all properties for custom and third party objects can be accessed.
Performance Profiling
Test Objective: Verify performance behaviors for designated transactions or business functions under the following conditions:
  • normal anticipated workload
  • anticipated worst-case workload
Technique: Use Test Procedures developed for Function or Business Cycle Testing.

Modify data files (to increase the number of transactions) or the scripts to increase the number of times each transaction occurs.

Scripts should be run on one machine (best case to benchmark single user, single transaction) and be repeated with multiple clients (virtual or actual, see special considerations below).

Completion Criteria: Single transaction / single user: Successful completion of the test scripts without any failures and within the expected / required time allocation (per transaction).

Multiple transactions / multiple users: Successful completion of the test scripts without any failures and within acceptable time allocation.

Special Considerations: Comprehensive performance testing includes having a "background" workload on the server.

There are several methods that can be used to perform this, including:

  • "Drive transactions" directly to the server, usually in the form of SQL calls.
  • Create "virtual" user load to simulate many (usually several hundred) clients. Remote Terminal Emulation tools are used to accomplish this load. This technique can also be used to load the network with "traffic."
  • Use multiple physical clients, each running test scripts to place a load on the system.

Performance testing should be performed on a dedicated machine or at a dedicated time. This permits full control and accurate measurement.

The databases used for Performance testing should be either actual size, or scaled equally.
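
For the first option above (driving transactions to the server as SQL calls), a background-workload generator could be as simple as the sketch below. SQLite and the invented content table are stand-ins so the sketch runs anywhere; a real script would target the project's database server and schema.

```python
# Illustrative background-workload generator: issue a steady stream of SQL
# inserts while the performance scripts run. SQLite and the content table
# are stand-ins for the actual database server and schema.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE content (id INTEGER PRIMARY KEY, category TEXT, body TEXT)")

DURATION = 2.0      # seconds of background load for this sketch
INTERVAL = 0.05     # one transaction every 50 ms

deadline = time.perf_counter() + DURATION
count = 0
while time.perf_counter() < deadline:
    conn.execute("INSERT INTO content (category, body) VALUES (?, ?)",
                 ("basketball", f"background item {count}"))
    conn.commit()
    count += 1
    time.sleep(INTERVAL)

print(f"Issued {count} background transactions in {DURATION:.0f}s.")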

Load Testing
Test Objective: Verify performance behavior for designated transactions or business cases under varying workload conditions.
Technique: Use tests developed for Function or Business Cycle Testing.

Modify data files (to increase the number of transactions) or the tests to increase the number of times each transaction occurs.

Completion Criteria: Multiple transactions / multiple users: Successful completion of the tests without any failures and within acceptable time allocation.
Special Considerations: Load testing should be performed on a dedicated machine or at a dedicated time. This permits full control and accurate measurement.

The databases used for load testing should be either actual size, or scaled equally.

Volume Testing
Test Objective: Verify that the target-of-test successfully functions under the following high volume scenarios:
  • maximum (actual or physically capable) number of clients connected (or simulated) all performing the same, worst case (performance) business function for an extended period.
  • maximum database size has been reached (actual or scaled) and multiple queries / report transactions are executed simultaneously.
Technique: Use tests developed for Performance Profiling or Load Testing.

Multiple clients should be used, either running the same tests or complementary tests, to produce the worst-case transaction volume / mix for an extended period.

Maximum database size is created (actual, scaled, or filled with representative data) and multiple clients used to run queries / report transactions simultaneously for extended periods.

Completion Criteria: All planned tests have been executed and specified system limits are reached / exceeded without the software or system failing.
Special Considerations: What period of time would be considered an acceptable time for high volume conditions (as noted above)?
Security and Access Control Testing
Test Objective: Application-level Security: Verify that an actor can access only those functions / data for which their user type is provided permissions.

System-level Security: Verify that only those actors with access to the system and application(s) are permitted to access them.

Technique: Application-level: Identify and list each actor type and the functions / data each type has permissions for.

Create tests for each actor type and verify each permission by creating transactions specific to each user actor.

Modify user type and re-run tests for same users. In each case verify those additional functions / data are correctly available or denied.

System-level Access (see special considerations below)

Completion Criteria: For each known actor type, the appropriate functions / data are available and all transactions function as expected, as they did in prior function tests.
Special Considerations: Access to the system must be reviewed / discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration.
Configuration Testing
Test Objective:

Verify that the target-of-test functions properly on the required hardware / software configurations.

Technique:

Use Function Test scripts

Open / close various non-target-of-test related software, such as the Microsoft applications, Excel and Word, either as part of the test or prior to the start of the test.

Execute selected transactions to simulate actors interacting with the target-of-test and the non-target-of-test software.

Repeat the above process, minimizing the available conventional memory on the client.

Completion Criteria:

For each combination of the target-of-test and non-target-of-test software, all transactions are successfully completed without failure.

Special Considerations:

What non-target-of-test software is needed, and is it available and accessible on the desktop?

What applications are typically used?

What data are the applications using (for example, a large spreadsheet opened in Excel, a 100-page document in Word)?

The entire system configuration (network software, network servers, databases, etc.) should also be documented as part of this test.

Tools

The following tools will be employed for this project:

 

Tool                                     Version
Defect Tracking: Project HomePage
Project Management: Microsoft Project


Resources

This section presents the recommended resources for the Collegiate Sports Paging System test effort, their main responsibilities, and their knowledge or skill set.

Workers

This table shows the staffing assumptions for the project.

Human Resources
Worker: Test Manager / Test Project Manager
Minimum Resources Recommended: 1 (the Collegiate Sports Paging System project manager)
Specific Responsibilities / Comments: Provides management oversight

Responsibilities:

  • Provide technical direction
  • Acquire appropriate resources
  • Management reporting
Worker: Test Designer
Minimum Resources Recommended: 1
Specific Responsibilities / Comments: Identifies, prioritizes, and implements test cases

Responsibilities:

  • Generate test plan
  • Generate test model
  • Evaluate effectiveness of test effort
Worker: Tester
Minimum Resources Recommended: 4 (provided by WebNewsOnLine)
Specific Responsibilities / Comments: Executes the tests

Responsibilities:

  • Execute tests
  • Log results
  • Recover from errors
  • Document change requests
Worker: Test System Administrator
Minimum Resources Recommended: 1
Specific Responsibilities / Comments: Ensures the test environment and assets are managed and maintained

Responsibilities:

  • Administer test management system
  • Install / manage worker access to test systems
Worker: Database Administrator / Database Manager
Minimum Resources Recommended: 1 (provided by WebNewsOnLine)
Specific Responsibilities / Comments: Ensures the test data (database) environment and assets are managed and maintained

Responsibilities:

  • Administer test data (database)
Worker: Designer
Minimum Resources Recommended: 2
Specific Responsibilities / Comments: Identifies and defines the operations, attributes, and associations of the test classes

Responsibilities:

  • Identifies and defines the test class(es)
  • Identifies and defines the test packages
Worker: Implementer
Minimum Resources Recommended: 4
Specific Responsibilities / Comments: Implements and unit tests the test classes and test packages

Responsibilities:

  • Creates the test classes and packages implemented in the test model.

System

The following table sets forth the system resources for the testing project.

The specific elements of the test system are not fully known at this time. It is recommended that the system simulate the production environment, scaling down the accesses and database sizes if / where appropriate.

System Resources

Database Server
  Network/Subnet: TBD
  Server Name: TBD
  Database Name: TBD

Client Test PCs (including any special configuration requirements): TBD

Test Repository
  Network/Subnet: TBD
  Server Name: TBD

Test Development PCs: TBD

Project Milestones

Milestone Task     Effort    Start Date    End Date
Plan Test
Design Test
Implement Test
Execute Test
Evaluate Test

Deliverables

Test Model

For each test executed, a test result form will be created. This shall include the name or ID of the test, the use case or supplemental specification to which the test relates, the date of the test, the ID of the tester, required pre-test conditions, and results of the test.

Test Logs

Microsoft Word will be used to record and report test results.

Defect Reports

Defects will be recorded using the Project HomePage via the Web.

Appendix A: Project Tasks

Below are the test related tasks:

Plan Test
  Identify Requirements for Test
  Assess Risk
  Develop Test Strategy
  Identify Test Resources
  Create Schedule
  Generate Test Plan

Design Test
  Workload Analysis
  Identify and Describe Test Cases
  Identify and Structure Test Procedures
  Review and Assess Test Coverage

Implement Test
  Record or Program Test Scripts
  Identify Test-Specific Functionality in the Design and Implementation Model
  Establish External Data Sets

Execute Test
  Execute Test Procedures
  Evaluate Execution of Test
  Recover from Halted Test
  Verify the Results
  Investigate Unexpected Results
  Log Defects

Evaluate Test
  Evaluate Test-Case Coverage
  Evaluate Code Coverage
  Analyze Defects
  Determine if Test Completion Criteria and Success Criteria Have Been Achieved

 

 
