Activity: Define Test Approach
Purpose: | To consider the influence of the mission, test motivators and the test items on the approach for the forthcoming test effort. |
Using the evaluation mission as context, examine the Iteration Test Plan and study the test motivators that have been identified for the forthcoming test effort. It may be necessary to investigate further at the source of each motivator; the Iteration Plan usually provides a means of locating additional information.
For each motivator, consider what test approach and associated techniques might be required to address it. Also examine the Iteration Test Plan and study the test items. Each targeted test item should be considered in relation to each motivator, and the approach and techniques extended accordingly. If you cannot find much detail about the test items, or you are unfamiliar with them, it may be useful to discuss the targeted items with the development staff, usually starting with the software architect or the development team leads.
Focus on identifying the minimal set of techniques necessary to satisfactorily address the evaluation mission and motivators. Look for opportunities where one technique can be used to address more than one aspect of the required testing. Note other potential techniques that seem interesting to explore, but be able to identify these as additional rather than essential.
Purpose: | To consider the influence of the software architecture on the test approach. |
Study the Software Architecture to gain an understanding of its key elements: mechanisms, main views, and so forth. Typically the Software Architecture Document provides good information focused at the right level of detail for use in considering a test approach. To clarify its information, or in the absence of such a document, it is useful to discuss the architecture with the development staff, usually by talking to the software architect directly or to one of the development team leads.
Focus on identifying and discussing the key mechanisms, and gaining a good understanding of these aspects of the system. Each mechanism and key feature of the architecture will likely present challenges or constraints for the test effort. For example, a distributed architecture may necessitate organizing the test team into sub-teams, each team targeting an architectural tier. While creative techniques can often be used to overcome these challenges, it may be necessary to have the development team modify the software to enable testing as discussed in Activity: Define Testability Elements.
Purpose: | To consider the completeness of the test approach both in terms of breadth and depth. |
Considering all the details that are now known about the requirements on the test approach, it is beneficial to step back and consider the test approach from a higher-level perspective. What things does the test approach not address that it should? Are there any concerns that should be explored that don't appear in any of the documented information?
Based on your experience, review the requirements for the test approach for appropriate breadth and depth for this stage in the project lifecycle. Consider additional requirements that will help to present a more complete approach.
Purpose: | To reuse or adapt from existing proven test techniques, where appropriate. |
From your own experience, or other experience you have access to, identify existing techniques that will either meet the requirements of the test approach, or can be adapted to meet them.
Purpose: | To identify the techniques required to provide a comprehensive and sufficient test approach. |
It's not terribly useful to think in terms of a "complete" test approach: there are always additional techniques you might try if only you had limitless time and resources.
However, it is important that the test approach is well-rounded and comprehensive enough to allow a useful evaluation of perceived quality to be made. This requires an approach that evaluates sufficient aspects of quality risk or dimensions of quality for the project team to assess perceived quality with a justified degree of confidence.
Purpose: | To outline the workings of each technique, including the objective of the testing it supports. |
Outline the workings of each technique. Address the type of testing it supports, the objective and scope, implementation method, test oracles, assessment method and automation needs of the technique.
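One lightweight way to capture such an outline is as a structured record whose fields mirror the attributes listed above. The Python sketch below is purely illustrative; the field names and the example values are assumptions, not a prescribed format.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TestTechniqueOutline:
    """Captures the outline of a single test technique."""
    name: str
    test_types_supported: List[str]   # e.g. function, performance, security
    objective: str                    # the value the technique provides
    scope: str                        # which test items it targets
    implementation_method: str        # how the tests will be realized
    test_oracles: List[str]           # how outcomes will be judged
    assessment_method: str            # how results feed the evaluation mission
    automation_needs: List[str] = field(default_factory=list)

# Hypothetical example entry, for illustration only:
login_load = TestTechniqueOutline(
    name="Login response-time testing",
    test_types_supported=["performance", "load"],
    objective="Confirm login stays responsive under expected peak load",
    scope="Authentication services exposed to end users",
    implementation_method="Scripted virtual users replaying recorded login sessions",
    test_oracles=["response-time threshold", "server error rate"],
    assessment_method="Compare 95th-percentile response time against the agreed target",
    automation_needs=["load generation tool", "results collation script"],
)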
In many cases you'll reuse techniques from one project to the next. In this situation you can simply reference a common definition of the technique, or copy the existing definition and revise it as appropriate.
Many techniques will support more than one type of testing, so give some thought to identifying which tests the technique will need to support. This helps to identify the scope of the effort required if the technique is being defined for the first time.
Give thought to the underlying objective and value this technique represents.
Define how the technique will be implemented. It's not good enough to simply state "We're doing system performance testing"; you need to give serious thought to how that can be achieved.
Some techniques you would like to use will be uneconomic to pursue. By describing briefly how you will approach implementing this technique, you'll be able to get an overall sense of the logistics involved and the practicalities of pursuing the technique further.
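For example, a first rough implementation sketch for a response-time check might look like the following Python fragment; the endpoint, sample count, and threshold are illustrative assumptions only, intended to show the level of concreteness worth reaching at this stage.

import time
import statistics
import urllib.request

# Hypothetical target and acceptance criteria; adjust to the system under test.
TARGET_URL = "http://localhost:8080/login"
SAMPLES = 20
MAX_P95_SECONDS = 2.0

def measure_response_times(url: str, samples: int) -> list:
    """Issue repeated requests and record the elapsed time for each."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    times = measure_response_times(TARGET_URL, SAMPLES)
    p95 = statistics.quantiles(times, n=20)[18]  # approximate 95th percentile
    print(f"95th percentile: {p95:.3f}s (limit {MAX_P95_SECONDS}s)")
    assert p95 <= MAX_P95_SECONDS, "Response time exceeds the agreed threshold"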
Determine how you will observe and evaluate the outcomes of each test implemented using this technique. Give thought to the different Test Oracles that are available for you to use: is there a single oracle, or are there different ways you can determine the outcome of each test?
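As a sketch of how a single test might consult more than one oracle, the Python example below checks a result against both a golden file of previously approved values and an independently coded reference calculation; both oracles named here are hypothetical.

import json
from pathlib import Path

def reference_tax(amount: float) -> float:
    # Hypothetical reference implementation used as an independent oracle.
    return round(amount * 0.20, 2)

def check_with_oracles(amount: float, actual: float) -> None:
    # Oracle 1: previously approved results captured in a golden file, if one exists.
    golden_path = Path("golden_results.json")  # hypothetical approved-results file
    if golden_path.exists():
        golden = json.loads(golden_path.read_text())
        expected = golden.get(str(amount))
        if expected is not None:
            assert actual == expected, "Result differs from the approved golden value"
    # Oracle 2: an independently coded reference calculation.
    assert actual == reference_tax(amount), "Result differs from the reference calculation"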
Automation can play an important role in many test techniques. In some cases it will be less sophisticated, simply providing support for conducting manual tests.
Give some thought to how the work involving the technique could be most efficiently implemented, maintained, and managed. Be open-minded: think both broad and deep, considering as many options as possible.
Identify the appropriate tools to use with this test technique. Use the work from the previous step that identified uses of automation.
Remember to consider a broad range of tool categories; your list of candidate tools should include more than just test execution automation tools. In addition to tools that automate test execution, consider tools that will enhance the productivity of the test team by reducing repetitive, laborious tasks, such as Test Data management, Test Results analysis, and incident and Change Request reporting tools.
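As a small illustration of this kind of productivity support, the Python sketch below tallies verdicts from a hypothetical test-results export; the file name and CSV layout it expects are assumptions.

import csv
from collections import Counter

def summarize_results(results_csv: str) -> Counter:
    """Tally test verdicts from a results file with 'test_id' and 'verdict' columns."""
    verdicts = Counter()
    with open(results_csv, newline="") as handle:
        for row in csv.DictReader(handle):
            verdicts[row["verdict"].strip().lower()] += 1
    return verdicts

if __name__ == "__main__":
    summary = summarize_results("test_results.csv")  # hypothetical results export
    total = sum(summary.values())
    for verdict, count in summary.most_common():
        print(f"{verdict}: {count} ({count / total:.0%})")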
Purpose: | To define a candidate architecture for the test automation system. |
Based on experience gained from similar systems or in similar problem domains, begin to define a candidate architecture for the test automation system.
We recommend you review the information at the following link to help you with this task: Workflow Detail: Define a Candidate Architecture.
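One candidate shape that often emerges from such experience is a layered architecture: test cases call reusable business-level tasks, which in turn call a thin driver that isolates tool- and interface-specific details. The Python sketch below uses hypothetical names to illustrate the layering only, not a prescribed design.

class ApplicationDriver:
    """Lowest layer: wraps the interface to the system under test (GUI, API, and so on)."""
    def submit(self, endpoint: str, payload: dict) -> dict:
        # In a real system this would delegate to a GUI- or protocol-level tool.
        raise NotImplementedError

class BusinessTasks:
    """Middle layer: expresses reusable, domain-level actions."""
    def __init__(self, driver: ApplicationDriver) -> None:
        self.driver = driver

    def register_customer(self, name: str) -> dict:
        return self.driver.submit("customers", {"name": name})

class RegistrationTests:
    """Top layer: the tests themselves, free of tool-specific detail."""
    def __init__(self, tasks: BusinessTasks) -> None:
        self.tasks = tasks

    def test_new_customer_gets_id(self) -> None:
        result = self.tasks.register_customer("Ada")
        assert "customer_id" in result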
Purpose: | To consider what requirements test will have for configuration management. |
Like many other artifacts produced during a software development project, test assets are candidates for configuration management and version control.
The specific requirements can range in complexity from simply having basic backup and recovery services enabled, to having full support for parallel development of automated Test Scripts at multiple sites against different versions of an application.
Give thought to your requirements for configuration management, and begin to outline probable logistic needs to realize those requirements.
Purpose: | To reduce risk and effort by reusing existing proven assets. |
Sometimes it makes sense to build assets from scratch, and sometimes it doesn't. Try to find a good balance between a complete "roll-your-own" philosophy and establishing a rigid and bureaucratic librarian policy on new artifact creation.
There are times when one approach is better than the other, and you should be flexible enough to take advantage of the benefits that both approaches bring.
Purpose: | To record the important information about the test approach. |
Depending on a number of factors including team size and organization culture, there will be better and worse ways to record the decisions you've made about the test approach.
You will typically have two audiences to consider: the management team will want to review this information to provide approval and to be aware of the logistics implications of the approach, and the test team will want to use the test approach as guidance for the work they undertake. Try to find an appropriate medium to suitably address both needs: perhaps a project intranet Web site.
Purpose: | To verify that the activity has been completed appropriately and that the resulting artifacts are acceptable. |
Now that you have completed the work, it is beneficial to verify that the work was of sufficient value, and that you did not simply consume vast quantities of paper. You should evaluate whether your work is of appropriate quality, and that it is complete enough to be useful to those team members who will make subsequent use of it as input to their work. Where possible, use the checklists provided in RUP to verify that quality and completeness are "good enough".
Have the people performing the downstream activities that rely on your work as input take part in reviewing your interim work. Do this while you still have time available to take action to address their concerns. You should also evaluate your work against the key input artifacts to make sure you have represented them accurately and sufficiently. It may be useful to have the author of the input artifact review your work on this basis.
Try to remember that RUP is an iterative process and that in many cases artifacts evolve over time. As such, it is not usually necessary (and is often counterproductive) to fully form an artifact that will only be partially used, or will not be used at all, in immediately subsequent work. This is because there is a high probability that the situation surrounding the artifact will change (and the assumptions made when the artifact was created will be proven incorrect) before the artifact is used, resulting in wasted effort and costly rework. Also avoid the trap of spending too many cycles on presentation to the detriment of content value. In project environments where presentation has importance and economic value as a project deliverable, you might want to consider using an administrative resource to perform presentation tasks.