Rational Functional Tester: Reviewing Automated Tests
Posted by Matt Archer on October 28, 2008
This post is part 4 of a series of 5 posts about test automation with Rational Functional Tester. I originally created the material as a white paper in 2004 when I was coaching a project that was new to GUI-level automation and also to RFT. I recently discovered it during a clean up and whilst some of my ideas have changed over the past 4 years, I’m happy that most of it still holds up as good advice. Maybe one day I’ll update it.
Review New Scripts
At the end of each automated development iteration the test team must verify the quality of the automated test scripts that they have produced. The formality and thoroughness of this activity will depend on the scale and criticality of the automated testing solution being produced. Typically, the larger and more important the automated testing solution is, the greater the effort that should be applied to this activity. When a test team is new to automated testing, the most effective way of reviewing automated test scripts is to perform peer-reviews with the support of automated test script checklists. A checklist is a simple list of questions specifically chosen to help the reviewer think about different aspects of an automated test script and whether it conforms to the agreed approach to automated test script development.
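To make the checklist idea concrete, here is a minimal sketch in Java (the language RFT scripts are written in). The questions are hypothetical examples of the kind of conformance checks a team might agree on, not taken from the original paper; the class simply records one yes/no answer per question and flags the script as non-conforming if any answer is no.

```java
import java.util.List;

// Hypothetical starter checklist for peer-reviewing automated test scripts.
// The questions below are illustrative examples of agreed conventions.
public class ScriptReviewChecklist {

    public static final List<String> QUESTIONS = List.of(
        "Does the script use the shared object map rather than private recognition properties?",
        "Are waits event-based rather than fixed sleeps?",
        "Does every verification point have a meaningful, unique name?",
        "Is common behaviour delegated to the shared helper library rather than duplicated?",
        "Does the script restore the application to a known state on failure?"
    );

    // A reviewer records one boolean answer per question; any 'false'
    // marks the script as non-conforming to the agreed approach.
    public static boolean conforms(boolean[] answers) {
        if (answers.length != QUESTIONS.size()) {
            throw new IllegalArgumentException("one answer per question expected");
        }
        for (boolean answer : answers) {
            if (!answer) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Example review: a fixed sleep was found, so question 2 fails.
        boolean[] review = {true, false, true, true, true};
        System.out.println("Script conforms: " + conforms(review));
    }
}
```

In practice the value is less in the code than in the agreed question list; keeping it short and specific makes reviews quick enough to run at the end of every iteration.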
Analyse Change & Issue Log
At the end of each automated development iteration the test team (specifically the common-code owner) must review the outstanding changes and issues associated with the automated testing solution. After removing any duplicates, the changes and issues should be analysed in terms of their importance to the automated testing solution and the amount of effort required to make the change or to investigate and fix the issue. Any changes made to the automated testing solution should be completed before the next automated development iteration is scoped.
Refine Test Automation Architecture
As a result of making any changes agreed during the ‘Analyse Change & Issue Log’ activity, it may be necessary to refine the test automation architecture document so that it reflects alterations made at the architectural level. This may include, for example, any new custom recognition or verification mechanisms identified to fix a test script issue.
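As a sketch of the kind of custom verification mechanism that might be promoted into the shared architecture, the hypothetical helper below compares expected and actual text while ignoring case and surrounding whitespace, a common fix when a GUI control pads or re-cases its value. In a real RFT solution this would typically live in shared common code and report its result through an RFT verification point rather than a plain boolean; the plain comparison is used here so the example stands alone.

```java
// Hypothetical custom verification mechanism of the kind a team might add
// to its shared common code after analysing recurring script issues.
public class TolerantTextVerifier {

    // Compare expected and actual text, ignoring case and surrounding
    // whitespace. Returns true when the values match under those rules.
    public static boolean verifyText(String expected, String actual) {
        if (expected == null || actual == null) {
            return expected == actual; // match only if both are null
        }
        return expected.trim().equalsIgnoreCase(actual.trim());
    }

    public static void main(String[] args) {
        // The padded, lower-cased value still passes the tolerant check.
        System.out.println(verifyText("Order Confirmed", "  order confirmed "));
    }
}
```

Once a mechanism like this exists, the architecture document should record when scripts are expected to use it instead of an exact-match verification, so reviewers can check for it with the script checklist.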