Automation
This checklist can be tailored to the needs and expectations unique to each project and is used to complete a quality
review of a requirements work product. Problems, errors, or non-conformance items identified during the review should
be documented on a Peer Review Feedback Form. Please reference the Quality Management Plan for additional information
on the Peer Review Process.
1.0 Documentation
1.1 Typos - Typographical errors
1.1.1 Did the document successfully pass spelling & grammar review?
- [Please enter other criteria specific to your project]
1.2 Form and Format - Incorrect or inconsistent paragraph titling, indentation, list identifiers, type style, writing style, and location of information are among the form and format problems covered by this type.
1.2.1 Does the cover page or document control log have the correct information?
1.2.2 Is the version number properly utilized (e.g., Draft = Release_Draft #)?
1.2.3 Do the drafts have the version number correctly incremented from the last document?
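Where a project adopts a version scheme like the one in item 1.2.2, the draft-increment check in item 1.2.3 can be scripted. A minimal sketch, assuming a hypothetical "<release>_Draft <n>" string format; the real scheme is project-specific:

```python
import re

# Hypothetical version strings following a "Release_Draft #" pattern;
# the exact format should come from the project's own standards.
DRAFT_RE = re.compile(r"^(\d+\.\d+)_Draft (\d+)$")

def next_draft(version):
    """Increment the draft number, e.g. '1.0_Draft 2' -> '1.0_Draft 3'."""
    m = DRAFT_RE.match(version)
    if not m:
        raise ValueError(f"Unrecognized draft version: {version!r}")
    release, draft = m.groups()
    return f"{release}_Draft {int(draft) + 1}"

print(next_draft("1.0_Draft 2"))  # -> 1.0_Draft 3
```

A reviewer can compare the current document's version string against `next_draft` applied to the previous one to confirm the increment is correct.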
1.2.4 Do the dates on the document match (e.g., cover page and header/footers)?
1.2.5 Does the document show where it is maintained (document repository and versioned)?
1.2.6 Was the correct template used to create the document?
1.2.7 Does the document contain all the information required by the template, with any omitted or added sections approved beforehand?
1.2.8 Is the Table of Contents accurate and correctly formatted?
1.2.9 Are header/footers consistent throughout the document?
1.2.10 Is the document consistent in style and tone?
- [Please enter other criteria specific to your project]
1.3 Content - This is the first type for which conformance crosses into the subjective realm. Obvious omissions of information are easy to record; less obvious is a lack of sufficient detail to make the document understandable and usable. This is a common problem type for which the solution is not simple.
2.0 Requirements - Requirements state what functions a system is expected to perform and at what level of performance those functions are desired. Good requirements are specific, measurable, and attainable within the time given.
2.1 Interface - Problems of this type encompass hardware-to-software, software-to-software, and man-to-machine (also referred to as user-system) interface requirements.
2.3.4 Do the requirements specify the level of difficulty it should take for a user to complete a task (e.g., the user should be able to return to the home page from any other page on the site)?
2.3.5 Are the requirements documented as individual testable items? Note: Avoid long narrative paragraphs containing more than one requirement.
2.3.6 Does a requirements statement use “and/or” or “etc.”? This suggests that several requirements have been combined into one.
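The check in item 2.3.6 lends itself to a simple automated scan. A minimal sketch, using hypothetical requirement statements; only the two indicator phrases named in the item are flagged:

```python
import re

# Phrases from item 2.3.6 that suggest several requirements
# have been combined into one statement.
COMPOUND_INDICATORS = re.compile(r"\b(and/or|etc\.)", re.IGNORECASE)

def flag_compound_requirements(requirements):
    """Return the statements that may bundle multiple requirements."""
    return [r for r in requirements if COMPOUND_INDICATORS.search(r)]

reqs = [
    "The user shall be able to reset and/or change the password.",
    "The system shall log failed login attempts.",
    "The report shall include totals, averages, etc.",
]
print(flag_compound_requirements(reqs))
```

A project could extend the indicator list (e.g., "as well as", "also") to match its own review standards; the two phrases above are simply the ones this checklist names.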
2.3.7 Does each requirement contain a subject and predicate, where the subject is a user type or the system under discussion and the predicate is a condition, action, or intended result?
2.3.8 Does the whole requirement specify a desired end goal or result?
2.3.9 Does each requirement form a complete sentence (not a bulleted list of buzzwords, lists of acronyms, or sound bites on a slide)?
Automation Project Checklist
1.1.1 Automation Tool is available with a valid license for all Team members
1.2.1 Project Folder is created within the repository for the application being automated
1.2.3 Approval for loading the scripts in the repository has been received from the Team Lead or a resource designated by the Team Lead
1.2.4 QTP and/or other add-ins are loaded in Quality Center or repository
1.3.7 Has the manual time for executing the Regression suite been documented? (Data prep and navigation sequence are consistent.)
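The manual execution time documented for item 1.3.7 is commonly used to estimate when automation pays back its build cost. A minimal sketch with purely illustrative figures; every number below is an assumption, not a value from this checklist:

```python
# Hypothetical effort figures, all in minutes (illustrative only).
manual_minutes_per_run = 480      # documented manual regression time
automated_minutes_per_run = 60    # unattended automated execution time
build_cost_minutes = 2400         # one-time scripting effort
maintenance_minutes_per_run = 15  # script upkeep per regression cycle

# Net time saved each time the automated suite replaces a manual run.
saved_per_run = (manual_minutes_per_run
                 - automated_minutes_per_run
                 - maintenance_minutes_per_run)

# Ceiling division: number of runs before savings cover the build cost.
runs_to_break_even = -(-build_cost_minutes // saved_per_run)

print(f"Minutes saved per run: {saved_per_run}")
print(f"Regression runs to break even: {runs_to_break_even}")
```

With these sample numbers the suite breaks even after six regression runs; substituting the project's documented times gives the real estimate.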
1.4.1 Have time sheet tasks been added for all resources?
1.5.3 Does the automation team have sufficient time to create the scripts?
1.5.4 If test data is not available for the application, has the automation team requested the name of the contact person from whom the data can be requested?
1.5.9 Have schedules for maintenance, installs, etc. been requested from the application team? Planned application downtime should be communicated to every team member and saved on the shared drive within the application folder.
1.5.10 Does the automation team have contact information for the application's defect resolution team?
1.5.12 Is the database accessible from the location where scripts are created?
1.5.13 Has the automation resource followed the correct naming conventions for the test scripts?
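The naming-convention check in item 1.5.13 can also be automated once the convention is written down. A minimal sketch, assuming a hypothetical "<APP>_<Module>_<NNN>.py" convention; the actual rule comes from the project's standards, not from this checklist:

```python
import re

# Hypothetical convention: uppercase app code, module name, and a
# three-digit sequence number, e.g. CRM_Login_001.py.
NAMING_RULE = re.compile(r"^[A-Z]+_[A-Za-z]+_\d{3}\.py$")

def check_script_names(names):
    """Partition script file names into conforming and non-conforming."""
    ok = [n for n in names if NAMING_RULE.match(n)]
    bad = [n for n in names if not NAMING_RULE.match(n)]
    return ok, bad

ok, bad = check_script_names(["CRM_Login_001.py", "login-test.py"])
print("Conforming:", ok)
print("Non-conforming:", bad)
```

Running such a check over the script folder named in item 1.5.14 catches convention drift before scripts are loaded into the repository.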
1.5.14 Has the automation resource stored the automated test scripts in the correct path?
1.5.15 Are scripts stored in Quality Center, and has the QC Administrator been informed?
2.1.4 Have the contact persons been identified to resolve build instability issues?
3.1 Has the automation team shared information on how the scripts are created, the overall framework of the scripts, and how the scripts are maintained?
3.2 As part of Hand-off, have two documents been delivered: 1) documentation of the automation scripts, and 2) a job aid to run the scripts?
3.3 Has training been conducted to ensure the application team is able to maintain the automated test scripts?