CWS Checklist for QA responsibilities
It is difficult to identify all activities needed to approve a CWS at high quality, so it is easy to overlook small items, which can then easily be followed by a regression in the Master Workspace (MWS) builds. This list of action items should help avoid such regressions.
- Step 1 : Check common CWS requirement (Checklist in flowchart of 'CWS Workflow for QA-Representative')
- Step 2 : Check general CWS health (Avoid wasted effort of other stakeholders)
- Step 3 : Inform issue owners and wait for feedback
- Step 4 : Collect and evaluate Feedback
- Step 5 : Nominate or reject CWS
- Further questions
- Helpful Links
Step 1 : Check common CWS requirement (Checklist in flowchart of 'CWS Workflow for QA-Representative')
Check that the release target of the CWS is set correctly.
Check that the due dates are set correctly (plan on average at least 5 working days for QA testing).
The build version should not be too old compared to the current master (e.g. 1-2 builds behind if many CWSs were integrated; 3-5 builds behind if only a few CWSs were integrated, as during a release phase; the latest build if it contains important changes).
Appropriate information about specific and global risks must be provided, in the comments section of the EIS status page and/or in one or more QA child tasks.
All issues must have the status 'Resolved/Fixed'.
All issues must be assigned to a QA member (except those only developers can verify).
Issues that can only be tested by a developer must have the status 'Verified' (set by another developer!).
For all new features a FINAL specification must exist.
For all features and changes a Change-Mail must exist (see http://www.openoffice.org/servlets/SummarizeList?listName=allfeatures).
Issues submitted by developers must provide a test case to ensure appropriate verification.
All required install sets must exist (at least one Windows and one Unix/Linux build, both of which must be product builds).
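The Step 1 gate can be sketched as a small script that reports every checklist item a CWS still fails. This is only an illustrative model: the data classes, field names, and status strings below are assumptions for the sketch, not part of EIS or its actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Issue:
    status: str              # e.g. "Resolved/Fixed" or "Verified"
    assigned_to_qa: bool     # assigned to a QA member?
    developer_only: bool     # can only a developer verify it?

@dataclass
class CWS:
    release_target_set: bool
    qa_days_planned: int          # working days planned for QA testing
    builds_behind_master: int     # how many master builds behind
    many_cws_integrated: bool     # were many CWSs integrated recently?
    risk_info_provided: bool
    issues: list = field(default_factory=list)
    install_sets: set = field(default_factory=set)  # e.g. {"windows", "unix"}

def step1_problems(cws: CWS) -> list:
    """Return the Step 1 checklist items this CWS fails (empty list = gate passed)."""
    problems = []
    if not cws.release_target_set:
        problems.append("release target not set correctly")
    if cws.qa_days_planned < 5:
        problems.append("less than 5 working days planned for QA")
    # Freshness rule from the checklist: 1-2 builds behind if many CWSs
    # were integrated, 3-5 if only a few (release phase).
    limit = 2 if cws.many_cws_integrated else 5
    if cws.builds_behind_master > limit:
        problems.append("build version too old compared to current master")
    if not cws.risk_info_provided:
        problems.append("no risk information in EIS comments or QA child tasks")
    for i, issue in enumerate(cws.issues):
        if issue.developer_only:
            if issue.status != "Verified":
                problems.append(f"developer-only issue {i} not Verified")
        else:
            if issue.status != "Resolved/Fixed":
                problems.append(f"issue {i} not Resolved/Fixed")
            if not issue.assigned_to_qa:
                problems.append(f"issue {i} not assigned to a QA member")
    if not {"windows", "unix"} <= cws.install_sets:
        problems.append("missing a Windows or Unix/Linux product install set")
    return problems
```

In practice these checks are done by hand against the EIS status page; the sketch just makes the pass/fail logic of each item explicit.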
Step 2 : Check general CWS health (Avoid wasted effort of other stakeholders)
Run the required automated tests on both product builds – Windows and Unix/Linux – (first.bas and topten.bas from framework/first)
Run additional testing on at least one platform (if necessary)
Run additional tests for specific testing areas or Update-Test for specific applications
Evaluate test results against the Master Workspace (MWS)
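Evaluating the CWS results against the MWS amounts to diffing pass/fail outcomes: a test that passes on the master but fails on the CWS build is a candidate regression. A minimal sketch (the result format is assumed for illustration, not the TestTool's actual output):

```python
def find_regressions(mws_results: dict, cws_results: dict) -> list:
    """Tests that pass on the master (MWS) but fail on the CWS build.

    Results map test name -> True (pass) / False (fail).
    Only tests that were run on both builds are compared.
    """
    return sorted(
        name
        for name, passed in mws_results.items()
        if passed and cws_results.get(name) is False
    )

# Hypothetical results from both builds:
mws = {"first.bas": True, "topten.bas": True, "writer_update": False}
cws = {"first.bas": True, "topten.bas": False, "writer_update": False}
```

Tests already failing on the MWS (like `writer_update` above) are excluded, since the CWS cannot be blamed for a pre-existing master failure.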
Step 3 : Inform issue owners and wait for feedback
Step 4 : Collect and evaluate Feedback
Report obstacles in the comments section of the EIS status page
Step 5 : Nominate or reject CWS
If the CWS must be rejected, a comment with the reason for the rejection must be added in EIS.
Further questions
Questions should be asked and discussed in firstname.lastname@example.org.
Helpful Links
EIS - automatic guest login - : http://eis.services.openoffice.org/EIS2/GuestLogon
EIS - general user login - : http://eis.services.openoffice.org/EIS2/Logon
Specification process : http://wiki.services.openoffice.org/wiki/Category:Specification