We are trying to come up with different statuses that could be used to report the status of an automated test in Quality Center. The objective is to use more meaningful statuses.
For example, when an automated test fails but triage shows it is not actually a functional issue and it failed due to a problem with the script, we could use a status like "Passed with automation error".
Any input on the possible test statuses used in QC would be highly appreciated.
It is not recommended to change the existing, pre-defined test statuses, because QC can use them in many ways: updating requirement coverage and sending mails, to name a few.
Depending on your reporting needs and techniques, you could add a user-defined field (UDF) to the run and/or test instance entities to store an additional status. Then manually change the actual status to "Passed" and use the UDF(s) to mark the test instance and/or the run as an "automation failure".
We had a similar request, although for "No Run" tests: we needed to distinguish tests that could not be run because of the test environment from tests that had not yet been run. We added a UDF to the test instance entity so users could mark the tests they could not run because of the environment.
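The triage logic behind this approach can be sketched in a few lines. This is a minimal illustration, not QC code: the function name, parameters, and the UDF value "Automation failure" are all hypothetical, and in practice you would apply the result through the OTA or REST API.

```python
# Sketch of the triage mapping described above: keep QC's built-in
# statuses untouched and record the triage outcome in a hypothetical
# user-defined field (UDF). All names here are illustrative.

def triage(raw_result, script_defect):
    """Return (qc_status, udf_value) for a finished automated run.

    raw_result    -- "Passed" or "Failed" as reported by the automation tool
    script_defect -- True if triage showed the failure was a script problem,
                     not a functional defect in the application under test
    """
    if raw_result == "Passed":
        return ("Passed", None)
    if script_defect:
        # Functionally fine: report Passed, but flag the automation error
        # in the UDF so reports can still find these runs.
        return ("Passed", "Automation failure")
    return ("Failed", None)

print(triage("Failed", script_defect=True))
```

Keeping the built-in status "Passed" means requirement coverage and mails still behave as QC expects, while the UDF preserves the triage information for reporting.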
Just for the "No Run" tests, we have something like:
- Function not available (was not delivered in that build)
- Test Data not available (some test sets are run against a copy of production data: some test cases may be absent, or the data needed for a test was supposed to be created by a previous test, but that test failed)
- Flow not available (some tests require a data flow to be exchanged between two or more "applications": not all test environments are set up with every application and configured so those applications can communicate)
- Abandoned (not enough time and not important)
- Pre-requisite missing (apart from function, data and flow)
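A list like this pays off in reporting: once the reason sits in a UDF, a breakdown of "No Run" tests per reason is trivial to compute. The sketch below is illustrative only; the data shape and field values are assumptions, not a QC API.

```python
from collections import Counter

# Illustrative values for a hypothetical "No Run reason" UDF,
# mirroring the list above.
NO_RUN_REASONS = [
    "Function not available",
    "Test Data not available",
    "Flow not available",
    "Abandoned",
    "Pre-requisite missing",
]

def no_run_breakdown(instances):
    """Count 'No Run' test instances per reason.

    instances -- iterable of (status, reason) pairs, where reason is the
                 UDF value (None when the field was left empty).
    """
    return dict(Counter(
        reason or "(no reason given)"
        for status, reason in instances
        if status == "No Run"
    ))
```

Empty UDF values are grouped under a catch-all bucket so they stay visible in the report instead of silently disappearing.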
Now, if you are not sure which items to put in a list, you can set up the list with minimal contents and no verification: users will then be able to enter new values. Monitor the values they enter and, after a while, decide which items need to be in the list, update the list items, and mark the field as "Verify". You may need to revisit the existing data to make it consistent with the new list contents.
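That last clean-up step can be scripted: once the final list is decided, fold the free-text values users typed into the canonical items before switching verification on. The mapping below is invented for illustration; you would build it from the actual values observed in your project.

```python
# Hypothetical mapping from observed free-text entries to the final,
# canonical list items (keys are compared case-insensitively).
CANONICAL = {
    "abandonned": "Abandoned",
    "abandoned": "Abandoned",
    "no test data": "Test Data not available",
    "test data not available": "Test Data not available",
}

def normalize(value):
    """Map a user-typed value onto the canonical list, or flag it for review."""
    key = value.strip().lower()
    return CANONICAL.get(key, f"REVIEW: {value.strip()}")
```

Values that do not match anything are flagged rather than guessed at, so a human can decide whether they deserve a new list item.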