Linking data-driven cases to Requirements
We are plodding forward in our first project using Mercury Quality Center in conjunction with QTP. We've pretty much finished importing all of the requirements into the project, along with a sizeable set of test cases, and we are now in the process of linking those test cases to the requirements.
It's well understood by us that we can easily link any of these test cases, whether manual or automated through QTP, to any of the requirements. However, we already have an established automated framework that handles our field-level validation and other types of tests, and it is driven from a database. That is, the test cases come from a table, and the driver script creates a recordset object over that table to process all of the parameters and drive the scripts.
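To make the setup concrete, here is a minimal sketch of the kind of database-driven loop described above. The table name, column names, connection string, and the `RunFieldValidation` routine are all assumptions standing in for whatever our framework actually uses:

```vbscript
' Hypothetical driver loop: table, columns, and connection string are
' placeholders, and RunFieldValidation stands in for the framework's
' actual validation routine.
Dim conn, rs, result
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=TESTDB;Initial Catalog=QA;Integrated Security=SSPI"
Set rs = CreateObject("ADODB.Recordset")
rs.Open "SELECT TestCaseID, FieldName, InputValue, ExpectedResult FROM TestCases", conn

Do While Not rs.EOF
    ' Each row is one data-driven test case.
    result = RunFieldValidation(rs("FieldName"), rs("InputValue"), rs("ExpectedResult"))
    ' ...record the pass/fail result somewhere...
    rs.MoveNext
Loop

rs.Close
conn.Close
```

Each row of the table effectively *is* a test case, which is exactly why these cases have no one-to-one "script" counterpart in QC.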
What we are wondering is: what would be the suggested strategy for linking test cases that come from our database to requirements in QC, and for reporting their results (i.e. coverage) back into QC's database? In a similar vein, how would one suggest bringing test cases from our database into the Test Lab, since they don't exist as "scripts" in the physical way that QC expects them to?
My first inclination is that there must be some API we can hook into from the driver script: as it loops through the test cases, it could inject the data into QC via that API.
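For what it's worth, QC does expose a COM automation interface (the Open Test Architecture, or OTA, API), and when a QTP script is launched from QC, `QCUtil.QCConnection` hands back the live `TDConnection`. The sketch below shows roughly what posting a run status from the driver loop might look like; the test set name is an assumption, and the pass/fail value would come from our database results rather than being hard-coded:

```vbscript
' Sketch using the OTA API from inside a QTP driver script that was
' launched from QC. "Automated Regression" is an assumed test set name.
If QCUtil.IsConnected Then
    Set tdc = QCUtil.QCConnection

    ' Locate the target test set in the Test Lab.
    Set tsFactory = tdc.TestSetFactory
    Set tsFilter = tsFactory.Filter
    tsFilter.Filter("CY_CYCLE") = "Automated Regression"
    Set tsList = tsFilter.NewList
    Set testSet = tsList.Item(1)

    ' Walk the test instances in the set and post a run for each one;
    ' in practice the status would be looked up from the DB results.
    Set tstFactory = testSet.TSTestFactory
    Set tstList = tstFactory.NewList("")
    For Each tsTest In tstList
        Set runFactory = tsTest.RunFactory
        Set theRun = runFactory.AddItem("AutomatedRun")
        theRun.Status = "Passed" ' or "Failed", per the DB result
        theRun.Post
    Next
End If
```

The appeal of this route is that coverage rolls up through QC's normal requirement-to-test traceability, since each DB row would be represented by a placeholder test instance in the Test Lab whose runs are posted programmatically. I haven't verified every detail of this against our QC version, so treat it as a starting point rather than a recipe.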
Another way, probably more time-consuming, would be to create a manual test case in QC for each of the test cases represented in our database, then take the automated results and manually touch each instance in the Test Lab to record what the automated result was. Not ideal, I would think.
Have any QC folks out there been presented with a similar situation, and if so, in a general sense, how was it handled?