Framework for BPT
I have been thinking about BPT lately. I can understand all the claims HP makes about the tool, such as the component-driven approach, BAs being able to design their own test cases, less maintenance required, and so on. However, there is one area that remains a concern for me: Test Data Management.
I do understand that test data can be fed into these components/BPTs directly through the QC interface. However, that is not something you would want to do: the UI can be confusing, and the test data is not persistent (you may have to overwrite the test data of a BPT when executing it for a different situation or combination of test data). So you may want to store the test data in an Excel file and let the components pick the data up from these files. Companies such as TurnKey do this by creating test data files (.xls) that mirror the structure and parameters of a BPT.
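To make the idea concrete, here is a minimal sketch of what "a data file whose columns mirror a BPT's component parameters" could look like and how a driver might read it. The file layout, column names, and function name are all hypothetical, and CSV is used instead of .xls purely to keep the example dependency-free:

```python
import csv
import io

# Hypothetical data file for a BPT: the header row lists the
# component parameter names, and each later row is one
# situation/combination of test data to run the BPT with.
SAMPLE_DATA_FILE = """CustomerName,OrderQty
Acme,5
Globex,12
"""

def load_bpt_data(stream):
    """Return the test-data rows as dicts keyed by parameter name,
    so a driver can feed one row per BPT iteration."""
    return list(csv.DictReader(stream))

rows = load_bpt_data(io.StringIO(SAMPLE_DATA_FILE))
for row in rows:
    # In a real framework these values would be passed into the
    # component's parameters instead of printed.
    print(row["CustomerName"], row["OrderQty"])
```

Because each row is a complete combination of parameter values, nothing in QC has to be overwritten to run the same BPT against a different data set.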
With this, another problem creeps in: maintenance.
Here is a simple example. If I have to add a parameter to a particular component in a BPT, I can add it at the appropriate place in QC and be assured, without any further work, that the change will be reflected in all the BPTs that call the component.
Now, with external test data files in the picture, I may need to go through all the different .xls files containing the details of the different BPTs that call this particular component, which would be nothing short of a headache and would increase the maintenance effort.
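One way to take the sting out of this is to script the change instead of editing each file by hand. The sketch below shows the idea for a single file: append the new parameter as a column with a default value. The function name, column names, and CSV format are assumptions for illustration; a real version would loop over every data file that drives a BPT calling the changed component (and would work on .xls rather than CSV):

```python
import csv
import io

def add_parameter(file_text, param_name, default_value):
    """Return the data-file text with a new parameter column
    appended and filled with a default value in every row."""
    rows = list(csv.DictReader(io.StringIO(file_text)))
    fieldnames = list(rows[0].keys()) + [param_name]
    for row in rows:
        row[param_name] = default_value
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

BEFORE = "CustomerName,OrderQty\nAcme,5\n"
AFTER = add_parameter(BEFORE, "Currency", "USD")
```

Running one such script over a directory of data files keeps the external files in step with a component change in QC, so the maintenance cost grows much more slowly than hand-editing would.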
So my basic question is: is there any approach out there for BPT which meets both of these requirements?
1) Test data is stored in a medium that is easy to fill in
2) The maintenance effort is low
Re: Framework for BPT
We rewrote the out-of-the-box keywords to parse their inputs, which allowed us to calculate input data at runtime to generate relative dates, random strings, random numbers, etc., or to call into a database to retrieve data.
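The parsing step above might look something like the following sketch. The token syntax ({TODAY+n}, {RANDSTR:n}, {RANDNUM:a-b}) is entirely hypothetical, invented here to illustrate the technique; the original framework's keywords would have been VBScript inside QTP, not Python:

```python
import datetime
import random
import re
import string

def resolve(value):
    """Expand hypothetical data tokens at runtime:
    {TODAY+n}     -> today's date shifted by n days (ISO format)
    {RANDSTR:n}   -> a random string of n uppercase letters
    {RANDNUM:a-b} -> a random integer between a and b inclusive
    Any other value passes through unchanged."""
    m = re.fullmatch(r"\{TODAY([+-]\d+)?\}", value)
    if m:
        offset = int(m.group(1) or 0)
        return (datetime.date.today()
                + datetime.timedelta(days=offset)).isoformat()
    m = re.fullmatch(r"\{RANDSTR:(\d+)\}", value)
    if m:
        return "".join(random.choices(string.ascii_uppercase,
                                      k=int(m.group(1))))
    m = re.fullmatch(r"\{RANDNUM:(\d+)-(\d+)\}", value)
    if m:
        return str(random.randint(int(m.group(1)), int(m.group(2))))
    return value
```

With a resolver like this sitting in front of every keyword, a data file can hold stable tokens instead of literal values, so dates never go stale and unique strings never collide between runs.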
Components were parametrised for maximum flexibility, and the parameter values were either set appropriately in Test Plan or made runtime parameters and set in Test Lab. We tried to minimise the number of runtime parameters to simplify the data set-up process as much as possible for the execution team. To further minimise data set-up, where practical, we included data generation in the script itself.