Functional Testing General Process - What is it and do I need training?
I just started a new role doing functional testing for a POS system. I haven't really been given any training or mentorship; I've just been handed a specification document and asked to create test cases. My questions are:
1. What is the general process for creating test cases? We are using Quality Center; are requirements supposed to be defined in there so that I then create test cases to "prove" those requirements work?
2. Is training required in this role? Is there formal training for functional testing or test case development?
Any information about functional testing, of POS systems or in general, or about test case development would be good. I'm just sort of flying by the seat of my pants here!
Joe, I have come across a nice document that helps with writing test cases effectively; you can go through it at Test Cases Guidelines.
You can get training in validation from any number of seminars. A Google search for software validation training brings up tons of sites, and there are also many books on the subject. If you're in a regulated environment, people are expected to have training and/or experience in their roles, and that training is documented. If you're not in a regulated environment but you have a quality system in place, you'd need to check what it says about training requirements. I've worked in both the regulated and non-regulated worlds doing software validation, and in both cases they wanted someone who knew how to test; it just wasn't mandated by a regulatory body in the latter case.
In order to write tests you should have requirements that you're going to test against. Tests are written to prove that the requirements are met, so for each requirement you will have test inputs (the steps you're going to take to perform the test) and expected results (what those steps should produce). Your actual result after performing the test steps should equal your expected result. You not only test that what should happen does happen; you also test that what shouldn't happen doesn't.
For example, if there's a requirement that you need to enter a valid username and password to enter the system, you'll have at least two tests: one to test what happens when you enter valid information, and one to test what happens when you enter invalid information.
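To make that concrete, here's a minimal sketch of that pair of tests in Python. The login function below is a made-up stand-in for the real system; in practice your test steps would drive the actual POS login screen, but the positive/negative structure is the same:

    import unittest

    # Hypothetical stand-in for the system under test; in a real POS
    # system these checks would go through the login screen or its API.
    VALID_USERS = {"cashier01": "CorrectPass!23"}

    def login(username, password):
        """Return True if the credentials are valid, False otherwise."""
        return VALID_USERS.get(username) == password

    class LoginRequirementTests(unittest.TestCase):
        # Requirement under test: a valid username and password are
        # required to enter the system.

        def test_valid_credentials_grant_access(self):
            # Positive case: what should happen does happen.
            self.assertTrue(login("cashier01", "CorrectPass!23"))

        def test_invalid_credentials_deny_access(self):
            # Negative case: what shouldn't happen doesn't.
            self.assertFalse(login("cashier01", "wrong-password"))

    if __name__ == "__main__":
        unittest.main()

The expected result is baked into each assertion, so a run of the suite is a direct comparison of actual versus expected for each requirement.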
The information provided above is good, but I would like to add a few more things to keep in mind whenever you are creating test cases:
1. Understand the requirement in detail
2. Try to see the requirement from the end user's perspective.
3. Put yourself at the end-user level and think about how they will use the application.
4. Use test case design techniques, e.g., Boundary Value Analysis, Equivalence Class Partitioning, etc. (there's a sketch of these after this list).
5. Think of all alternate flows and possible input values.
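To illustrate point 4, here's a small sketch of how Boundary Value Analysis and Equivalence Class Partitioning turn into concrete test inputs. It assumes a hypothetical quantity field that must accept integers from 1 to 100; the validator is defined inline just so the example runs:

    import unittest

    # Hypothetical field rule: quantity must be an integer from 1 to 100.
    def is_valid_quantity(qty):
        return isinstance(qty, int) and 1 <= qty <= 100

    class QuantityFieldTests(unittest.TestCase):
        def test_boundary_values(self):
            # Boundary Value Analysis: test at, just below, and just
            # above each boundary of the valid range.
            cases = [(0, False), (1, True), (2, True),
                     (99, True), (100, True), (101, False)]
            for qty, expected in cases:
                with self.subTest(qty=qty):
                    self.assertEqual(is_valid_quantity(qty), expected)

        def test_equivalence_classes(self):
            # Equivalence Class Partitioning: one representative value
            # per class instead of testing every possible input.
            cases = [(50, True),     # valid class: 1..100
                     (-10, False),   # invalid class: below the range
                     (500, False)]   # invalid class: above the range
            for qty, expected in cases:
                with self.subTest(qty=qty):
                    self.assertEqual(is_valid_quantity(qty), expected)

    if __name__ == "__main__":
        unittest.main()

The point of both techniques is to get strong coverage from a small, deliberate set of inputs rather than testing values at random.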
I found a good blog with a lot of info on testing that you might like; one article is at this link: Top 5 Tips for Building Test Cases Safely.
Thanks for your reply. What about regression testing? How is that handled with requirements? For example, if a new update to the system might cause a regression and I have to write a new test case to check for it, does a requirement have to be written for that potential regression?
Originally Posted by maaquilino
Maybe one of these is true, or both:
a) If you have a possible regression but don't need to write a new test case, then just run the old test case for a regression test and no new requirement is needed.
b) If you have a possible regression and do need to write a new test case, then a new requirement would be written to map to this new test case.
Or perhaps (b) isn't really a regression test anymore; instead it's a whole new requirement being tested. I don't know.
Hopefully that made some sense. Thanks!
I guess my real question is: Does there have to be a requirement documented for every single test case?
For example, there are some new items coming into our database that may affect some of our logging, so I will write a new test case to test the logging with the new items. Does there have to be a requirement written to say "the log should work with these new items," or should the old requirement be updated?
Or do there even have to be defined requirements? Right now I'm on a team that uses QC, but they only write test cases; there are no requirements in QC, and only the original specification document is used as the source of requirements for developing test cases.
You'll want to keep a master "living" requirements doc. Too often I see the requirements doc ignored until a major refactor; then it becomes a whole archaeological dig to find the old requirements, figure out how newer requirements patch the old ones, and trace all the side effects those patches create. If you haven't done this before, it's basically adding "paralegal" to your job responsibilities, and it is not fun. This has actually happened to me at almost every new job I've taken: I come in because the company is about to do some risky refactoring and wants to bring in some experience, and once I'm in, I find that my predecessor didn't keep a living requirements document and the whole organization is confused about how the software should actually work.
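To give an idea of what I mean, a living entry might look something like this (the IDs and wording here are made up for illustration):

    REQ-LOG-01 (rev B): The system shall write a log entry for every
    item type processed, including the item types added in the latest
    release.
        Covered by:  TC-LOG-01 (original logging behavior)
                     TC-LOG-07 (new: logging with the new item types)
        History:     rev A covered the original item types only; rev B
                     extended the requirement when the new items were
                     introduced.

That also suggests an answer to your regression question: usually you revise the existing requirement and link the new test case to it, rather than writing a standalone "potential regression" requirement. The history line is what saves you later, because anyone reading it knows why the requirement changed and which test cases cover which revision.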
Originally Posted by LoadRunner421
The problem with using test cases as your only documentation is that the original intent and reasoning behind a decision is lost. A PRD or other high-level requirements doc will capture some of the marketing and business drivers behind a feature, or why something works the way it does, whereas a test case only describes how the implementation behaves. It loses the details of the business and use cases: why the feature even exists and why it's important. That information is critical when deciding which features to trim, what risk is acceptable during a refactor, what should be tested more heavily, and so on. Imagine you're doing a refactor and you're asked to reimplement a bunch of features, but a dozen or so of them might have been workarounds for past technology limitations or business conditions. I remember working at a company where there was some convoluted workaround for a single franchise that no one understood. There was a test case for it, but implementing it on the new platform would have taken several extra days of work. It then turned out that this franchise was no longer even a customer. If there had been a living requirements doc, we probably would have known to trim that exception and simplify the logic well before the big refactor, and saved the several days it took us to figure this out.
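A made-up example of the kind of entry that would have saved us there; the rationale and status fields carry exactly the context a bare test case drops:

    REQ-DISC-14: Apply the legacy rounding workaround to totals for
    franchise F.
        Rationale:   Franchise F's back-office system could not handle
                     fractional cents; workaround added per their
                     contract.
        Status:      Candidate for removal if franchise F is no longer
                     a customer.
        Covered by:  TC-DISC-14

With that one entry in a living doc, the decision to drop the workaround becomes a quick lookup instead of days of reverse engineering.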
Originally Posted by nanivishnu
Great, thanks so much for the reply; this is really good info. My team has requirements documents, but it seems like they just build test cases off of the doc and don't capture any "living" requirements. I think we should be documenting requirements in QC and then building test cases off of those, but it seems like this team takes their high-level PRD, builds test cases, and that's it. The problem for me is, like you said, that I now don't know how the application is supposed to function except by reading test cases. Anyway, not sure what else to say, but thanks!