Is Negative testing when dealing with complex datasets just going to add noise?
I'm wrestling a bit with this and could use some thoughts from the community.
I have a project where we're batch-processing large data records (300+ fields per record) received from an external system.
We have a set of requirements describing how this data is transformed and processed, which is great: we can produce plenty of positive test cases to confirm the system meets the requirements.
The problem I have is in dealing with negative testing.
A simple example is:
If the Date of Birth is greater than 01/01/2000 then
- do X
- do Y
So I can create positive test cases to test this boundary and each of the conditions - I have no problem with this.
And, I can create some negative test cases for example:
- the date is invalid
- the date is missing
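For concreteness, those two negative cases might be sketched like this (a minimal Python sketch; `parse_dob` and the dd/mm/yyyy format are assumptions standing in for the real record-processing code):

```python
from datetime import datetime

def parse_dob(value):
    """Parse a Date of Birth field; return a datetime or raise ValueError.

    Hypothetical helper, not the real system's parser; dd/mm/yyyy is assumed.
    """
    if value is None or value == "":
        raise ValueError("Date of Birth is missing")
    return datetime.strptime(value, "%d/%m/%Y")

# Negative cases: invalid and missing dates should be rejected loudly.
for bad in ("31/02/2001", "not-a-date", "", None):
    try:
        parse_dob(bad)
        raise AssertionError(f"expected rejection of {bad!r}")
    except ValueError:
        pass  # rejected as expected
```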
The problem I have is that, in theory, the source system should never send us a NULL or invalid date. So creating data to test this, running the tests, and all the activity to rectify errors for things that should never happen will add time and effort to our project for a situation that should never occur.
And with 300+ fields, there is an effectively unlimited number of possible negative test cases.
What are your thoughts on this?
I think you want to separate negative cases into those dictated by business rules and those arising from form validation.
A form-validation example:
Name should not be over 50 characters. That rule may exist because of a technical limitation imposed by a database field. Testing for things like this can be pushed down into unit tests, where it is a lot faster and more effective. At the unit-test level, you can use generator and fuzzer libraries to produce inputs. Most of the time these checks won't break the wider system, because the impact is contained within one module.
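The unit-level approach above might look like this (a sketch using a tiny hand-rolled fuzzer; `validate_name` and the 50-character limit are assumptions, and a real project might use a property-based library such as Hypothesis instead):

```python
import random
import string

MAX_NAME_LEN = 50  # assumed technical limit from the database column

def validate_name(name):
    """Return True if the name is a non-empty string within the assumed limit."""
    return isinstance(name, str) and 0 < len(name) <= MAX_NAME_LEN

# A tiny hand-rolled fuzzer: generate many inputs cheaply at the unit
# level instead of crafting full 300-field end-to-end records.
rng = random.Random(42)  # seeded so failures are reproducible
for _ in range(1000):
    n = rng.randint(0, 100)
    name = "".join(rng.choice(string.ascii_letters) for _ in range(n))
    assert validate_name(name) == (0 < n <= MAX_NAME_LEN)
```

Because the check lives in one module, a failure here points straight at the validation code rather than at an integration problem.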
While a business-rule example...
A person under the age of 18 should not be allowed to visit the adult section. That is a business rule with legal implications. In terms of the scope of what can break, a rule like that can be broken by several different modules: say you have an identity-verification module that receives the input after form validation and talks to a database to look up your info. That has a higher likelihood of breaking from integration issues.
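The boundary cases for that business rule are worth testing explicitly, since the legal risk sits right at the 18th birthday. A minimal sketch (the function names and the reference date are illustrative, not from any real system):

```python
from datetime import date

MIN_AGE = 18  # legal minimum from the business rule

def age_on(dob, today):
    """Whole years between dob and today."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1  # birthday hasn't occurred yet this year
    return years

def may_enter_adult_section(dob, today):
    """Hypothetical stand-in for the identity-verification module's check."""
    return age_on(dob, today) >= MIN_AGE

# Boundary cases around the 18th birthday, pinned to a fixed date so
# the test doesn't depend on when it runs.
today = date(2014, 10, 2)
assert may_enter_adult_section(date(1996, 10, 2), today)      # exactly 18 today
assert not may_enter_adult_section(date(1996, 10, 3), today)  # one day short
```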
Last edited by dlai; 10-02-2014 at 09:42 AM.
For field validation, could you make one row of data with every field broken?
Look at the errors the application reports, then gradually fix the data and confirm the system reacts correctly at each step.
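That "one fully broken row" approach might be sketched as follows (field names and the `validate` rules are illustrative, not from the real 300-field record):

```python
# Start with every field invalid, then fix fields one at a time and
# check that exactly one error disappears each time, until the record
# validates cleanly.
GOOD = {"name": "Alice", "dob": "01/01/1990", "country": "NZ"}
BROKEN = {"name": "", "dob": "not-a-date", "country": None}

def validate(record):
    """Return the list of field names that fail validation (hypothetical rules)."""
    errors = []
    if not record.get("name"):
        errors.append("name")
    dob = record.get("dob")
    if not dob or "/" not in dob:
        errors.append("dob")
    if record.get("country") is None:
        errors.append("country")
    return errors

record = dict(BROKEN)
remaining = len(validate(record))
for field in GOOD:
    record[field] = GOOD[field]          # fix one field...
    errors = validate(record)
    assert len(errors) == remaining - 1  # ...and exactly one error goes away
    remaining = len(errors)
assert remaining == 0  # fully repaired record passes
```

This collapses many single-field negative cases into one row of test data plus one repair loop, which helps with the "infinite negative cases" worry above.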