Need to know how to test the output in database as stated below
Hi... I need to know a way of testing, or a technique, for the scenario below.
Input.txt ---> Processing module --> Output in database tables
Here is how my input.txt looks (it will be a huge file, say 10k lines):
I feed this file to the processing module, and the output is written into database tables in a certain form (e.g. table1 to table10, each table containing more than 6 columns).
e.g. table1
So this is how I get the output in the DB tables.
My question is how to test this output, since the input is packets in a .txt file and the processed data is spread across huge database tables.
Here is what I have tried:
1) Process the input normally (the processed data is now in the DB tables).
2) Export all the processed data tables to .csv files.
3) Validate these .csv files manually (only the first time).
4) Keep these .csv files as standard reference files (STD).
5) For every release, run the process and export the output data from the tables to .csv files (Actual).
6) Compare the Actual .csv files with the already stored STD .csv files.
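Step 6 can be automated instead of done by eye. The sketch below compares one exported .csv against its stored reference copy; the function name and the idea of sorting rows first (so the comparison does not depend on the order in which the database exported them) are my assumptions, not part of the original process.

```python
import csv


def compare_csv(actual_path, std_path):
    """Compare an exported .csv (Actual) against its reference copy (STD).

    Returns a list of difference descriptions; an empty list means the
    files match. Rows are sorted before comparing, so row order in the
    export does not cause false failures (an assumption -- drop the
    sorting if row order is actually significant for your tables).
    """
    diffs = []
    with open(actual_path, newline="") as fa, open(std_path, newline="") as fs:
        actual = sorted(csv.reader(fa))
        std = sorted(csv.reader(fs))

    if len(actual) != len(std):
        diffs.append(f"row count: actual={len(actual)} std={len(std)}")

    # Report the first mismatching rows pair by pair.
    for i, (a_row, s_row) in enumerate(zip(actual, std)):
        if a_row != s_row:
            diffs.append(f"row {i}: actual={a_row} std={s_row}")
    return diffs
```

Running this for each of the ten exported tables in a loop gives a per-release regression check; any non-empty result points at exactly which table and row changed.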
If you have any other way of testing, please suggest it.
Originally Posted by Prasanna_123
RedGate makes some good database unit-testing products.
Another way to test databases is to use migrations and scripts to build a clean database, populate it with some baseline test data, then have unit tests wrap and execute your DB stored procedures and verify the output.
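The approach above can be sketched as a unit test. This is a minimal illustration only: an in-memory SQLite database stands in for the real DB (SQLite has no stored procedures, so a plain UPDATE stands in for the procedure call), and the table/column names are invented for the example.

```python
import sqlite3
import unittest


class OutputTableTest(unittest.TestCase):
    """Build a clean database, load baseline data, run the processing
    step, then assert on the resulting table contents."""

    def setUp(self):
        # "Migration" step: build a clean schema from scratch each test.
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE table1 (id INTEGER, packet TEXT, status TEXT)"
        )
        # Baseline test data: a small, known set of input packets.
        self.db.executemany(
            "INSERT INTO table1 (id, packet, status) VALUES (?, ?, ?)",
            [(1, "PKT-A", "NEW"), (2, "PKT-B", "NEW")],
        )

    def test_processing_marks_packets_done(self):
        # Stand-in for executing the stored procedure / processing module.
        self.db.execute("UPDATE table1 SET status = 'DONE' WHERE status = 'NEW'")
        rows = self.db.execute(
            "SELECT id, status FROM table1 ORDER BY id"
        ).fetchall()
        # Verify the output against the expected table state.
        self.assertEqual(rows, [(1, "DONE"), (2, "DONE")])


if __name__ == "__main__":
    unittest.main()
```

Because each test rebuilds the schema and baseline data, the tests stay independent of each other and of whatever is currently in the production database.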