  1. #1 Prasanna_123, New Member
    Need to know how to test the output in the database as stated below

    Hi... I need to know a testing approach or technique for the scenario below.

    Input.txt ---> Processing module --> Output in database tables

    Here is how my input.txt looks (it will be a huge file, say 10k lines):
    timestamp1,organisation1,data1,localtimestamp1,place1
    timestamp2,organisation1,data2,localtimestamp2,place1
    timestamp3,organisation1,data3,localtimestamp3,place1

    I feed this file to the processing module and the output is entered into database tables in a certain form (e.g. table1 to table10, and each table contains more than 6 columns).

    e.g. table1:
    column1           column2 ...
    Processeddata1    place1
    Processeddata2    place2

    So, like this, I will get the output in DB tables.

    My question is how to test the output, since the input is packets in a .txt file while the processed data ends up as a huge number of rows in database tables.

    Here is what I have tried:
    1) Process the input normally (the processed data is now in DB tables).
    2) Export all the processed data tables to .csv files.
    3) Validate these .csv files manually (only the first time).
    4) Keep these .csv files as standard reference files (STD).
    5) For every release, run the process and export the output data from the tables to .csv files (Actual).
    6) Compare these (Actual) .csv files with the already stored .csv files (STD); a rough sketch of this export-and-compare step is shown below.
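
    To make steps 2, 5 and 6 concrete, here is a minimal sketch in Python. It uses sqlite3 only as a stand-in for the real database, and table1/table2, output.db, and the directory names are placeholder assumptions; swap in your real driver, connection, and output tables.

    Code:
    import csv
    import sqlite3
    from pathlib import Path

    TABLES = ["table1", "table2"]        # placeholder names; list your real output tables
    REF_DIR = Path("reference_csv")      # manually validated "STD" files
    ACT_DIR = Path("actual_csv")         # freshly exported files for this release

    def export_tables(conn, out_dir):
        # Dump each output table to CSV, ordered so runs are comparable.
        out_dir.mkdir(exist_ok=True)
        for table in TABLES:
            cur = conn.execute(f"SELECT * FROM {table} ORDER BY 1")
            with open(out_dir / f"{table}.csv", "w", newline="") as fh:
                writer = csv.writer(fh)
                writer.writerow([col[0] for col in cur.description])  # header row
                writer.writerows(cur.fetchall())

    def compare_dirs(ref_dir, act_dir):
        # Report the first mismatching row per table, or confirm a match.
        for table in TABLES:
            with open(ref_dir / f"{table}.csv", newline="") as fh:
                ref_rows = list(csv.reader(fh))
            with open(act_dir / f"{table}.csv", newline="") as fh:
                act_rows = list(csv.reader(fh))
            if ref_rows == act_rows:
                print(f"{table}: OK ({len(act_rows) - 1} data rows)")
                continue
            for i, (r, a) in enumerate(zip(ref_rows, act_rows)):
                if r != a:
                    print(f"{table}: first difference at row {i}: {r} != {a}")
                    break
            else:
                print(f"{table}: row counts differ ({len(ref_rows)} vs {len(act_rows)})")

    if __name__ == "__main__":
        conn = sqlite3.connect("output.db")  # placeholder; use your real connection
        export_tables(conn, ACT_DIR)
        compare_dirs(REF_DIR, ACT_DIR)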

    If you have any other way of testing, please suggest.
    Please help.

  2. #2 SQA Knight (Playa Del Rey, California, United States)
    Quote Originally Posted by Prasanna_123 View Post
    If you have any other way of testing, please suggest. Please help.

    RedGate makes some good Database Unit testing products.

    Another way to test databases is to use migrations and scripts to build a clean database, populate it with some baseline test data, and then have unit tests wrap and execute your DB stored procedures and verify the output.
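
    As a rough illustration of that idea, here is a minimal pytest-style sketch in Python. sqlite3 stands in for the real database (it has no stored procedures, so run_processing below is a hypothetical stand-in for your stored procedure or processing module), and the schema and expected rows are assumptions based on the input.txt example above.

    Code:
    import sqlite3
    import pytest

    @pytest.fixture
    def db():
        # Fresh, clean in-memory database for every test.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE input_raw (ts TEXT, org TEXT, data TEXT, local_ts TEXT, place TEXT)")
        conn.execute("CREATE TABLE table1 (processed_data TEXT, place TEXT)")
        # Baseline test data: a few known input rows (mirrors the input.txt example).
        conn.executemany(
            "INSERT INTO input_raw VALUES (?, ?, ?, ?, ?)",
            [("timestamp1", "organisation1", "data1", "localtimestamp1", "place1"),
             ("timestamp2", "organisation1", "data2", "localtimestamp2", "place1")],
        )
        yield conn
        conn.close()

    def run_processing(conn):
        # Hypothetical stand-in for the real processing module / stored procedure.
        conn.execute("INSERT INTO table1 SELECT upper(data), place FROM input_raw")

    def test_table1_contains_expected_rows(db):
        run_processing(db)
        rows = db.execute("SELECT processed_data, place FROM table1 ORDER BY 1").fetchall()
        assert rows == [("DATA1", "place1"), ("DATA2", "place1")]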
    David Lai
    SDET / Consultant
    LinkedIn profile

 

 
