Lost time in Load Test
I am using QALoad to test a web application sitting behind IIS. My test consists of 6 screens. The screens are handled by ASP, and data is sent to and retrieved from CICS.
QALoad shows a script execution time of 9.3 sec for a single user. However, my IIS log shows that all requests were serviced in 6.8 sec.
I am losing 2.5 sec somewhere. I am running across a dedicated 100 Mb Ethernet LAN, and only 159 KB of data is shipped during the test.
When I increase the number of users to 20, the missing time jumps to 8.5 sec per user.
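For what it's worth, a rough back-of-the-envelope check (a sketch only, assuming an ideal 100 Mb/s link and that the 159 KB figure covers all traffic) suggests raw transfer time is nowhere near big enough to explain the gap:

```python
# Rough sanity check: how much of the missing time could raw transfer explain?
# Assumptions: ideal 100 Mb/s throughput, 159 KB total payload (from my test notes).

payload_bits = 159 * 1024 * 8          # 159 KB expressed in bits
link_bps = 100 * 1_000_000             # 100 Mb/s Ethernet, ideal case

wire_time = payload_bits / link_bps    # ~0.013 sec
gap_single_user = 9.3 - 6.8            # 2.5 sec unaccounted for (1 user)
gap_20_users = 8.5                     # per-user gap reported at 20 users

print(f"Theoretical transfer time: {wire_time:.3f} sec")
print(f"Unaccounted time (1 user): {gap_single_user:.1f} sec")
print(f"Unaccounted time per user (20 users): {gap_20_users:.1f} sec")
```

So even allowing for protocol overhead, the network transfer itself should only cost tens of milliseconds, not seconds.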
I am running QALoad on a 4-way 550 MHz IBM server.
Does anyone know if there is an overhead associated with QALoad itself, and if so, is there a standard calculation to predict what this time would be?
Re: Lost time in Load Test
When looking at timings, it's worth adding up each checkpoint individually and comparing the sum against the total time; I'm assuming you have checkpoints around each send/receive.
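Something along these lines makes it obvious whether the checkpoints account for the whole script time or whether time is being lost between them (a minimal sketch, not QALoad-specific; the checkpoint names and per-screen durations are made up for illustration, only the totals come from your post):

```python
# Compare the sum of individual checkpoint durations against the overall
# script execution time to see how much time falls outside the checkpoints.
# The per-screen timings below are purely illustrative.

checkpoints = {
    "screen1_send_receive": 1.1,
    "screen2_send_receive": 1.0,
    "screen3_send_receive": 1.2,
    "screen4_send_receive": 1.1,
    "screen5_send_receive": 1.2,
    "screen6_send_receive": 1.2,
}

script_total = 9.3                       # overall time reported by QALoad
checkpoint_total = sum(checkpoints.values())
gap = script_total - checkpoint_total    # time spent outside the checkpoints

print(f"Sum of checkpoints: {checkpoint_total:.1f} sec")
print(f"Reported script time: {script_total:.1f} sec")
print(f"Time outside checkpoints: {gap:.1f} sec")
```

If the checkpoint sum comes out close to the 6.8 sec your IIS log shows, then the missing 2.5 sec is being spent between checkpoints (script setup, parsing, sleeps, etc.) rather than waiting on the server.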
Additionally, if you are using the WWW middleware, check whether any requests were re-sent. Finally, the total transaction execution time includes any sleeps built into your script, network latency at any point in the path (e.g. a firewall), data I/O, and any user-entered code and calculations. So if you are using a remote data source, that can add I/O wait time as well.
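To spot re-sent requests and cross-check the server-side total, you can tally the IIS log for the test window (a rough sketch, assuming the W3C extended log format with a time-taken field logged in milliseconds; the file name is just a placeholder):

```python
# Tally requests per URL and total service time from an IIS W3C extended log.
# A URL appearing more often than the script sends it suggests a re-sent request.
# Assumes a "#Fields:" header line and time-taken recorded in milliseconds.

from collections import Counter

LOG_FILE = "ex000101.log"   # placeholder path to the IIS log for the test run

hits = Counter()
total_ms = 0

with open(LOG_FILE) as log:
    fields = []
    for line in log:
        line = line.strip()
        if line.startswith("#Fields:"):
            fields = line.split()[1:]          # column names for data lines
            continue
        if line.startswith("#") or not line:
            continue
        row = dict(zip(fields, line.split()))
        hits[row.get("cs-uri-stem", "?")] += 1
        total_ms += int(row.get("time-taken", 0))

print(f"Total server-side time: {total_ms / 1000.0:.1f} sec")
for url, count in hits.most_common():
    print(f"{count:3d}  {url}")
```

If the hit counts per URL are higher than your six screens should generate, re-sends are part of the story; if the server-side total still comes out around 6.8 sec, the extra time is on the client side.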
Other than that, I can't think of any specific reason why you should be having a problem.