I've conducted a load test and observed that for one particular transaction the average response time is 11 sec while the 90th percentile is 7.5 sec. Can the average response time be higher than the 90th percentile? I also checked the raw data for this transaction, and it has used all the data points.
I also found that the standard deviation is high (26.3). Are the timings that LoadRunner populated correct, or is there a chance that data points were missed when calculating the timings? Please suggest how I can drill down to check whether the timings are correct.
2. Will the average response time always be LOWER than the 90th percentile?
Picture 100 responses all of 1 second. Average = 1 sec, 90th percentile = 1 sec.
Now picture 100 responses all of 1 second EXCEPT one that is 60 seconds.
In this case the average would be 1.59 sec, while the 90th percentile is still 1 sec (because at least 90 of the 100 responses complete within 1 second).
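You can verify those numbers with a few lines of code. This is just a sketch using made-up data matching the example above, not actual LoadRunner output:

```python
# 99 responses of 1 second plus one 60-second outlier (hypothetical data)
times = [1.0] * 99 + [60.0]

# Average includes every sample, so the single outlier drags it up
avg = sum(times) / len(times)  # (99*1 + 60) / 100 = 1.59

# 90th percentile: sort, then take the 90th of the 100 sorted values
times_sorted = sorted(times)
p90 = times_sorted[int(0.9 * len(times)) - 1]

print(avg)  # 1.59
print(p90)  # 1.0
```

One big outlier is enough to push the average well past the 90th percentile, which is exactly the situation described in the question.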
So, no, the average response time does not always have to be lower than the 90th percentile.
Average response times are the average of ALL response times measured, INCLUDING any outlier values.
The easiest way to think of percentiles in LR terms is that they discount the top x% of outlier values, where x is 100 minus the percentile (in this case 100% - 90% = 10%): return the highest response time measured after ignoring the highest 10%.
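That "ignore the top x%" reading can be sketched directly. Note this is an illustration of the idea, not LoadRunner's exact internal algorithm, and `percentile_discard` is a hypothetical helper name:

```python
def percentile_discard(times, pct):
    """Highest response time after discarding the top (100 - pct)% of samples."""
    kept = sorted(times)[: int(len(times) * pct / 100)]
    return kept[-1]

# 99 one-second responses plus one 60-second outlier (hypothetical data):
# the outlier falls in the discarded top 10%, so it never affects the result.
print(percentile_discard([1.0] * 99 + [60.0], 90))  # 1.0
```

This is why percentiles are far more robust to outliers than the average, and why comparing the two (as in the original question) is a quick way to spot a small number of very slow transactions.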