I ran a test against one of our search functions using 1 user and got the following results:
# Samples 147
Std. Dev. 418.3129775
Error % 0
Avg. Bytes 14643.13605
Then, using 25 users, I got these results:
# Samples 3115
Std. Dev. 3528.651571
Error % 9.63E-04
Avg. Bytes 13545.9634
Actually, I can't interpret these results. First, when running with 25 users I was expecting 25 × 147 = 3675 samples, but as you can see I got 3115. Is there a logical reason this happens, rather than an internal error or something similar?
Also, regarding throughput: it went from 3.643212967 to 18.89711235. How can I calculate the percentage effect that 25 users have on the throughput, and how can I derive a performance indicator from it, i.e. decide whether it is good or bad?
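To make the question concrete, here is a minimal sketch of the comparison I mean, assuming 3.643212967 is the 1-user throughput and 18.89711235 is the 25-user throughput (both in requests/second):

```python
single_user_tps = 3.643212967   # assumed: throughput with 1 user (req/sec)
multi_user_tps = 18.89711235    # assumed: throughput with 25 users (req/sec)
users = 25

# How many times the throughput actually scaled with 25 users
scaling = multi_user_tps / single_user_tps

# Fraction of the ideal linear scaling (25x) that was achieved
efficiency = scaling / users

print(f"scaling: {scaling:.2f}x, efficiency: {efficiency:.1%}")
```

Is something like this (actual scaling versus ideal linear scaling) the right way to express the effect of 25 users as a percentage?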
Finally, what can you conclude from these results? Are they good or bad?
A note: while the test was running, I was working with the application myself and noticed no drawbacks in the site's performance.