Problem in performance testing of desktop app
I have a server on which I launch 40 Terminal sessions; each session runs TestExecute at startup, which then starts the script that tests my Delphi desktop application (the application also resides on the server).
I can successfully run 40 to 44 users on the server with the script in random mode (users run the script at random times because each is given a delay of random length). CPU usage stays at roughly 2% to 5% while the users are waiting out their random delays, so the CPU is quite free to be used whenever one or two users finish their delay and start running the script. As a result the scripts run fast and the application responds well.
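For reference, the pacing each virtual user applies can be sketched roughly like this. This is a minimal Python sketch, not the actual TestExecute script: `run_script`, the iteration count, and the delay bounds are hypothetical stand-ins for whatever the real test uses.

```python
import random
import time

def run_script():
    # Placeholder for the real TestExecute test run by each Terminal session.
    pass

def virtual_user(iterations, min_delay=1.0, max_delay=30.0):
    """Run the test script repeatedly, sleeping a random 'think time'
    before each run so the users do not all hit the application at once.
    Returns the number of completed script runs."""
    completed = 0
    for _ in range(iterations):
        # Random delay: while sleeping, this user consumes almost no CPU,
        # which is why overall usage stays low when most users are waiting.
        time.sleep(random.uniform(min_delay, max_delay))
        run_script()
        completed += 1
    return completed
```

The random delay spreads the load out in time, so at any moment only a few of the 40-odd users are actively driving the application.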
But this is not the case when the same scenario is run with 45 to 58 users in Terminal sessions. CPU usage suddenly jumps to 98% to 100% and does not come back down to the 2% to 5% seen above. It stays at 98% to 100% even while all the users are executing the random-delay function. I don't know what is consuming the CPU, but I have observed that at this point most of it is taken by "Hardware Interrupts". Now I am very confused: how do I decrease the CPU usage? How do I decrease the hardware interrupts?
Even though I have 32 GB of RAM and a 2.00 GHz processor (9 processors), I am not able to handle the load on the server and emulate a scenario of 100 users. :(