Help with a download problem
This is not exactly a straightforward performance test where we test varying numbers of users on 1 client, but a case where there are multiple clients.
We have about 50 client workstations. All start up within a 15-minute window every morning for business. During startup, each workstation downloads a common MSI package from a central point. This MSI is a prerequisite for a scheduled native application (APP) that must run within minutes of workstation startup; otherwise the application launches once and fails. (This configuration is what the customer wants, and they have their reasons for doing so.)
--When only 1 workstation reboots, it downloads the MSI within seconds.
--When all 50 workstations restart within 15 minutes, the MSI download takes forever and the application that depends on it fails.
Without incurring the cost of new infrastructure:
--How can this problem be solved? We need to significantly reduce this download time.
--How can I replicate a large number of workstations (or some kind of consumer pulling the MSI) and measure the time? Any ideas on how to test and research this problem? Any tools?
--What software or Windows scripting tool can be used to vary the download/upload load? (This is a Windows environment.)
I am thinking of VMs, but that would also involve significant cost.
Looking forward to your replies, thanks.
Step one is to determine how the MSI file is being "downloaded" to the client. My hypothesis is that it is using the standard file-sharing mechanism of the network file system in place (Novell, Microsoft, etc.) and it is all being pulled from a remote location over a congested network.
Once you understand what the transport is for the download of the file, the build of the test is pretty mechanical. LoadRunner's default language is C, so you can just build a small program that copies the file from its network location to a local workstation and times how long the "copy" of the file takes to complete.
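A minimal sketch of such a timing probe in plain C (the server and share names in the comment are made up for illustration; in a VuGen script you would call this from `Action()` and wrap it in a transaction):

```c
#include <stdio.h>
#include <time.h>

/* Copy src to dst in 64 KB chunks; return total bytes copied, or -1 on error. */
long copy_file(const char *src, const char *dst)
{
    FILE *in, *out;
    char buf[65536];
    size_t n;
    long total = 0;

    in = fopen(src, "rb");
    if (!in)
        return -1;
    out = fopen(dst, "wb");
    if (!out) {
        fclose(in);
        return -1;
    }
    while ((n = fread(buf, 1, sizeof buf, in)) > 0) {
        if (fwrite(buf, 1, n, out) != n) {
            total = -1;
            break;
        }
        total += (long)n;
    }
    fclose(in);
    fclose(out);
    return total;
}

/* Time one copy and report it, e.g.:
 *   time_copy("\\\\deploysrv\\share\\package.msi", "C:\\temp\\package.msi");
 * (hypothetical paths -- point these at your real share and target)       */
void time_copy(const char *src, const char *dst)
{
    clock_t start = clock();
    long bytes = copy_file(src, dst);
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    if (bytes < 0)
        fprintf(stderr, "copy of %s failed\n", src);
    else
        printf("copied %ld bytes in %.2f s\n", bytes, secs);
}
```

Run one copy of this per simulated workstation and compare the reported times as you scale up the count.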
If this is coming from an HTTP source, then the script is even easier to build. Set your HTTP proxy for the app (or the global proxy) to the LoadRunner proxy recorder port and record the download.
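The recorded VuGen script ends up being little more than a timed GET; a hypothetical sketch (URL and transaction name invented for illustration; this only runs inside the LoadRunner runtime):

```c
Action()
{
    lr_start_transaction("msi_download");

    /* Resource=1 tells LoadRunner to fetch the file as a plain
       download rather than parse it as an HTML page. */
    web_url("package.msi",
            "URL=http://deployserver/packages/package.msi",
            "Resource=1",
            LAST);

    lr_end_transaction("msi_download", LR_AUTO);

    return 0;
}
```

The transaction timer gives you the per-virtual-user download time directly in the LoadRunner results.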
When you want to "replay" your test, you will likely want to have 15 load generators with one virtual user each; otherwise you may create a bottleneck on a single local network interface for a large download.

However, before testing I want to offer the following suggestions. If the download is coming from a remote location, with a wide-area network between it and the client workstations, then consider "pushing" the MSI to a local server overnight and having the workstations "download" or "pull" it from a file server on the workstations' side of the slow, congested link. If this is an HTTP download, then consider using your own internal content distribution network (CDN) to "stage" the file. In the evenings the file can be pre-seeded to the CDN with an expiration of 18 hours, and the users will wind up pulling it from the CDN.
For an internal content distribution network, SQUID and VARNISH are very popular solutions. You can also use the NGINX web server in a cache model. If you have Riverbed, also check out their solutions. The idea here is that each person pulls the data from a local cached image, which was seeded either deliberately or by the first person who downloaded the file.
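For the NGINX cache model, a minimal config sketch might look like the following (the origin hostname and paths are made up; the 18-hour validity matches the expiration suggested above):

```nginx
proxy_cache_path /var/cache/nginx/msi levels=1:2 keys_zone=msi_cache:10m
                 max_size=1g inactive=18h;

server {
    listen 80;

    location /packages/ {
        proxy_pass        http://origin.example.internal/packages/;
        proxy_cache       msi_cache;
        proxy_cache_valid 200 18h;

        # Only the first request goes to the origin; the other
        # 49 workstations wait briefly and get the cached copy.
        proxy_cache_lock  on;

        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

`proxy_cache_lock` is the key line for this scenario: it prevents 50 simultaneous morning requests from all hammering the congested link for the same file.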
The problem you present looks like a pretty classic model of network congestion on a common link over which a lot of people are pulling the same file. If you can place the source for the morning download on the workstations' side of the congested link, either on a file server or some sort of cache (architecturally dependent on how the file is downloaded), then you should be able to improve performance quite quickly, even without a performance test.
Replace ineffective offshore contracts: LoadRunnerByTheHour. Starting @ $19.95/hr USD.
Put us to the test, skilled expertise is less expensive than you might imagine.
Twitter: @LoadRunnerBTH @PerfBytes
The customers are pretty stubborn and don't want to alter their systems much, but they want significant improvements. Something to do with their security systems, etc. I guess many of us in this business face this kind of obstacle a lot. It all boils down to $$$.
They did some work with LoadRunner, but that research was not useful because the question is not the load on a single workstation per se; we need to research and solve whatever is causing the slow downloads when many workstations pull from a common source over the network. Oh yes, the problem is a classic network bottleneck issue, but they don't want to hear that. So here I am, scratching my head to come up with a 'simple solution', if there is one, for the problem at hand. And they want it tested before implementation. The main problem here is replicating the many workstations.

At the moment we are putting the file on a designated proxy on the other side of the bottleneck, as you suggested. This seems to help quite a bit, but the customers call it a 'Band-Aid' and are looking for a better fix. We are still working on it. I will update you guys with any new strategy; meanwhile, I still welcome any more views.
No architecture changes:
A. Buy fatter network pipe
B. If performance is still insufficient, then see A
Otherwise, a change in performance will likely require a change in architecture (such as the location of the file to download).