Importing dynamic .csv files in SP script
I'm testing a function which requires each VU to import a CSV file from its local drive or network share. The data has to match mainframe data, so it must be different for each VU and may change from test to test. When the import is recorded, header and footer data is inserted before the actual CSV data in the Form input within the script. Since this data can't be extracted via a parse from any previous function or transaction during the load test, I may have to record an import for each VU and store the data inside the Form input. It would obviously be easier to import the data via an SP function instead - has anyone got a solution to this?
Re: Importing dynamic .csv files in SP script
If I understand your problem correctly, you just need to have a different CSV file for each VU? (If I don't understand it correctly, please restate the problem.)
You could create separate files for each VU and then use GetUserID to make sure that each user gets a distinct number and, therefore, a distinct file to retrieve data from. (Or you could put all of the data in one file and use GetUserID for each user to get a specific line from the file.)
Additionally, if you use a hard-coded filename (e.g. "c:\\foo1.csv"), SilkPerformer will issue a warning at compile time, but it will still read the file at runtime.
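The per-VU file approach could look roughly like the sketch below in BDL. Treat it as illustrative only: the filename pattern (c:\\data\\vudata<n>.csv), the buffer size, and the exact FOpen/FRead parameter order are assumptions, so check them against your SilkPerformer version's BDL reference before using.

```
dcltrans
  transaction TMain
  var
    hFile     : number;
    sFileName : string;
    sData     : string(10000);
  begin
    // Each VU derives its own filename from its user id, so VU 1 reads
    // vudata1.csv, VU 2 reads vudata2.csv, and so on (assumed naming scheme)
    sFileName := Strf("c:\\data\\vudata%d.csv", GetUserId());

    FOpen(hFile, sFileName, OPT_FILE_ACCESS_READ, OPT_FILE_OPEN);
    FRead(hFile, sData, 10000);  // read the CSV content into sData
    FClose(hFile);

    // ... pass sData into the form submission in place of the recorded data ...
  end TMain;
```

The alternative mentioned above (one shared file, with GetUserID selecting a specific line per user) works the same way, except you would read line by line and skip to the line whose index matches the user id.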
A concerned Borland customer, a fly in the ointment, a wrench in the works.