QALoad can only have one central datapool active at a time, right?
In a normal web application, I want to have the following going on all at the same time:
1. some VUs logging in and out
2. some VUs searching for entities
3. some VUs browsing through entities and the different pages
4. some VUs updating/changing entities
5. some VUs adding new entities
Would it be better to put all of the usernames and passwords, all of the entity information required for searching, all of the data I want changed, and all of the data I want added into ONE CENTRAL datapool, or to have separate local datapools for each task?
If I go the separate-datapool route, do I have to manually copy the datapools to the Player machines, or will the necessary information be sent automatically from the Conductor machine?
The benefit of the central datapool is mostly that it can provide unique data to each VU across multiple scripts by using the Strip option in Conductor. So if you needed unique login IDs for each VU, that would be a good candidate - if it doesn't matter which script uses which login. Another use is for shared info, such as a URL that can be changed in 1 datapool that applies to all scripts. In this case you could use the Rewind option in Conductor with only 1 data record in the file. Or you could do both by repeating the URL in each line and still using the Strip option.
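To make the difference between the two options concrete, here is a minimal sketch of the semantics in Python. This is only a toy model of the behavior described above (Strip hands each record out once so every VU gets a unique row; Rewind re-reads the file from the top), not QALoad's actual implementation; the record values are made up.

```python
# Toy model of the two Conductor datapool options (illustration only,
# not the QALoad API). Record values are hypothetical.
from itertools import cycle

# Strip: each record is consumed exactly once, so every VU across all
# scripts receives a unique row; the pool shrinks as rows are handed out.
strip_pool = ["user1,pass1", "user2,pass2", "user3,pass3"]

def strip_next():
    return strip_pool.pop(0)

# Rewind: the file is re-read from the top when exhausted, so a single
# shared record (e.g. a base URL) can serve every VU.
rewind_pool = cycle(["http://example.com/app"])

def rewind_next():
    return next(rewind_pool)

vu1_login = strip_next()   # unique per VU
vu2_login = strip_next()   # different from vu1_login
url_a = rewind_next()      # same shared value every time
url_b = rewind_next()
```

The "do both" variant in the text corresponds to repeating the shared URL in every Strip record, so each VU still gets a unique row while also reading the common value.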
But since you can only use 1 central datapool per Conductor session, the options are limited. Also, since datapool usage has to be compiled into the scripts, the usage needs to be determined beforehand. There is really nothing that you can't do with local datapools, but you may have more repetition of data in the files...
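For example, the repetition with local datapools might look like this: the shared URL has to be duplicated in each script's file instead of living in one central file. File names and record layouts here are hypothetical.

```
# search.dat  (local datapool for the search script)
http://example.com/app,widgets
http://example.com/app,gadgets

# update.dat  (local datapool for the update script)
http://example.com/app,widget-42,newprice
```

If the URL changes, every local file must be edited, whereas a single central datapool record would need changing only once.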
No, you don't need to copy anything manually: Conductor will copy all datapools (central and local) from the Conductor PC to the Player PCs.