Is it possible to manage a custom .h file in an ALM+PC environment from one location?
We just upgraded from LR v11 to PC v11 integrated with ALM.
My LR scripts used a custom .h file that was called after each page to do the error validation and error handling. I kept that file on a network share and, within globals.h, used the UNC path to include it; that setup worked from multiple generators with no issues.
Now in the PC environment, using ALM to execute the test, I receive an error stating that the folder or file cannot be found. I have given the ALM\PC service account admin rights and full access to the network share and still receive the same error.
Is there a way for me to continue to use one custom .h file and have all my scripts and/or load generators access that one file?
I want to avoid having to update the same .h file on multiple generators whenever I need to add/change functions within that file.
Is my approach using a custom .h file to do the error handling an appropriate solution?
I have been searching for over a week with no success, so any help is greatly appreciated.
I've had issues with accessing files across the network, as well. If you don't want to include the .h file with the files being uploaded, you may be stuck with having to deal with a copy on each Load Generator (which is not ideal, I know).

You could always automate the deployment of the file: drop it onto your network share, then use a quick copy program to push it out to all the load generators in your "Load Farm". Your scripts then just refer to it as being on, say, "C:\\MyLib.h". You just have to remember to 1) run the copy program whenever you make changes and 2) update the copy program when you add or remove machines from your Load Farm.

Of course, anyone who works on the scripts has to make sure they have the latest-and-greatest copy of the .h file on their machine, too.
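The "quick copy program" could be as small as a script like the one below. This is only a minimal sketch: the share path, the `\\LG01`/`\\LG02` host names, and the `c$` admin-share destinations are hypothetical placeholders you would replace with your own file server and Load Farm machines.

```python
"""Push one master .h file from the network share out to every
load generator, so scripts can include it locally as C:\\MyLib.h.
All paths and host names below are hypothetical examples."""
import shutil

# Master copy on the network share (hypothetical path).
SOURCE = r"\\fileserver\lrshared\MyLib.h"

# One destination per machine in the Load Farm; edit this list when
# generators are added or removed (hypothetical admin-share paths).
TARGETS = [
    r"\\LG01\c$\MyLib.h",
    r"\\LG02\c$\MyLib.h",
]

def deploy(source, targets):
    """Copy `source` to each path in `targets`; return the paths that failed."""
    failed = []
    for dest in targets:
        try:
            shutil.copy2(source, dest)  # copy2 also preserves the timestamp
            print(f"copied {source} -> {dest}")
        except OSError as exc:
            print(f"FAILED {dest}: {exc}")
            failed.append(dest)
    return failed

# Running deploy(SOURCE, TARGETS) after each change to MyLib.h keeps
# every generator's local copy in sync with the master on the share.
```

A scheduled task on the file server (or a post-checkin hook, if the .h file lives in version control) could run this automatically so nobody has to remember step 1.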
SoCalGal - Defender of end user response times!