Does web_service_call have any attribute that holds the request sent to the server?
Can anyone let me know whether web_service_call has any attribute that contains the service request (XML) being sent to the service.
For example, the "response param" attribute of a web_service_call holds the response for a particular service request.
My actual requirement is to write the requests (XML) and their corresponding responses (XML) to a log file. I am able to write the responses to the file, but I am having difficulty with the requests.
I can build each request in a string and write it to a log, but the problem is that my XML is huge, around 400 to 500 lines, and I would need to do the same for around 30 services. So I am looking for an alternative.
Please suggest any other way to capture the request and write it to a log file.
As noted in this forum and others, you may write any information you want to a log, provided it is a string. There are many logging functions declared in the lrun.h header file in the <loadrunner home>\include subdirectory; which logging function you choose determines how and where the information is logged and collected.
Decouple your XML from your request: store the XML in a variable, then use that variable both to populate the request and to produce the log output. Be careful here. If you are not, you will turn your entire load generator into a bottleneck for your test, as tens, hundreds, or thousands of users clamor to write to the disk, with the majority of threads/users waiting until the current write operation completes. This is why it is recommended that a test involve absolutely minimal logging, and also why parameter files are loaded into RAM and accessed from RAM, rather than from disk, during the test.
Alongside the requirement should be noted the business or technical reason for it. What is the rationale here? This appears to be a functional requirement, perhaps included by someone who does not understand, or has never engaged in, performance testing. Having people who do not understand what you do, or how you do it, direct your work is a sure path to failure. Is this requirement tied to ensuring that the XML sent is in proper form or content? In performance testing we should have already validated that the site/application/interface is functionally correct, for if it cannot work for one, it will never scale to many.
This answer leverages several foundation skills: programming in the language of your testing tool, and performance test design (independent of tool).
HERE IS THE EASY WAY TO COLLECT ALL OF THE REQUESTS MADE DURING YOUR TEST
At the end of the test, pull the web server logs for the application server/web server instances that were the target of your web service calls. Each and every request should be logged in the web server log, so in this respect your test becomes self-documenting. All you need to do is pull the log entries between the start and stop times of the test.