I'm using JMeter 2.3.2 on Windows XP. I'm trying to process the CSV file generated by JMeter listeners, but I'm having trouble with the Throughput calculation.

I found some definitions like:

* Throughput = number of requests / total time to issue the requests.
* For a single request, the throughput depends only on the time to issue that single request.
* For multiple requests, the throughput also depends on the gaps between the requests. E.g. if a request is issued every 30 seconds, the throughput will be 2/min, largely independent of the time each request takes.
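To convince myself of that last point, I did a quick numeric check of the "one request every 30 seconds ≈ 2/min" claim (the numbers here are made up for illustration):

```python
# Quick numeric check of the "one request every 30 s ≈ 2/min" claim.
n = 100                      # number of requests
interval_ms = 30_000         # a request is issued every 30 seconds
elapsed_ms = 500             # each request takes 0.5 s (barely matters)

start_first = 0
end_last = (n - 1) * interval_ms + elapsed_ms   # latest end time

# throughput over the whole run, converted from per-ms to per-minute
per_minute = n / (end_last - start_first) * 1000 * 60
print(round(per_minute, 2))  # → 2.02, close to 2/min regardless of elapsed_ms
```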

I understand the definition for a single request (1/elapsed), but I don't understand how to calculate it when there are multiple requests.

I tried with:

* select a label name
* find the min(timestamp) among the samples with the selected label
* find the max(timestamp + elapsed) among the samples with the selected label
* calculate the difference between max and min
* calculate Throughput = (number of samples / difference between max and min) * 1000
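Here is a sketch of those steps in Python, assuming JMeter's default CSV headers (`timeStamp`, `elapsed`, `label`) and that `timeStamp` is the sample's start time in milliseconds; the sample data is made up:

```python
import csv
import io

# Minimal sample in JMeter's default CSV format; timeStamp is in ms since
# the epoch, elapsed is in ms. Values are invented for illustration.
sample = """timeStamp,elapsed,label,success
1000,200,Home,true
31000,250,Home,true
61000,300,Home,true
"""

def throughput(rows, label):
    """Requests per second for one label, per the steps above."""
    rows = [r for r in rows if r["label"] == label]
    start = min(int(r["timeStamp"]) for r in rows)                    # earliest start
    end = max(int(r["timeStamp"]) + int(r["elapsed"]) for r in rows)  # latest end
    return len(rows) / (end - start) * 1000.0  # ms -> seconds

rows = list(csv.DictReader(io.StringIO(sample)))
print(round(throughput(rows, "Home"), 4))  # → 0.0498
```

(One thing I'm not sure about: whether 2.3.2 records `timeStamp` as the start or the end of the sample, which would change the min/max step.)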

It works well when there is 1 sample, but not when the number of samples is greater than 1.
For one label, my code gives Throughput = 0.0999170557547, while JMeter reports Throughput = 0.03220809002805324.

Did I miss something?

Does anybody know the formula for calculating the Throughput for multiple requests?