A bit confused here. I have a few LoadRunner Analysis reports from tests I've run, and I'm new to testing. My understanding of the 90th percentile is that, because it discards the slowest 10% of samples as outliers, it presents a truer picture than the average. However, I'm looking at two different reports, and in both the 90th percentile response time is higher than the average response time shown in the Summary Report. How can that be possible?
I'm looking at the Transaction Response Time (Percentile) graph, and the last 10% shoots up sharply, which tells me that taking only the first 90% should give a lower response time.
Example: Transaction 1 (response times in seconds)

Min              0.012
Avg              1.919
Max              20.935
SD               2.718
90th percentile  6.412
A lot of the transactions look more or less like this. Why is the 90th percentile higher than the average?
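To check whether the numbers themselves are even plausible, I put together a minimal sketch in Python with numpy. The sample data is made up (a right-skewed distribution loosely shaped like the transaction above), not pulled from my reports, and I don't know exactly how LoadRunner computes its percentile, so this is just my best guess at the arithmetic:

import numpy as np

# Hypothetical right-skewed response-time sample (in seconds), loosely
# shaped like the transaction above: many fast responses, a long slow tail.
rng = np.random.default_rng(seed=0)
times = rng.lognormal(mean=0.0, sigma=1.2, size=10_000)

print(f"Average:         {times.mean():.3f}")              # lands around 2 s
print(f"90th percentile: {np.percentile(times, 90):.3f}")  # lands around 4-5 s

Even this toy data gives a 90th percentile above the average, the same ordering I see in my reports, so clearly I'm misunderstanding something about what the 90th percentile represents.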