This discussion has been locked.

Performance testing - average response time minimum and maximum (industry standard)

Hi All

I have been looking for any documentation on industry standards for the recommended average response time and maximum response time of an application during performance testing.

Does anyone know where I can find these, or what the recommended values are?

thanks

Mai

  • 0  

    Hi , You can have a look here and explore the references; unfortunately, not all of the references exist any more.

    In our lab we do not look at average (mean) response times; for UI we look at the 90th percentile and for APIs at the 95th percentile. This kind of data is also commonly used with SLOs/SLAs (Service Level Objectives/Agreements).

    We normally expect UI page navigation to take < 0.1 s and page actions that reach out to the backend < 3.0 s. Expensive actions, like saving/updating that end a transaction, depend more on the amount of work and can take > 3.0 s.

    For UI, the developer can play with the "perceived performance" to influence user experience.

    APIs should be much faster, < 0.010 seconds for a short lookup, but this also depends heavily on the work behind the call. For APIs, I think predictable response times are more important, and there should be very few exceptions.
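    To make the percentile approach concrete, here is a minimal sketch of checking a 90th/95th-percentile response time against a threshold. The thresholds match the figures above; the transaction timings and the nearest-rank percentile helper are purely illustrative, not a standard.

    ```python
    # Sketch: checking p90 (UI) and p95 (API) response times against
    # example thresholds. All timings below are made-up sample data.
    import math

    def percentile(samples, pct):
        """Nearest-rank percentile of a list of response times (seconds)."""
        ordered = sorted(samples)
        rank = max(1, math.ceil(pct / 100 * len(ordered)))
        return ordered[rank - 1]

    ui_page_actions = [1.2, 0.8, 2.9, 1.5, 3.4, 1.1, 0.9, 2.2, 1.7, 1.3]
    api_lookups = [0.004, 0.006, 0.009, 0.005, 0.012,
                   0.007, 0.006, 0.008, 0.005, 0.006]

    p90_ui = percentile(ui_page_actions, 90)
    p95_api = percentile(api_lookups, 95)

    print(f"UI p90:  {p90_ui:.3f}s (target < 3.0s)  -> "
          f"{'PASS' if p90_ui < 3.0 else 'FAIL'}")
    print(f"API p95: {p95_api:.3f}s (target < 0.010s) -> "
          f"{'PASS' if p95_api < 0.010 else 'FAIL'}")
    ```

    Note how the API data illustrates why percentiles are preferred over the mean: the mean of the API lookups is about 0.007 s, comfortably under the 0.010 s target, yet the 95th percentile (0.012 s) exposes the slow tail that end users actually notice.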


    You would normally expect this to be part of the requirements from the product (team).

    A quote from "Introduction to Computer Performance Analysis with Mathematica" by Arnold Allen: "It is part of the folklore of capacity planning that the perceived value of the average response time experienced is the 90th percentile value of the actual value. If the response time has an exponential distribution (a common occurrence) then the 90th percentile value is 2.3 times the average value. Thus, if a user has experienced a long sequence of exponentially distributed response times with an average value of 2 seconds, the user will perceive an average response time of 4.6 seconds! The reason for this is as follows: Although only 1 out of 10 response times exceeds 4.6 seconds, these long response times make a bigger impression on the memory than the 9 out of 10 that are smaller."
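    The 2.3× factor in the quote follows from the exponential distribution's inverse CDF: the 90th percentile is -mean × ln(1 - 0.9) = ln(10) × mean ≈ 2.303 × mean. A small sketch verifying this both analytically and by simulation (sample count and seed are arbitrary choices):

    ```python
    # Sketch verifying the quote's claim: for exponentially distributed
    # response times, the 90th percentile is ~2.3x the mean.
    import math
    import random

    mean = 2.0  # average response time in seconds, as in the quote
    p90_analytic = -mean * math.log(1 - 0.90)  # inverse CDF of Exp(1/mean)
    print(round(p90_analytic, 2))  # ln(10) * 2.0 ≈ 4.61 seconds

    # Cross-check by simulating a long sequence of response times
    random.seed(0)
    samples = sorted(random.expovariate(1 / mean) for _ in range(100_000))
    p90_empirical = samples[int(0.90 * len(samples))]
    print(round(p90_empirical, 2))  # close to the analytic value
    ```

    This matches the quote's example: with a 2-second mean, the 90th percentile lands at roughly 4.6 seconds.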

    How to ask questions

    Reward contributions via likes or 'verified answers'

  • 0 in reply to   

    Hi  

    Thank you for your response. I've looked at the attached links.

    I totally agree that the product team is who we need to work with, and based on their requirements we can set these guidelines (recommended average and maximum response times, or 90th/95th percentile target values).

    However, usually the first question we are asked in any project is: what is the industry standard, what do you recommend? Most teams don't know what numbers they should aim for, and they do need some kind of high-level guideline to follow. We have been using an average response time of 3 seconds (I think we may start to look at the 90th/95th percentile for this response time) and a maximum of 5 seconds or less.

    I am referring here to a web application. I know it isn't as simple as that, and depending on the task at hand (e.g. generating a report, searching, etc.) the numbers may vary. However, as a general guideline, most teams we work with need some kind of numbers to refer to. Then anything greater than 5 seconds is flagged and looked into. (It may be valid for something to take longer than 5 seconds, depending on the scenario.)
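    The flagging rule described above can be sketched in a few lines. The transaction names, timings, and the 5-second limit are illustrative stand-ins for whatever the team agrees on, not real measurements:

    ```python
    # Sketch: flag transactions whose worst observed response time exceeds
    # the agreed guideline (5 s here). Data below is made up for illustration.
    GUIDELINE_MAX_S = 5.0

    results = {
        "login":           [1.1, 1.4, 0.9, 1.2, 1.8],
        "search":          [2.5, 3.1, 2.8, 2.2, 2.9],
        "generate_report": [6.5, 7.2, 5.8, 6.1, 8.0],  # slower, may be valid
    }

    for name, times in results.items():
        worst = max(times)
        if worst > GUIDELINE_MAX_S:
            # Flagged items are reviewed; exceeding the limit may still be
            # acceptable depending on the scenario (e.g. report generation).
            print(f"FLAG {name}: max {worst:.1f}s exceeds "
                  f"{GUIDELINE_MAX_S:.0f}s guideline -- review")
    ```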

    What do you think? I feel like these guidelines still apply.

    Thanks for your help, as always.

    Mai 

  • Verified Answer

    +1   in reply to 

    Hi ,

    It's always a pleasure to assist you! I think that when you work with 3 sec at the 90th/95th percentile you set good end-user expectations. The max of 5 sec might be doable as well.

    Did you know that you can set SLAs in LRE tests? First run your test once so the transaction names are 'registered', and then you can specify an SLA for each transaction. Different types are available.

    When you agree on SLAs during the intake of the project, you can always adapt the figures during project execution. Most dev teams are reasonable, customer/end-user focused, and willing to improve.

    Always look at the performance of an application with an open mind and play the role of the customer's advocate. On paper one can produce positive test results, but in the end the customer pays for the application and needs to use it. This role is sometimes difficult, and it is nice when you can back up decisions with literature.

    Two old books that are still valid are "High Performance Web Sites" and "Even Faster Web Sites" by Steve Souders. In the latter he quotes Jakob Nielsen. The following timings are mentioned: 0.1 s (feels instantaneous), 1 s (keeps the user's flow of thought uninterrupted), and 10 s (roughly the limit for keeping the user's attention).

    Garret Rempel wrote "Defining Standards for Web Page Performance in Business Applications".

    Success, Erik


  • 0 in reply to   

    Thanks so much. Yes, I am aware of the SLA feature; we just haven't used it. We will look into doing that.