Performance testing - JMeter results



I am using JMeter (I started using it a few days ago) to simulate a load of 30 threads, using a CSV data file that contains login credentials for 3 system users.

The objective I set out to achieve was to measure 30 users (threads) logging in and navigating to a page via the menu over a time span of 30 seconds.

I have set my thread group as:

Number of threads: 30
Ramp-up Period: 30
Loop Count: 10
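
For reference, a minimal sketch of the arithmetic this configuration implies (the variable names are mine, not JMeter's):

```python
# Sketch of the sample-count arithmetic implied by the thread group above.
num_threads = 30
ramp_up_seconds = 30
loop_count = 10

# JMeter starts one new thread every ramp_up / num_threads seconds.
thread_start_interval = ramp_up_seconds / num_threads  # 1.0 s between thread starts

# Each thread executes the whole test plan loop_count times, so each
# sampler in the plan should report num_threads * loop_count samples.
samples_per_sampler = num_threads * loop_count  # 300 samples

print(thread_start_interval, samples_per_sampler)
```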

I ran the test successfully. Now I'd like to understand what the results mean, what counts as a good or bad measurement, and what could be done to improve the results. Below is a table of the results collated in JMeter's Summary Report.

I have conducted research, only to find blogs/sites telling me the same information as what is defined on the official site. One blog (Nicolas Vahlas) that I came across gave me some very useful information, but it still hasn't helped me understand what to do next with my results.

Can anyone help me understand these results and what I could do next following the execution of this test plan? Or point me in the right direction of an informative blog/site that will help me understand what to do next.

Many thanks.

[Screenshots of the JMeter Summary Report results]

3 Answers: 

In my opinion, the deviation is high.

You know your application better than all of us.

You should focus on whether the average response time you got, and the maximum response time's frequency and value, are acceptable to you and your users. The same applies to throughput.

It shows the average response time is below 0.5 seconds and the maximum response time is below 1 second, which is generally acceptable, but acceptability should be defined by you (is it acceptable to your users?). If the answer is yes, try a higher load to check how the application scales.


Your requirement mentions that you need 30 concurrent users performing different actions. The response time of your requests is low and you have a ramp-up of 30 seconds. Can you please check the total number of active threads during the test? I believe the time for which there are actually 30 concurrent users in the system is pretty short, so the average response time you are seeing is likely misleading. I would suggest running the test for longer, so that there are 30 concurrent users in the system for a sustained period; that would give a correct reading against your requirements.
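
To illustrate the point about short-lived concurrency: assuming (purely hypothetically) that each thread finishes all its loops in about 5 seconds, the peak number of simultaneously active threads stays far below 30:

```python
# Hypothetical sketch: how many threads are actually active at once when
# 30 threads ramp up over 30 s but each thread only lives ~5 s.
# The 5 s lifetime is an invented figure for illustration, not a measurement.
def active_threads(t, n=30, ramp_up=30.0, lifetime=5.0):
    """Thread i (0-based) starts at i * ramp_up / n and runs for
    `lifetime` seconds; count how many are running at time t."""
    interval = ramp_up / n
    return sum(1 for i in range(n)
               if i * interval <= t < i * interval + lifetime)

peak = max(active_threads(t) for t in range(0, 36))
print(peak)  # peaks at 5 - nowhere near 30 concurrent users
```

Lengthening the test (a higher loop count, or "Loop Forever" with a fixed scheduler duration) widens the window during which all 30 threads overlap.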

You can use the Aggregate Report instead of the Summary Report. In performance testing,

  1. Throughput - Requests/Second
  2. Response Time - 90th Percentile and
  3. Target application resource utilization (CPU, Processor Queue Length and Memory)

are commonly used for analysis. A typical SLA for websites is 3 seconds, but this requirement changes from application to application.
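
As a rough illustration of the first two metrics, here is how they might be computed from raw sample data (the response times and durations below are invented; the nearest-rank percentile shown is one common convention and may differ slightly from JMeter's exact calculation):

```python
import math

def percentile_90(response_times_ms):
    """90th percentile via the nearest-rank method."""
    ordered = sorted(response_times_ms)
    rank = math.ceil(0.9 * len(ordered))
    return ordered[rank - 1]

def throughput(sample_count, duration_seconds):
    """Requests per second over the whole test run."""
    return sample_count / duration_seconds

samples_ms = [120, 150, 180, 200, 220, 250, 300, 350, 400, 900]
print(percentile_90(samples_ms))  # 90% of requests completed at or below this value
print(throughput(300, 30))        # 300 requests over 30 s = 10 requests/second
```

The 90th percentile is usually preferred over the average because a single slow outlier (like the 900 ms sample above) inflates the mean while most users experienced far better times.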


Your test results are good, assuming the users are actually logging into the system/portal.

Samples: the number of requests sent for a particular module/sampler.

Average: the average response time across the 300 samples.

Min: the minimum response time among the 300 samples (the fastest of the 300).

Max: the maximum response time among the 300 samples (the slowest of the 300).

Standard Deviation: a measure of the variation in response time across the 300 samples.

Error: the percentage of failed requests.

Throughput: the number of requests processed per second.
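
Taken together, the columns above can be derived from raw sample data like this (the numbers are invented for illustration; JMeter's standard deviation is, to my understanding, the population form):

```python
import statistics

# Invented raw data: per-sample response times (ms), error count, and
# the wall-clock duration of the test.
samples_ms = [100, 120, 90, 200, 150, 110]
errors = 0
duration_s = 3.0

report = {
    "Samples": len(samples_ms),
    "Average": statistics.mean(samples_ms),
    "Min": min(samples_ms),
    "Max": max(samples_ms),
    "Std. Dev.": statistics.pstdev(samples_ms),  # population standard deviation
    "Error %": 100.0 * errors / len(samples_ms),
    "Throughput": len(samples_ms) / duration_s,  # requests/second
}
print(report)
```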

Hope this will help.