Annex 9 - Performance and Load Test Plan

Supply and Installation of Integrated Front
Office Service Delivery Platform

Performance and Load Test Results


Final Report
Version 1.0

ikubINFO

Audit Trail:

Date        Version    Name          Comments
            1.0        Roland Mai    Initial Revision

Table of Contents

1. References
2. Summary
3. Tools Used
4. JMeter Glossary
5. Test Results Analysis
6. Over Time Performance Charts
7. Throughput Charts
8. Response Times Charts
9. Conclusions

Table of Figures

Figure 1 Test and Report Information
Figure 2 APDEX
Figure 3 Requests Summary
Figure 4 Performance Statistics
Figure 5 Top Errors of the Test
Figure 6 Top 5 Errors by Sampler
Figure 7 Response Time over Time
Figure 8 Response Time Percentiles over Time
Figure 9 Active Threads over Time
Figure 10 Bytes Throughput over Time
Figure 11 Latencies over Time
Figure 12 Connect Time over Time
Figure 13 Hits per Second
Figure 14 Codes per Second
Figure 15 Transactions per Second
Figure 16 Total Transactions per Second
Figure 17 Response Time Vs Request
Figure 18 Latency Vs Request
Figure 19 Response Time Percentiles
Figure 20 Response Time Overview
Figure 21 Response Times Vs Threads
Figure 22 Response Time Distribution

1. References

i. Adisa.csv
ii. Adisa.jmx
iii. HtmlReports

2. Summary

Performance testing was conducted on ADISA to determine its baseline performance. Testing
was done with a basic set of tools configured at each worksite. Concurrent-user testing began
with a small number of users and gradually increased to support more and more users; this
process also helped to debug the test environment itself, fixing configuration errors and
fine-tuning the setup. The results show that the system remains stable, with no dramatic
decrease in performance as more users are added to the test. As a result, a maximum of 30
users was used in the final tests that were run. In addition to debugging the test environment
and gathering the initial test results, three bugs may have been uncovered. These bugs would
most likely not have been visible in normal functional testing, so performance testing can
claim an additional measure of success in discovering bugs that might otherwise have gone
unnoticed.

3. Tools Used

The Apache JMeter™ application is open-source software: a 100% pure Java application
designed to load test functional behavior and measure performance. It was originally designed
for testing Web applications but has since expanded to other test functions.
Apache JMeter may be used to test performance of both static and dynamic resources and Web
dynamic applications.

It can be used to simulate a heavy load on a server, group of servers, network or object to test its
strength or to analyze overall performance under different load types.

Apache JMeter features include:


 Ability to load and performance test many different application/server/protocol types:
   Web - HTTP, HTTPS (Java, NodeJS, PHP, ASP.NET, …)
   SOAP / REST Webservices
   FTP
   Database via JDBC
   LDAP
   Message-oriented middleware (MOM) via JMS
   Mail - SMTP(S), POP3(S) and IMAP(S)
   Native commands or shell scripts
   TCP
   Java Objects

4. JMeter Glossary

APDEX (Application Performance Index) - is an open standard for measuring the performance
of software applications in computing. Its purpose is to convert measurements into insights about
user satisfaction, by specifying a uniform way to analyze and report on the degree to which
measured performance meets user expectations.
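To make the definition concrete, an APDEX score can be computed from raw response times as below. This is a minimal sketch: the `apdex` helper and the 500 ms threshold are illustrative assumptions, not values taken from this report.

```python
def apdex(response_times_ms, threshold_ms=500):
    """Compute an APDEX score from a list of response times.

    Samples at or below the threshold count as 'satisfied', samples at or
    below 4x the threshold as 'tolerating', and the rest as 'frustrated'.
    Score = (satisfied + tolerating / 2) / total samples.
    """
    if not response_times_ms:
        raise ValueError("no samples")
    satisfied = sum(1 for t in response_times_ms if t <= threshold_ms)
    tolerating = sum(1 for t in response_times_ms
                     if threshold_ms < t <= 4 * threshold_ms)
    return (satisfied + tolerating / 2) / len(response_times_ms)

# 2 satisfied, 1 tolerating, 1 frustrated -> (2 + 0.5) / 4 = 0.625
print(apdex([100, 300, 700, 2500], threshold_ms=500))
```

A score of 1.0 means every sample met the threshold; lower values quantify how far measured performance falls short of user expectations.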

Response Times over Time - Graph that will display for each sampler the average response
time in milliseconds.

Performance and Load Test Results


Supply and Installation of Integrated Front
Office Service Delivery Platform

Response Times Percentiles - Graph that will display the percentiles of the response time
values. The X-axis represents a percentage, the Y-axis response time values. One point (P, Value)
means that, for the whole scenario, P percent of the values are below Value ms.
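A single point (P, Value) on this chart can be reproduced from raw data. A minimal sketch using the nearest-rank method; the `percentile` helper and the sample values are illustrative, and JMeter's exact interpolation may differ:

```python
def percentile(values_ms, p):
    """Response time below which roughly p percent of samples fall
    (nearest-rank method, a simple approximation)."""
    ordered = sorted(values_ms)
    rank = max(1, -(-p * len(ordered) // 100))  # ceil(p * n / 100)
    return ordered[int(rank) - 1]

samples = [80, 90, 95, 110, 120, 150, 200, 250, 300, 400]
print(percentile(samples, 50), percentile(samples, 90))  # 120 300
```

So the point (90, 300) would mean that 90 percent of samples completed in 300 ms or less.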

Active Threads Over Time - a simple listener showing how many active threads there are in
each thread group during a test run.

Response Times vs Threads - Graph that shows how Response Time changes with the number
of parallel threads. Naturally, the server takes longer to respond when many users request it
simultaneously.

Latency Vs Request - Latency is generally considered the amount of time from when the user
makes a request until the response gets back to that user. On a first request, for the first
14 KB, latency is longer because it includes a DNS lookup, a TCP handshake, and the secure TLS
negotiation. Subsequent requests have less latency because the connection to the server is
already established.

Time Vs Threads - Graph that shows how Response Time changes with the number of parallel
threads. Naturally, the server takes longer to respond when many users request it simultaneously.

Response Times Distribution - Graph that will display the response time distribution of the
test. The X-axis shows the response times grouped by interval and the Y-axis the number of
samples contained in each interval.
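The bucketing behind this chart can be sketched in a few lines. The `distribution` helper and the 100 ms interval width are illustrative assumptions:

```python
from collections import Counter

def distribution(times_ms, interval_ms=100):
    """Count samples per response-time interval, keyed by the
    interval's start in milliseconds."""
    return Counter((t // interval_ms) * interval_ms for t in times_ms)

# buckets: 0-99 ms -> 2 samples, 100-199 ms -> 2, 200-299 ms -> 1
print(distribution([30, 120, 180, 250, 90]))
```

Plotting bucket starts on the X-axis against counts on the Y-axis yields the distribution chart described above.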

5. Test Results Analysis

APDEX (Application Performance Index) is an open standard for measuring performance of


software applications in computing. Its purpose is to convert measurements into insights about
user satisfaction, by specifying a uniform way to analyze and report on the degree to which
measured performance meets user expectations.

Figure 1 Test and Report Information

Figure 2 APDEX

Figure 3 Requests Summary

A statistics table providing, in a single view, a summary of all metrics per transaction,
including 7 configurable percentiles.

Figure 4 Performance Statistics

An error table providing a summary of all errors and their proportion in the total requests.

Figure 5 Top Errors of the Test


A Top 5 Errors by Sampler table providing, for every Sampler (excluding Transaction
Controllers by default), the top 5 errors.

Figure 6 Top 5 Errors by Sampler

6. Over Time Performance Charts

Table reports are among the most informative in JMeter, as the "Summary Report" shows. This
report displays all main indicators for all requests in the Test Plan, including the number of
requests sent. In the table, you can spot bottlenecks or other problems at once and address
them immediately.

This graph will display for each sampler the average response time in milliseconds.

Figure 7 Response Time over Time

This graph will display the percentiles of the response time values. The X-axis represents
a percentage, the Y-axis response time values. One point (P, Value) means that, for the
whole scenario, P percent of the values are below Value ms.

Figure 8 Response Time Percentiles over Time

Active Threads over Time is a simple listener showing how many active threads there are
in each thread group during the test run.

Figure 9 Active Threads over Time

This graph will display the number of bytes sent and received by JMeter during the load
test.

Figure 10 Bytes Throughput over Time

This graph will display the response latencies during the load test. Latency is the duration
between the end of the request and the beginning of the server response.

Figure 11 Latencies over Time

This graph will display the average time to establish connection during the load test.

Figure 12 Connect Time over Time

7. Throughput Charts
Throughput is one of the non-functional metrics JMeter reports, and it falls under the
performance-testing category. It measures the server's ability to handle load: the total
number of requests processed in a given time, or TPS (transactions per second).
Throughput is a significant indicator of application performance; the higher the throughput,
the better the result, although throughput may vary with the number of threads per second.
Throughput is the number of requests sent to the server per second, and the formula is:
Throughput = (number of requests) / (total time)
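The formula above can be applied directly to raw results. A minimal sketch; the `throughput` helper and the example numbers are illustrative, not values from this report:

```python
def throughput(num_requests, first_sample_start_s, last_sample_end_s):
    """Requests per second, measured from the start of the first
    sample to the end of the last sample (times in seconds)."""
    total_time = last_sample_end_s - first_sample_start_s
    if total_time <= 0:
        raise ValueError("total time must be positive")
    return num_requests / total_time

# e.g. 600 requests completed over a 120-second window -> 5.0 req/s
print(throughput(600, 0.0, 120.0))
```

Note that JMeter measures the window from the start of the first sample to the end of the last sample, so idle ramp-down time is not counted.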

This graph will display the hits generated by the test plan to the server per second. Hits
include child samples from transactions and embedded resource hits.

Figure 13 Hits per Second

HTTP Codes per second over time (200 OK, 500 Internal Error etc.)

Figure 14 Codes per Second

This graph shows the number of transactions per second for each sampler. It counts, for
each second, the number of finished transactions.

Figure 15 Transactions per Second

This graph shows the total number of transactions per second. It counts, for each second,
the number of finished transactions across all samplers.

Figure 16 Total Transactions per Second

This graph shows how response time relates to the number of requests sent per second,
visualizing the dependency between load and responsiveness.

Figure 17 Response Time Vs Request

This graph shows how latency compares to the number of requests per second.

Figure 18 Latency Vs Request

8. Response Times Charts

Response time measures the time taken for one system node to respond to the request of
another: the time from when the system receives an input until processing is complete. For
example, if you have an API, response time tells you exactly how much time it takes to execute
it and return data in JSON. Response time measures the server's response to every single
transaction or query.

Response time starts when a user sends a request and ends when the application states that
the request has completed.

This graph will display the percentiles of the response time values. The X-axis represents
a percentage, the Y-axis response time values. One point (P, Value) means that, for the
whole scenario, P percent of the values are below Value ms.

Figure 19 Response Time Percentiles

This graph will display the response time distribution of the test. The X-axis shows the
response times grouped by interval and the Y-axis the number of samples contained in each
interval.

Figure 20 Response Time Overview

This graph shows how response time changes with the number of parallel threads. Naturally,
the server takes longer to respond when many users request it simultaneously. This graph
visualizes that dependency.

Figure 21 Response Times Vs Threads

This graph will display the response time distribution of the test. The X-axis shows the
response times grouped by interval and the Y-axis the number of samples contained in each
interval.

Figure 22 Response Time Distribution

9. Conclusions

 Threads Sample - number of requests sent

During this 5-minute performance test, which exercised the average flow of creating a new
application, a total of 545,393 requests were sent.
 Avg - arithmetic mean of all response times
The average response time across all requests was 301.64 ms.
 Minimal response time (ms)
The minimal response time recorded for a sample of a given label was 0 milliseconds.
 Maximum response time (ms)
The maximum response time recorded for a sample of a given label was 21,080 milliseconds
(approximately 21 seconds).
 Deviation - shows how far sample response times deviate from the average response time.
The standard deviation of the response time was 326.00 milliseconds.
 Error rate - the percentage of failed requests

The test recorded a maximum error rate of 4.38%, corresponding to 23,893 failed samples.
 Throughput - the number of requests processed per time unit (seconds, minutes, or
hours) by the server, calculated from the start of the first sample to the end of the
last sample.
The maximum throughput was 1,777.16 requests per second.
 KB/Sec - the amount of data downloaded from the server per second during the
performance test execution.
The average rate of data received from the server during this test was 4,648.27 KB/sec.
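The summary metrics above can be recomputed from raw results. A minimal sketch, assuming samples are available as (elapsed_ms, success) pairs; the `summarize` helper and the sample data are illustrative, not the actual test output:

```python
import statistics

def summarize(samples):
    """Aggregate JMeter-style summary metrics from (elapsed_ms, success) pairs."""
    times = [t for t, _ in samples]
    errors = sum(1 for _, ok in samples if not ok)
    return {
        "samples": len(samples),
        "avg_ms": statistics.mean(times),
        "min_ms": min(times),
        "max_ms": max(times),
        # population standard deviation, as a simple approximation
        "stdev_ms": statistics.pstdev(times),
        "error_rate_pct": 100 * errors / len(samples),
    }

data = [(100, True), (300, True), (200, False), (400, True)]
print(summarize(data))
```

Running such an aggregation against the raw .csv results allows the figures in this section to be independently verified.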
