
XXXXXXXXXX B2B Performance Assessment

CONFIDENTIAL

June, 2014


Document Revision History

| Date         | Revision | Author           | Summary of Changes           |
| Jun 6, 2014  | 1.0      | Tyler MacWilliam | Initial Draft                |
| Jun 15, 2014 | 1.1      | Ryan Glen        | Added JMeter Summary Results |


Table of Contents

1 Test Description
2 Findings
  2.1 Overall Performance
  2.2 CPU
  2.3 Memory
  2.4 Database
  2.5 Garbage Collection
3 Metrics
  3.1 Tool
  3.2 Scripts
  3.3 Profile Breakdown
  3.4 Results
4 Recommendations
  4.1 Database Connection
  4.2 Solr Configurations
  4.3 Page Caching
  4.4 Granule
  4.5 Profiler

1 Test Description
This test was run using the JMeter software to simulate 500 users. It represented the baseline load test for the B2B Webstore application.

Test Date:          6/6/2014
Test Duration:      40 Minutes
Start Time:         8:40 PST
Ramp Up Duration:   First 10 Minutes
Peak Users:         483
Peak Load Duration: 10 Minutes
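As a rough illustration of the load profile described above (a sketch only, assuming a linear ramp; the actual JMeter thread schedule is not reproduced in this report):

```python
# Sketch of the test's load shape, assuming a linear ramp to the target
# over the first 10 minutes (an assumption for illustration).
TARGET_USERS = 500
RAMP_UP_SECONDS = 10 * 60  # "Ramp Up Duration: First 10 Minutes"

arrival_rate = TARGET_USERS / RAMP_UP_SECONDS  # new users per second

def active_users(t_seconds):
    """Approximate concurrent users t seconds into the test."""
    return min(TARGET_USERS, int(arrival_rate * t_seconds))

print(round(arrival_rate, 2))  # roughly 0.83 users added per second
print(active_users(300))       # halfway through ramp-up: about 250 users
```

This is only a planning aid; the report's measured peak was 483 concurrent users, slightly under the 500-user target.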


2 Findings
2.1 Overall Performance
The system performed better than it has in previous tests, and we were able to reach 500 concurrent users. That said, there are still issues that need to be resolved. The following issues were identified:
1. Database: Response times to the database were quite slow for the majority of the test, an issue we have also seen in the past. Given the improvements to the code and caching, far fewer connections were required; however, those requests that did require a connection continued to run slowly.

2.2 CPU

CPU remained stable for the duration of the test on both servers, barely making it past 20%. This is in contrast to previous tests, where we saw CPU average ~60% and sometimes exceed 80%.

[CPU usage charts for x.y.z.4 and x.y.z.5]

The highest CPU usage was for the following stack while trying to render the LandingLayout1Page:

[CPU stack trace snapshot]

2.3 Memory
There did not appear to be any issues with memory. Heap and non-heap did not approach
max usage, even with the increase in cache size.

[Heap and non-heap memory charts for x.y.z.4 and x.y.z.5]

2.4 Database
2.4.1 Connections
Connections to the database were down significantly from previous tests, thanks to code improvements and the larger query cache. The maximum was around 291/sec, whereas in previous tests it hit ~3500/sec. Additionally, as more users connected the number of connections remained relatively constant, whereas in previous tests the connections kept increasing. You can see that x.y.z.5 had fewer connections, but this was likely due to the thread-locking issue with the database (see below).
[Database connection charts for x.y.z.4 and x.y.z.5]

2.4.2 Response Times

In this test x.y.z.4 had better overall and maximum response times for database queries; however, the maximum times are quite long (~4-6 secs). Slow database connections can significantly impact the performance of the application. In this test x.y.z.5 had a large number of threads that were blocked waiting for responses from the database. You can also see that x.y.z.5 had a much slower average database query response time of 2-3 secs.

[Database query response time charts for x.y.z.4 and x.y.z.5]

Just prior to running the test, a SQL Max performance test was run on both servers. This test has hybris insert 10,000 rows into the database and records the time required. Here are the results:
| Test                                                  | x.y.z.4   | x.y.z.5    |
| Time to add 10,000 rows                               | 10,622 ms | 21,554 ms  |
| Time to add 10,000 rows using max() queries           | 68,109 ms | 136,891 ms |
| Time to add 10,000 rows using max() queries and index | 11,339 ms | 32,204 ms  |
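The shape of that benchmark can be sketched as follows (a minimal illustration using Python's built-in sqlite3 rather than MS SQL Server or hybris; the table name, column names, and row count are invented for the example):

```python
import sqlite3
import time

def time_inserts(conn, n, use_max=False):
    """Time n single-row inserts; optionally run a max() query before each
    insert, mimicking the 'using max() queries' variant of the test."""
    cur = conn.cursor()
    start = time.perf_counter()
    for i in range(n):
        if use_max:
            cur.execute("SELECT max(pk) FROM sample")  # extra query per row
        cur.execute("INSERT INTO sample (pk, payload) VALUES (?, ?)", (i, "x"))
    conn.commit()
    return (time.perf_counter() - start) * 1000  # milliseconds

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sample (pk INTEGER, payload TEXT)")

plain_ms = time_inserts(conn, n=1_000)  # smaller n keeps the demo quick

# Adding an index on pk is what recovers most of the max() penalty above.
conn.execute("CREATE INDEX idx_sample_pk ON sample (pk)")
max_indexed_ms = time_inserts(conn, n=1_000, use_max=True)

print(f"plain: {plain_ms:.0f} ms, max()+index: {max_indexed_ms:.0f} ms")
```

An in-memory SQLite database will of course show very different absolute timings than the networked MS SQL Server measured above; the sketch only demonstrates the structure of the comparison.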
Unfortunately, the stats we collect internally for this test are against a MySQL server, not MS SQL Server. However, the results against a MySQL server are typically <5s, 40-50s and <10s, respectively. Your results are much higher, which could contribute to slow response times. We will continue to look for MS SQL Server times to compare, but there is a significant difference between what we normally see and what is being seen at XXXXXXXXXX.

2.5 Garbage Collection

There were no issues identified with garbage collection during this test. The test did not require a major collection, which could be seen as a risk given that we don't know how the app would respond to one at full load. However, we have seen a major collection occur in the past at half load with no issues, so the assumption is that it should not be a problem.

3 Metrics
3.1 Tool
Performance tests were done using JMeter (open source) and were executed in the Pre-production environment.

3.2 Scripts
Scripts were generated to simulate various user profiles covering the test objectives:
1) Full Order Flow - User logs in, searches for a SKU and adds it to the cart; this is repeated 20 times, then the user navigates to Checkout, completes the order and logs out.
2) Quick Order Upload - User logs in, uploads a CSV file in Quick Order for 20 SKUs, adds to cart, proceeds to checkout, completes the order and logs out.
3) Quick Order Manual - User logs in, manually enters 10 SKUs, adds to cart, then proceeds to the cart, removes the items and logs out.
4) Product Views - User logs in, searches for a SKU and opens the Product Detail Page; repeats 20 times with different SKUs, then logs out.
5) SKU Look Up - User logs in and performs a SKU Look Up; repeats 20 times with different SKUs, then logs out.
6) Site Navigation - User logs in, performs keyword searches, re-sorts results, navigates result pages, performs search refinements, searches and opens orders in Order History, navigates various content pages and logs out.
7) DM Login - DM user logs in, navigates various content pages, performs invoice searches, opens invoice details and logs out.
8) Invoice Searches - User logs in, performs various invoice searches, opens invoice details and logs out.

3.3 Profile Breakdown

| Profile            | % of Total Users |
| Full Order Flow    | 30               |
| Quick Order Upload |                  |
| Quick Order Manual | 30               |
| Product Views      | 10               |
| SKU Look Up        | 12               |
| Site Navigation    | 10               |
| DM Login           |                  |
| Invoice Searches   |                  |

3.4 Results
The results were recorded and snapshots taken at 50, 100, 150, 200, 300, 400 and 500 users, with a final snapshot after 500 users had been running for 10 minutes. The results were compared against the originally proposed benchmarks, and cases where a benchmark was not met were identified. All times are in ms.

| Transaction                | Objective | 50    | 100   | 150   | 200   | 300   | 400   | 500   | Final |
| Homepage                   | 4000      | 267   | 308   | 207   | 186   | 160   | 278   | 608   | 760   |
| Category and Product pages | 1000      | 319   | 445   | 380   | 468   | 563   | 678   | 932   | 998   |
| Add to cart                | 1200      | 904   | 1794  | 2591  | 3950  | 3769  | 4671  | 6383  | 6747  |
| Quick add                  | 1200      | 19445 | 22604 | 26584 | 38427 | 39525 | 44437 | 44410 | 58265 |
| Inventory Check            | 1500      | 79    | 93    | 86    | 127   | 279   | 423   | 1002  | 1518  |
| Search                     | 1000      | 467   | 670   | 675   | 892   | 1176  | 1335  | 2302  | 2797  |
| Place order                | 2000      | 4252  | 4886  | 5470  | 8327  | 5683  | 12730 | 12499 | 13616 |
| View order history         | 1200      | 264   | 278   | 437   | 1330  | 3069  | 11160 | 36643 | 42873 |
| View Invoice history       | 4000      | 111   | 149   | 158   | 394   | 2052  | 5151  | 15914 | 23752 |
| Login                      | 4000      | 1157  | 1418  | 1827  | 3308  | 4330  | 12432 | 53428 | 61858 |
| Order Upload (Excel)       | 5000      | 566   | 586   | 866   | 926   | 1268  | 1546  | 1430  | 1569  |
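A benchmark comparison of this kind is easy to automate. The sketch below (illustrative structure only, using a subset of the figures from the results table) flags each snapshot in which a transaction exceeded its objective:

```python
# Objectives and per-snapshot response times (ms) for a subset of the
# transactions in the results table above, for illustration.
OBJECTIVES_MS = {"Homepage": 4000, "Search": 1000, "Inventory Check": 1500}
RESULTS_MS = {
    "Homepage":        [267, 308, 207, 186, 160, 278, 608, 760],
    "Search":          [467, 670, 675, 892, 1176, 1335, 2302, 2797],
    "Inventory Check": [79, 93, 86, 127, 279, 423, 1002, 1518],
}
SNAPSHOTS = ["50", "100", "150", "200", "300", "400", "500", "Final"]

def benchmark_misses(objectives, results):
    """Return {transaction: [snapshots where the objective was exceeded]}."""
    return {
        name: [s for s, ms in zip(SNAPSHOTS, results[name]) if ms > obj]
        for name, obj in objectives.items()
    }

print(benchmark_misses(OBJECTIVES_MS, RESULTS_MS))
# Homepage meets its objective throughout; Search misses from 300 users
# onward; Inventory Check misses only in the final snapshot.
```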

4 Recommendations
Our recommendations for changes prior to the next test run are as follows:

4.1 Database Connection

The DBAs and Network teams should work together to resolve the slow database response times, especially for x.y.z.5. The performance test only lasted 40 minutes, but given the number of threads blocked waiting for responses from the database, this would cause major issues once the site has gone live and runs for more than an hour at 500 concurrent users.

4.2 Solr Configurations

There is a known issue in hybris that results in the Solr configurations being pulled and converted with each search request. This conversion is in place because it is required to fit the way the hybris code is architected, but the constant conversions lead to performance hits. hybris PS has passed along code to cache the configurations so that the conversion is only necessary when a configuration does not already exist in the cache.
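The caching pattern described can be sketched as follows (illustrative only; load_and_convert_config is an invented stand-in, not an actual hybris API):

```python
from functools import lru_cache

def load_and_convert_config(config_name):
    """Stand-in for the expensive pull-and-convert step performed per search
    request (hypothetical; the real conversion is internal to hybris)."""
    print(f"converting {config_name}")  # shows when the real work runs
    return {"name": config_name, "converted": True}

@lru_cache(maxsize=None)
def get_solr_config(config_name):
    # With the cache in front, conversion happens only on a cache miss.
    return load_and_convert_config(config_name)

get_solr_config("productSearch")  # converts once
get_solr_config("productSearch")  # served from cache, no conversion
```

In a real deployment the cache would also need an invalidation path for when configurations change; the sketch omits that for brevity.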

4.3 Page Caching

Given that some of the highest CPU usage comes from rendering a landing page, it is recommended that options for page caching be reviewed. It is possible to cache the static elements of the page, leading to significant response time improvements. Caching can be done via a CDN (such as Akamai) or via a caching server such as Varnish or NGINX. If neither of those options is satisfactory, hybris PS has a package that implements page fragment caching for some standard components (header, footer, etc.).
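To illustrate the fragment-caching idea (a minimal sketch only; this is not the hybris PS package API, and the TTL and keys are invented):

```python
import time

class FragmentCache:
    """Tiny TTL cache for rendered page fragments (header, footer, etc.)."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (rendered_html, expiry_time)

    def get_or_render(self, key, render_fn):
        html, expires = self._store.get(key, (None, 0.0))
        if time.monotonic() < expires:
            return html                    # cache hit: skip rendering
        html = render_fn()                 # miss or expired: render once
        self._store[key] = (html, time.monotonic() + self.ttl)
        return html

cache = FragmentCache(ttl_seconds=60)
header = cache.get_or_render("header", lambda: "<header>...</header>")
# A second request within the TTL reuses the first render:
header_again = cache.get_or_render("header", lambda: "<header>NEW</header>")
print(header_again)
```

The design trade-off is the usual one: a longer TTL saves more CPU but delays the visibility of content changes in the cached fragments.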

4.4 Granule
Given that a CDN is not currently in place, it is recommended that Granule or another JavaScript/CSS compression tool be used to speed up the loading of web pages.

4.5 Profiler
It is strongly recommended that a system profiler be run in production to help both the hybris and XXXXXXXXXX operations teams understand how the site is performing. The profiler should be able to report response times from the app server as well as end-user response times, and should monitor key stats such as CPU and memory and count any exceptions thrown by the application. hybris PS typically recommends New Relic or Dynatrace. Both require licenses and some lead time to get configured, but without a profiler you will have a much harder time understanding how your site is performing and troubleshooting the issues that will at some point occur.
