
How to do performance testing & the performance test life cycle

Performance Testing Goals:


It is conducted to accomplish the following goals:
Verify the application's readiness to go live.
Verify whether the desired performance criteria are met.
Compare performance characteristics/configurations of the application against the standard.
Identify performance bottlenecks.
Facilitate performance tuning.

#1. Requirement Analysis/Gathering

The performance team interacts with the client to identify and gather technical and business requirements. This includes getting information on the application's architecture, the technologies and database used, the intended users, functionality, application usage, test requirements, hardware & software requirements, etc.

#2. POC/Tool selection

Once the key functionalities are identified, a POC (proof of concept, a limited demonstration of the real activity) is done with the available tools. The choice of performance test tool depends on the cost of the tool, the protocol the application uses, the technologies used to build the application, the number of users to be simulated for the test, etc. During the POC, scripts are created for the identified key functionality and executed with 10-15 virtual users.
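For illustration only, here is a minimal sketch of what such a POC script could look like if the open-source Locust tool were the candidate; the host and the /catalogue endpoint are hypothetical placeholders for whatever key functionality the POC covers:

# poc_script.py - minimal POC script (assumes the Locust tool is being evaluated)
from locust import HttpUser, task, between

class CatalogueUser(HttpUser):
    # simulated think time between actions
    wait_time = between(1, 3)

    @task
    def browse_catalogue(self):
        # key business transaction identified for the POC (hypothetical endpoint)
        self.client.get("/catalogue")

A short POC run with 10-15 virtual users could then be launched headless, for example: locust -f poc_script.py --host https://app-under-test.example --users 15 --spawn-rate 5 --run-time 5m --headless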

#3. Performance Test Plan & Design

Based on the information collected in the preceding stages, test planning and design are carried out.

Test planning captures how the performance test is going to take place: the test environment for the application, workload, hardware, etc.

Test design is mainly about the type of test to be conducted, the metrics to be measured, metadata, scripts, the number of users and the execution plan.

During this activity, a Performance Test Plan is created. This serves as an agreement before moving ahead and also as a road map for the entire activity. Once created, this document is shared with the client to establish transparency on the type of application, test objectives, prerequisites, deliverables, entry and exit criteria, acceptance criteria, etc.

Briefly, a performance test plan includes:

a) Introduction (Objective and Scope)


b) Application Overview
c) Performance (Objectives & Goals)

d) Test Approach (User Distribution, Test Data Requirements, Workload Criteria, Entry & Exit Criteria, Deliverables, etc.)
e) In-Scope and Out-of-Scope
f) Test Environment (Configuration, Tool, Hardware, Server Monitoring, Database, test
configuration, etc.)
g) Reporting & Communication
h) Test Metrics
i) Roles & Responsibilities
j) Risks & Mitigation
k) Configuration Management

#4. Performance Test Development

Use cases are created for the functionality identified in the test plan as the scope of PT. These use cases are shared with the client for approval, to make sure the script will be recorded with the correct steps.
Once approved, script development starts with recording the steps of the use cases using the performance test tool selected during the POC. The scripts are then enhanced by performing correlation (for handling dynamic values), parameterization (value substitution) and custom functions as the situation requires. More on these techniques in our video tutorials.
The scripts are then validated with different users.
In parallel with script creation, the performance team also keeps working on setting up the test environment (software and hardware).
The performance team will also take care of metadata (back-end) through scripts if this activity is not taken up by the client.
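As an illustration of what correlation and parameterization mean in practice, here is a minimal sketch using Python's requests library; the login URL, the csrf_token field and the users.csv data file are hypothetical and stand in for whatever dynamic values and test data the real application needs:

import csv
import re
import requests

session = requests.Session()

# Correlation: capture a dynamic value from the server response so the
# recorded script can replay it instead of a stale hard-coded value.
login_page = session.get("https://app-under-test.example/login")
match = re.search(r'name="csrf_token" value="([^"]+)"', login_page.text)
csrf_token = match.group(1) if match else ""

# Parameterization: replace recorded literal values with test data read
# from an external file, so each iteration uses a different user.
with open("users.csv", newline="") as f:
    for row in csv.DictReader(f):
        response = session.post(
            "https://app-under-test.example/login",
            data={"username": row["username"],
                  "password": row["password"],
                  "csrf_token": csrf_token},
        )
        print(row["username"], response.status_code)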
#5. Performance Test Modeling

A performance load model is created for the test execution. The main aim of this step is to validate whether the performance metrics provided by the client are achieved during the test. There are different approaches to creating a load model; Little's Law is used in most cases.
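As a reminder of how Little's Law drives the load model, the sketch below computes the number of concurrent virtual users from an assumed target throughput, response time and think time; the figures are illustrative, not taken from this document:

def concurrent_users(throughput_tps, response_time_s, think_time_s):
    # Little's Law: N = X * (R + Z)
    # N = concurrent users, X = throughput (transactions/sec),
    # R = response time, Z = think time
    return throughput_tps * (response_time_s + think_time_s)

# Example: to sustain 5 transactions/sec with a 2 s response time
# and 18 s of think time, the model needs 5 * (2 + 18) = 100 VUsers.
print(concurrent_users(5, 2, 18))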

#6. Test Execution

The scenario is designed according to the load model in the Controller or Performance Center, but the initial tests are not executed with the maximum number of users in the load model.

Test execution is done incrementally. For example, if the maximum number of users is 100, the scenario is first run with 10, 25 and 50 users, eventually moving on to 100 users.
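The same incremental ramp-up can be expressed in a tool's scenario configuration. Purely as an assumption for illustration (using the open-source Locust load-shape API rather than Controller/Performance Center), a stepped load could be sketched like this:

from locust import LoadTestShape

class SteppedLoad(LoadTestShape):
    # (end of stage in seconds, target user count) - illustrative values
    stages = [(300, 10), (600, 25), (900, 50), (1200, 100)]

    def tick(self):
        run_time = self.get_run_time()
        for end_time, users in self.stages:
            if run_time < end_time:
                # return (user count, spawn rate) for the current stage
                return (users, users)
        return None  # all stages done, stop the test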

#7. Test Results Analysis

Test results are the most important deliverable for the performance tester. This is where
we can prove the ROI (Return on Investment) and productivity that a performance testing
effort can provide.
Some of the best practices that help the result analysis process:

a) Give a unique and meaningful name to every test result; this helps in understanding the purpose of the test.
b) Include the following information in the test result summary:

Reason for the failure(s)

Change in the performance of the application compared to the previous test run
Changes made in the test, the application build or the test environment
It is a good practice to prepare a result summary after each test run so that analysis results do not have to be compiled every time the test results are referred to.
PT generally requires many test runs to reach the correct conclusion.
It is good to have the following points in the result summary:
Purpose of test
Number of virtual users
Scenario summary
Duration of test
Throughput
Graphs
Graph comparisons
Response Time
Errors encountered
Recommendations
There might be recommendations, such as configuration changes, for the next test. Server logs also help in identifying the root cause of a problem (such as bottlenecks); deep diagnostic tools are used for this purpose.
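To make the result summary figures concrete, here is a minimal sketch of how throughput, response-time percentiles and error rate could be computed from raw samples; the field names and figures are assumptions for illustration only:

import statistics

def summarize_run(response_times_s, error_count, duration_s):
    # Build the kind of per-run result summary described above.
    completed = len(response_times_s)
    return {
        "throughput_tps": completed / duration_s,
        "avg_response_time_s": statistics.mean(response_times_s),
        "p90_response_time_s": statistics.quantiles(response_times_s, n=10)[-1],
        "errors": error_count,
        "error_rate": error_count / (completed + error_count),
    }

# Example: 1,800 successful samples over a 10-minute run with 12 failures
# summary = summarize_run(samples, error_count=12, duration_s=600)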

In the final report, all the test summaries are consolidated.

#8. Report

Test results should be simplified so that the conclusion is clear and does not need any derivation. The development team needs more information on the analysis, the comparison of results, and details of how the results were obtained.

A test report is considered good if it is brief, descriptive and to the point.

The following guidelines will smooth this step out:

Use appropriate headings and summaries


The report should be presentable so that it can be used in management meetings.
Provide supporting data for the results.
Give meaningful names to the table headers.
Share the status report periodically, even with the clients.
Report issues with as much information and evidence as possible in order to avoid unnecessary correspondence.
The final report to be shared with the client has the following information:

Execution Summary
System Under Test
Testing Strategy
Summary of Test Results
Problems Identified
Recommendations
Along with the final report, all the deliverables as per the test plan should be shared with the client.

Conclusion

We hope this article has given a process-oriented, conceptual and detailed view of how performance testing is carried out from beginning to end.
Can we do performance testing manually?
Yes, you can do performance testing manually. For this, you should open many active sessions of the application and test it out. It also depends on what type of performance test you want to do. However, in general you can judge the number of active sessions, the number of DB connections open, the number of threads running (taking a Java-based web application as an example), and the amount of CPU time and memory being used, by using a performance viewer. You can use IBM Tivoli Performance Viewer, which is also available as a trial version. Usually the test is done by deploying the application on the server, accessing the application from multiple client machines and making multiple threads run. The performance viewer should of course be installed on the server.
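As a very rough sketch of driving such a manual check from code, the snippet below opens several concurrent sessions against a placeholder URL and records response times; the URL and session count are assumptions, and a real check would also watch the server-side metrics mentioned above:

import threading
import time
import urllib.request

URL = "https://app-under-test.example/"   # placeholder application URL
SESSIONS = 20                             # number of concurrent sessions to open
results = []

def one_session():
    # open one session, fetch the page and record the elapsed time
    start = time.time()
    with urllib.request.urlopen(URL) as response:
        response.read()
    results.append(time.time() - start)

threads = [threading.Thread(target=one_session) for _ in range(SESSIONS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"{len(results)} sessions, average response {sum(results) / len(results):.2f}s")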

The Core Performance Testing Lifecycle


Good performance testing isn't just about generating load onto a system; it's about generating the correct load and achieving accurate results. I view performance testing as a sub-project in its own right. A performance consultant should spend time elaborating and fleshing out the requirements; they should also take time to put the results of a performance test back into the context of the business. A performance testing assignment is a small technical project that requires more than just scripting skills; it involves communication with disparate parties within a project. Here is a broad, high-level outline of the steps I attempt to put in place when faced with an assignment:

The Core Performance Testing Phases


Performance Testing Requirements Phase:
Define Business Objectives of System
Elaboration of key Performance Requirements and Business flows
High Level Models: Sketch out performance scenarios based on requirements
Decompose performance scenarios into more detailed descriptions (suitable for
performance scripting)
Performance Execution Phase:
Begin Scripting/Coding
Informal Testing
Formal Testing

Results and Summary Phase


Formal Performance Results
Management Summary

It may seem like a lot of steps just to execute a performance test, but I find that the documents produced do not have to be extensive; they just have to be a few pages.
When to start Performance Testing?
Performance testing starts in parallel with the Software Development Life Cycle (SDLC). NFR elicitation happens in parallel with the System Requirement Specification (SRS).

Now we will see the phases of Performance Testing Life Cycle (PTLC).

1. Non-Functional Requirements Elicitation and Analysis


2. Performance Test Strategy
3. Performance Test Design
4. Performance Test Execution
5. Performance Test Result Analysis
6. Benchmarks and Recommendations

1. Non-Functional Requirements Elicitation and Analysis

Understanding the non-functional requirements is the first and most critical phase in PTLC.

Entry Criteria

Application Under Test (AUT) Architecture

Non-Functional Requirement Questionnaire

Tasks

Understanding AUT architecture

Identification and understanding of critical scenarios

Understanding Interface details

Growth pattern

Exit Criteria

Client signed-off NFR document

2. Performance Test Strategy

This phase defines how to approach performance testing for the identified critical scenarios. The following are to be addressed during this phase.

1. What kind of performance testing?


2. Performance tool selection

3. Hardware and software environment set up


Entry Criteria

Signed-off NFR document

Tasks

Prepare the Test Strategy and review

Data set-up

Defining in-scope and out-of-scope

SLA

Workload Model

Prepare Risks and Mitigation and review

Exit Criteria

Baselined Performance Test Strategy document

3. Performance Test Design

This phase involves script generation using the identified testing tool in a dedicated environment. All script enhancements should be done and unit tested.

Entry Criteria

Baselined Test Strategy

Test Environment

Test Data

Activities

Test Scripting

Data Parameterization

Correlation

Designing the action and transactions

Unit Testing

Exit Criteria

Unit-tested performance scripts

4. Performance Test Execution

This phase is dedicated to the test engineers, who design scenarios based on the identified workload and load the system with concurrent virtual users (VUsers).

Entry Criteria

Baselined Test scripts

Activities

Designing the scenarios

Loading the test script

Test script execution

Monitoring the execution

Collecting the logs

Exit Criteria

Test script execution log files

5. Performance Test Result Analysis

The collected log files are analyzed and reviewed by experienced test engineers. Tuning recommendations will be given if any conflicts are identified.

Entry Criteria

Collected log files

Activities

Create graphs and charts

Correlating various graphs and charts

Prepare detailed test report

Test report analysis and review

Tuning recommendation

Exit Criteria

Performance Analysis Report

6. Benchmark and Recommendations


This is the last phase in PTLC, which involves benchmarking and providing recommendations to the client.

Entry Criteria

Performance Analysis Report

Activities

Comparing results with earlier execution results

Comparing with the benchmark standards

Validating against the NFR

Prepare Test Report presentation

Exit Criteria

Performance report reviewed and baselined
