
An Architecture to Automate Performance Tests on Microservices

André de Camargo
Graduate Program in Computer Science
Federal University of Santa Catarina – Florianópolis, Brazil
IBM – São Paulo, Brazil
[email protected]

Ivan Salvadori
Graduate Program in Computer Science
Federal University of Santa Catarina – Florianópolis, Brazil
[email protected]

Ronaldo dos Santos Mello
Department of Informatics and Statistics
Federal University of Santa Catarina – Florianópolis, Brazil
[email protected]

Frank Siqueira
Department of Informatics and Statistics
Federal University of Santa Catarina – Florianópolis, Brazil
[email protected]

iiWAS '16, November 28-30, 2016, Singapore, Singapore
© 2016 ACM. ISBN 978-1-4503-4807-2/16/11
DOI: http://dx.doi.org/10.1145/3011141.3011179

ABSTRACT
The microservices architecture provides a new approach to developing applications. As opposed to monolithic applications, in which the application comprises a single software artifact, an application based on the microservices architecture is composed of a set of services, each one designed to perform a single, well-defined task. These services allow the development team to decouple parts of the application, using different frameworks, languages and hardware for each part of the system. One of the drawbacks of adopting the microservices architecture is testability. In a single application, test boundaries can be established more easily and tend to be more stable as the application evolves, while with microservices we can have a set of hundreds of services that operate together and are prone to change more rapidly. Each one of these services needs to be tested, and its tests updated as the service changes. In addition, the different characteristics of these services, such as languages, frameworks or the infrastructure used, have to be considered in the testing phase. Performance tests are applied to assure that a particular piece of software complies with a set of non-functional requirements, such as throughput and response time. These metrics are important to ensure that business constraints are respected and to help find performance bottlenecks. In this paper, we present a new approach that allows performance tests to be executed in an automated way, with each microservice providing a test specification that is used to perform the tests. Along with the architecture, we also provide a framework that implements some key concepts of this architecture. This framework is available as an open source project (https://github.com/asdcamargo/fpts).

CCS Concepts
• Software and its engineering → Software organization and properties → Software system structures → Distributed systems organizing principles • Computer systems organization → Architectures → Distributed architectures • Software and its engineering → Software organization and properties → Extra-functional properties → Software performance

Keywords
Microservices; test automation; performance test.

1. INTRODUCTION
The microservices architecture has grown in popularity in the past few years due to its capacity to overcome some of the major problems found in monolithic applications [4]. The key concept behind microservices is to develop an application as a set of small services, with each service implementing a single feature and running in its own process, infrastructure and programming language.

The concept of microservices is based on a well-known principle in software engineering. The idea of separating an application into more manageable components is present in several approaches to software design, such as SOA (Service Oriented Architecture) [10, 11]. However, SOA deals with integration between applications, whereas the microservices architecture is a way of organizing a single application. The main difference is the granularity level: microservices are fine-grained services designed to provide specific, unique features, while in SOA features are provided through services kept under the same structure, deployed as a single unit and running on the same infrastructure.

As applications have grown in size and complexity, development time and cost have also increased [4]. Besides that, we can notice major difficulties in scaling, synchronizing changes, delivering new features and replacing frameworks or libraries [12]. This leads to the search for a new way to mitigate these problems. In this scenario, the microservices architecture arises as a possible solution for overcoming these difficulties.
However, microservices are not a silver bullet for all the problems found in monolithic applications [13]. Although this architecture can overcome some of the difficulties faced with monolithic applications, it also raises challenges of its own [2, 13, 15]: orchestration complexity in the deployment of multiple services, managing and controlling the tests for each service, and the maturity required from the team to coordinate changes, define service boundaries, and control shared libraries and code reuse.

Test challenges in adopting the microservices architecture include the same ones found in distributed systems in general: inter-service communication, coordination between services and distributed transactions [2]. In fact, microservices bring another challenge, which is to properly split the application into several components [4]. This means that we now have to establish the test boundaries of each service, which will have a set of tests to evaluate its compliance with the existing requirements. Moreover, this set of tests may involve other services, which makes it a challenge to track the test dependencies of each microservice.

We can quote the Gilt Group, which moved its online sales platform from a monolithic architecture to a set of microservices. After the migration, they ended up with 156 microservices [8]. Testing each of these services separately is a hard task. Moreover, keeping each test specification up to date with constant service changes is also difficult. The same challenge applies to performance tests: each service may run on its own infrastructure, with different frameworks and programming languages. By knowing the performance parameters of each service, the proper infrastructure can be provided to reduce costs. In addition, bottlenecks can be identified and corrected to improve the performance of each service individually. Innovative approaches to improve tests on microservices are demanded in this scenario.

The work done so far on testing microservices includes the development of a repository to centralize acceptance tests that use Behavior Driven Development (BDD) [1]. Another study evaluated the performance of the hardware when using different types of containers to run microservices [11]. Regarding performance tests on REST (Representational State Transfer) web services, [15] presents a framework covering the different phases of performance test execution. None of these works focuses on performance tests in a microservices architecture. In addition, [21] evaluated the most recent contributions on microservices, and 3 studies out of 23 focus on test improvements for microservices. All three contributions deal with test automation, but none focuses on performance tests in particular.

For the purpose of this work, we consider end-to-end performance tests. This means that a remote client sends the requests over the network to the target service.

The architecture proposed in this paper is designed to overcome the test challenges in microservices, more precisely for performance tests. Concerning performance tests, it is extremely important to evaluate the performance that each service can deliver individually. This data will dictate the amount of load that a service can handle and the resources spent on infrastructure, and will help to find bottlenecks and to control the impact of changes on performance. Our goal is to automate the execution of performance tests by attaching a specification that contains the test parameters to each service. Besides the test specification, the test application must be configured with parameters such as the number of threads and requests, load time, test duration, and so on. We understand that these parameters can be easily defined and reused between tests; however, the test specification tends to be unique for each operation of a service. That being said, in this paper we focus on the test specification and how it can be provided by the service to the test application.

The remainder of this paper is organized as follows. In Section 2 we present the background on microservices and software testing, covering performance tests in more detail. Section 3 describes the related work and identifies gaps for improvement in the present scenario. Section 4 introduces the architecture for automating the performance tests of microservices. Section 5 presents a framework that was implemented to show some features of the architecture. Section 6 shows an evaluation of how the framework affects the overall performance of an application. Finally, Section 7 presents the conclusions and perspectives for future work.

2. BACKGROUND

2.1 Microservices
The microservices architecture refers to an approach for developing applications as a set of small, loosely coupled services that use a lightweight network protocol to communicate with each other [4]. The approach targets the internal organization of an application, not the application's external boundaries, such as its communication with other applications or services.

This approach allows breaking the application's complexity into a set of services, so that a change affects only one component and not the whole application. Besides that, each service can be provisioned according to its hardware constraints, using only the necessary resources and the best technology for each purpose, leading to cost reductions in development and infrastructure [12].

Considering a web-services-based application, moving from a monolithic application to a microservices architecture affects only the internal structure of the application. The exposed services remain as they were in the monolithic application [9], as illustrated in Figure 1. Therefore, external applications/APIs do not need to change after the migration from monolithic to microservices.

Figure 1. Microservices Application Exposing the Same Interfaces as the Monolithic Application [9].

Some key features of the microservices architecture are [12]:

• Technology Heterogeneity: With the application composed as a set of independent and loosely coupled services, we have the freedom to pick the best tool that fulfills each need. That can be the full application stack, from the programming language to the application server. Also, since services are small, we can replace a technology with low risk.

• Resilience: The key concept of resilience is to isolate failures so that they do not affect the whole system. The natural independence of the services in a microservices architecture allows key services to be isolated in their own infrastructure, keeping them working even when other services fail.

• Scalability: Each service boundary is well defined and organized around business capabilities. This sort of granularity allows each service to be scaled as needed, reducing costs and providing only the necessary resources, as opposed to monolithic applications, where the whole application has to be scaled.

• Ease of Deployment: In a microservices architecture we can change a single service and deploy it independently in a short time. This allows changes to be developed and tested faster and with lower risk when compared to monolithic applications, where even small changes require the whole application to be deployed. Besides that, deployment problems can be easily isolated in a microservices architecture, allowing fast rollback and service restoration.
2.2 Software Testing
Software testing is the process of executing a program with the intent of finding errors [16]. It is mainly divided into two methods: white-box and black-box [16]. The black-box method disregards the system's internal behavior; the test considers only the input and output data. The white-box method, on the other hand, considers the system's internal structure, causing each statement of the program to be executed at least once. In software testing there are also different test levels or stages [19, 20]:

• Unit testing: The objective of this phase is to test each component or software unit individually and independently, without considering other parts of the application.

• Integration testing: This level focuses on testing whether the components work well after they are integrated. For this test, the application components are bound together and assembled in a single artifact.

• System testing: This level deals with end-to-end tests. The objective is to take the requirements raised during analysis and check whether all of them are met by the application.

• Acceptance testing: At this level, the final users exercise the system to check whether the delivered solution meets their needs.

In this context, performance tests belong to the black-box method and to the system testing level. In addition, performance tests applied to web applications are end-to-end tests, which means that the test requests are sent over the network to the target server.

In performance testing, the purpose of the test is to find out whether the software complies with a set of non-functional requirements such as throughput, response time and availability. In fact, performance testing is a generic term that may refer to a set of different types of performance-related testing [17, 18], such as load tests, stress tests and capacity tests. Therefore, when using the term performance test we are actually referring to any test that measures the stability, performance, scalability and throughput of a web application [18]. For the purpose of this work, we use the term performance test, since the present study can be applied to any of these types of test.

3. RELATED WORK
Regarding acceptance tests, Rahman et al. [1] present an approach to be used in agile processes with BDD. This work tackles the specific problem of reusing the steps of test scenarios. Its contribution is an architecture that centralizes the acceptance tests of each service in one repository that can be shared across multiple services. The focus of that work is acceptance tests with BDD; we, however, focus on performance tests.

A study on microservice performance from the hardware perspective was presented by Amaral et al. [11]. The main goal of that paper was to analyze the difference, from the performance perspective, between two types of containers used to run microservices: master-slave and nested-container. In that work, a benchmark was used to measure CPU and network usage. The authors conclude that the nested-container is a more suitable model due to improved resource sharing. Our main objective, in contrast, is the performance of the application instead of the hardware.

Kao et al. [15] present a mechanism to execute performance tests on RESTful web applications. In that particular paper, the focus is on covering the whole process of executing performance tests: test case design, test script generation and test execution. We claim that this approach is too focused on the tester side and requires some knowledge of the test procedures. In addition, our goal is to allow test automation without human intervention. In this scenario the main problem is to manage many test specifications (one for each operation); parameters such as workload and test duration can be easily customized in a test application such as JMeter [14].

Our search for papers in the field of performance testing for microservices has shown that there is a gap for improvement in this area. None of the works found is directly related to performance tests on microservices. Besides that, testing microservices is described as one of the main challenges of this architecture [2, 13].

Performance tests on microservices are even more relevant considering the number of services that may compose an application, which can easily reach hundreds of services [8]. We have to consider that it is necessary to know the performance of each service individually. This way we can provide the necessary resources for the infrastructure, find bottlenecks in the application and fulfill QoS requirements such as response time and throughput. Controlling the tests of each of these services can be hard. Each service operation must be associated with a set of parameters to be used in the performance test. Therefore, a means to automate the generation of these parameters will allow the execution of the performance tests in a seamless and decoupled way.

4. THE ARCHITECTURE
The architecture is designed to provide a clean and easy way to evaluate the performance of microservices. Our solution is based on the HTTP specification and can be easily implemented by developers.

The solution is divided into two concepts:

• A specification that allows external applications to access the test parameters that will be used to test the microservice. This specification consists of a set of contents with the necessary information to compose the requests that will be issued in the tests.

• A mechanism to attach and expose the specification. This includes how the specification is defined on the application side and how an external application can obtain this specification and further use it to run the performance test.

The key to this approach is to attach the specification to the microservice. This way, each service operation has its own specification, test boundaries are well defined across the application, and there is a means to centralize all performance tests within one application, i.e., the test application. Additionally, we ensure that the test specification is consistent with each new release. The architectural view of the solution is represented in Figure 2.

Figure 2. Architectural View of the Proposed Solution

We have chosen to provide the test specification as a response to a request that uses the HTTP OPTIONS method on a target URI. The target URI for obtaining test specifications is the very same through which the service provides its resources/operations. The OPTIONS method is often used as a way for the system to provide information about its resources [3]. This kind of information may include available requests/responses, requirements and server capabilities. Therefore, the proposed solution uses a method that does not affect the regular ones already provided by the service, such as POST and GET. This kind of separation allows external applications to get the test specification and operate in an automated manner.
The specification provides to the test application (a.k.a. Test Runner) a document that contains: all methods that are available on the requested path, sample data to be used as request parameters, a description that gives a semantic meaning to each method, and validation data that can be used to validate the response.

Besides the parameters, the service also provides a JSON Schema [7], which allows the customization of the test parameters.

Let us say that the architecture specified in Figure 2 is implemented in some financial system and that one of the microservices provides operations over financial transactions. A financial transaction has an 'ID', 'Country', 'Name', 'Month' and 'Year'. Besides that, it also contains a list of account details, each one with an 'Account' and two attributes for amount values: 'AmountUsd' and 'AmountPln'. This particular microservice has two operations: find a transaction by its ID, and save a new transaction. Both operations can be requested by sending, respectively, GET and POST HTTP requests to the following URI: /rest/finance.

As this microservice is designed to allow automated performance tests, a test application can obtain the test specification needed to perform such a test by sending an OPTIONS request to '/rest/finance'. For this particular request, we get a response such as the one represented in Figure 3.

{
  "POST": {
    "description": "Save a new financial transaction",
    "parameter": "{
      "name": "000023989",
      "country": "766",
      "acctMonth": "10",
      "acctYear": "2015",
      "accountLines": [{
        "amountUsd": 1000,
        "amountPlan": 1000.0,
        "account": "0083289"}]}",
    "parameter-schema": "{
      "type": "object",
      "id": "urn:jsonschema:com:finance:company:journal:bean:FinancialTransaction",
      "properties": {
        "id": {"type": "string"},
        "name": {"type": "string"},
        "country": {"type": "string"},
        "acctMonth": {"type": "string"},
        "acctYear": {"type": "string"},
        "accountLines": {"type": "array", "items": {"type": "object",
          "id": "urn:jsonschema:com:finance:bean:AccountDetails",
          "properties": {"id": {"type": "string"},
            "amountUsd": {"type": "number"},
            "amountPlan": {"type": "number"},
            "account": {"type": "string"}}}}}}",
    "validation": "{"headerData": {"status": "200"}}"
  },
  "GET": {
    "description": "Get transaction by id",
    "parameter": "{"parameter": "id", "value": "1"}",
    "validation": "{"bodyData": {"name": "000023989",
      "country": "766", "acctMonth": "10", "acctYear": "2015"},
      "headerData": {"status": "200"}}"
  }
}

Figure 3. Example of a Response for an OPTIONS Request

With this proposal, we want to provide a clear and simple specification to the test application. Moreover, we want the test application to find enough information to execute performance tests on the target URI for all operations that the service accepts on this URI. This includes the parameters to use in the request and the rules to validate the response. These validations allow the execution of some functional tests along with the performance tests.
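To make the retrieval step concrete, the short sketch below shows how a test runner could issue the OPTIONS request and read the returned document. It is only an illustration of the idea, not part of the FPTS framework; the host and port (localhost:8080) are assumed, and the JDK 11+ HttpClient is used for brevity.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SpecFetcher {
    public static void main(String[] args) throws Exception {
        // Hypothetical test-runner step: ask the microservice for its test
        // specification by sending an OPTIONS request to the resource URI.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/rest/finance")) // assumed host/port
                .method("OPTIONS", HttpRequest.BodyPublishers.noBody())
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        // The body is expected to be a JSON document like the one in Figure 3,
        // with one entry per HTTP method supported on the URI.
        System.out.println(response.body());
    }
}

Regular GET or POST requests to the same URI are untouched by this mechanism; only OPTIONS requests are answered with the specification.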
5. FRAMEWORK
To validate the proposed architecture, we have developed an implementation of a framework for performance test specification (a.k.a. FPTS) in Java using the Spring Boot framework [6]. Spring Boot offers an embedded Tomcat or Jetty server along with Web, MVC and Persistence frameworks, all set up with minimal configuration overhead, which makes it appropriate for hosting and developing microservices [5, 6].

The key concept of the framework's operation is to act as a filter for HTTP requests to the microservice. The framework checks each request sent to the service and, in the case of an OPTIONS request, generates the specification present in the service and returns the full test specification for the requested URI. As the requested URI may allow different HTTP methods, the resulting response can contain more than one test specification (see Figure 3 as an example).

Figure 4. Framework Behavior for HTTP Requests

As one can see, the framework works as a layer in front of the service itself. By filtering each message that reaches the service, the framework is able to respond to OPTIONS requests with the test specification, while all other requests are forwarded to the service. This means that the regular operations provided by the service are not affected by the framework's behavior.
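The servlet filter below is a minimal sketch of this interception pattern, not the actual FPTS filter. It assumes a hypothetical SpecificationBuilder helper that renders the JSON of Figure 3 for a given path, answers OPTIONS requests directly, and hands every other request over to the service unchanged.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class PerformanceSpecFilter implements Filter {

    // Hypothetical helper that inspects the annotated methods of the service
    // and renders the test specification for a given path as JSON.
    private final SpecificationBuilder specBuilder = new SpecificationBuilder();

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;
        if ("OPTIONS".equalsIgnoreCase(request.getMethod())) {
            // Answer OPTIONS requests with the test specification for this URI.
            response.setContentType("application/json");
            response.getWriter().write(specBuilder.buildFor(request.getRequestURI()));
        } else {
            // Any other method is forwarded untouched to the service.
            chain.doFilter(req, res);
        }
    }

    @Override
    public void init(FilterConfig filterConfig) { /* no initialization needed in this sketch */ }

    @Override
    public void destroy() { /* nothing to release in this sketch */ }
}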
5.1 Implementation
We focused our implementation on ease of use while, at the same time, keeping the framework as transparent as possible, giving the developer the option to include or remove it during the build process of the service. With that in mind, we used Java annotations to give the developer a clean and easy way to write the methods that return the test specifications. In addition, we used the Maven build engine to split the framework into two artifacts: one for the API and one for the implementation.

In the API artifact we placed the annotations and the classes used to build the specification itself. The implementation artifact contains the filter that handles the requests to the service, as well as the main classes that build the test specification from the annotation properties and the object returned by the annotated method. This separation allows the developer to compose build directives and choose whether or not to include the framework implementation, without any code change. At runtime, the API looks for the provider class present in the implementation artifact; if the implementation is not found, the framework capabilities are turned off and the service operates without the framework.

To keep the framework as independent as possible, we used reflection to extract the specification from the service and compose the response. This way, the framework keeps no direct reference or dependency to the service. It also allows the framework to work seamlessly, without interfering with other requests. This implementation also makes it possible to remove the framework without any code change, which may be useful when deploying the application to production or other controlled environments.
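As an illustration of the reflection step, the snippet below scans a service instance for methods carrying the @PerformanceTest annotation and invokes them to obtain the TestSpec objects. It is a simplified sketch of the described mechanism, not the framework's actual code, and it assumes the annotation and TestSpec types used in Section 5.2.

import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class SpecificationExtractor {

    // Collect the test specifications exposed by a service instance by
    // reflecting over its methods and invoking the annotated ones.
    public List<TestSpec<?>> extract(Object service) throws Exception {
        List<TestSpec<?>> specs = new ArrayList<>();
        for (Method method : service.getClass().getMethods()) {
            PerformanceTest annotation = method.getAnnotation(PerformanceTest.class);
            if (annotation != null) {
                // The annotated method returns the TestSpec with sample data
                // and validation rules for one service operation.
                TestSpec<?> spec = (TestSpec<?>) method.invoke(service);
                specs.add(spec);
            }
        }
        return specs;
    }
}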
5.2 Usage
From the developer's viewpoint, the framework allows microservice developers to expose test specifications with minimum overhead by using Java annotations on the methods that return the specification. The first step is to add the API library to the service. After that, the developer has to implement the specification methods, one for each service operation that will be exposed for performance testing. The final step is to add the framework filter to the list of web filters; this ensures that the framework is called on each request to the microservice.
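In a Spring Boot service, this last registration step could look like the sketch below. FilterRegistrationBean is standard Spring Boot (shown here with the Spring Boot 2+ generic signature); the PerformanceSpecFilter class name simply stands in for the framework's filter and is not prescribed by FPTS.

import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PerformanceTestConfig {

    // Register the framework filter so that every request to the
    // microservice passes through it before reaching the service.
    @Bean
    public FilterRegistrationBean<PerformanceSpecFilter> performanceSpecFilter() {
        FilterRegistrationBean<PerformanceSpecFilter> registration =
                new FilterRegistrationBean<>(new PerformanceSpecFilter());
        registration.addUrlPatterns("/rest/*"); // paths whose operations are exposed for testing
        return registration;
    }
}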
Let us consider the same example described in the previous section (see Figure 2). Now we want to expose a test specification for the save operation. In our example, this operation is accessed by sending a POST request to the '/rest/finance' endpoint. To achieve this, we create a new method in the service class and annotate it with the @PerformanceTest annotation, as shown in Figure 5.

@PerformanceTest(path = "/rest/finance",
    httpMethod = HttpMethodEnum.POST,
    description = "Save a new financial transaction")
public TestSpec<FinancialTransaction> getTestSpecForSave() throws IOException {
    FinancialTransaction testObj =
        new FinancialTransaction("000023989", "766", "10", "2015");
    AccountDetails accDetails =
        AccountDetails.build("0083289", 1000d, 1000d);
    testObj.setAccLines(accDetails);
    TestValidationsBuilder validationBuilder = new TestValidationsBuilder();
    TestSpec<FinancialTransaction> testSpec =
        new TestSpec<FinancialTransaction>(testObj,
            validationBuilder.buildHeaderStatus200AndEntityBody(testObj));
    return testSpec;
}

Figure 5. Framework Usage to Expose a Specification

The Java annotation is the means by which the framework finds out which methods are exposed for performance testing, as well as the specification object (TestSpec) that will be used by the test application. The object returned by the method invocation contains the parameters that will be used for the service invocation. In the provided sample, the return is a JSON object of type 'FinancialTransaction'. Besides the parameters, the 'TestSpec' also stores the validation data. In the given example, we validate only the header status parameter, checking whether its value is 200 (OK).

The annotation holds the path on which the service provides the operation, a description, and the HTTP method used by the operation. This data, along with the 'TestSpec' object, composes the full specification (Figure 3). When the service provides more than one operation on the same path, the framework generates the specification for each operation and appends them in a single response (Figure 3).

Please note that we placed the attribute values directly in the code to show the link between them and the generated specification, represented in Figure 3 as the POST object. In order to enhance the test capabilities, it is recommended to use dynamic values obtained from a database.
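For readers who want to picture the annotation itself, a declaration along the following lines would support the usage shown in Figure 5. This is a hypothetical sketch consistent with the elements used there (path, httpMethod, description), not the definitive FPTS source.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marks a method that returns the test specification for one service
// operation. Retained at runtime so the filter can discover it by reflection.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface PerformanceTest {
    String path();                // URI on which the operation is exposed
    HttpMethodEnum httpMethod();  // HTTP method of the operation (e.g. POST)
    String description();        // human-readable meaning of the operation
}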
5.3 Key Features
Our goal with this framework is to provide a clean and easy way to expose the specification for the performance test. The results achieved with this framework were very promising; we were able to add some useful features, such as:

• Framework API: This allows the developer to remove the framework from the application during the build. This is particularly useful when building the project for production deployment, as we may not want to expose the test methods.

• Annotation capabilities: This provides a clean and simple way for the developer to define the specifications. In addition, this facility allows the developer to choose which service methods will be exposed.

• Dynamic data for testing: This refers to the possibility of using the database to fetch data for the test specification and validation. With this, the developer can bypass changes that the data may suffer, which could otherwise lead to test failures. Let us say that we want to test a GET operation that fetches some data by its ID. We can query the database, get the object with ID = 1, and provide that same object for validation in the specification. Although this object may change, the framework will always build the updated data, and the test application will have the refreshed data to use.

• Annotation processor: This feature checks whether the return type of the annotated method is valid. This check happens at compile time, which prevents errors at runtime and allows the developer to fix these inconsistencies quickly. Figure 5 presents a correct method. If the return type of the annotated method is different or does not have the generic parameter, the compiler throws an error. If we set the return type as 'TestSpec' without the generic class, the error presented in Figure 6 is thrown.

Figure 6. Compile Error for Invalid Return Type

• Build features for validation data: Common validation types have been created in order to append the validation data to the specification, such as: validate only the header status, and validate both header and body. This provides a quick way for the developer to define the validation rules and add some functional test statements to make the performance tests more reliable.

6. EVALUATION
As part of the evaluation process, we set up a financial application in two scenarios: with the framework and without the framework. Our goal with this evaluation is to check the impact of the framework usage on the application's performance, since the framework captures all requests, even the ones that are not OPTIONS requests. By doing the evaluation, we want to check whether the framework affects the service performance and could potentially influence the performance test results.

6.1 Methodology
For this evaluation, we consider the number of samples as the total number of requests. To discover the ideal number of samples to use, we performed a pilot test to obtain the standard deviation and the mean; for this particular test, we used the application with the framework, a choice made in a random trial. The pilot test load configuration was 20 threads with 50 requests each, resulting in a total of 1,000 requests; the result of this test is shown in Figure 7.

Figure 7. Results for the Pilot Test

For the response time, the result was an average of 226 ms with a standard deviation of 275 for this pilot test. Also, we used a confidence level of 99%, which gave us a z value of 1.96, and a margin of error of 0.5%. With these values, we applied the following formula to discover the sample size needed:

Necessary Sample Size = (Z × StdDev) / (margin of error)
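For reference, a commonly used form of this estimate, assuming the margin of error E is an absolute value expressed in the same unit as the standard deviation sigma and z is the score for the chosen confidence level, is the squared version:

n = \left( \frac{z \cdot \sigma}{E} \right)^{2}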
The final result for the necessary sample size was approximately 10,000. We divided the requests (samples) into 25 threads sending 400 requests each, resulting in 10,000 requests. In addition, to overcome undesired variations such as network problems, we ran two rounds of tests for each application and, for comparison, we took the mean of those two rounds. To define the test order, we ran a random trial with the two possible values: with the framework (WF) and without the framework (NF). The final order was WF, NF, NF, WF.

6.2 Configuration and Execution
We deployed both applications on Amazon EC2 using an instance of type t1 micro (1 vCPU, 3.75 GB memory and SSD storage). Each test round was executed with only one of the applications running at a time. Besides that, the virtual machine was dedicated to running the application; no processes other than those of the operating system were executing.

For this evaluation, we focused on two parameters: throughput and response time. The performance tests were executed using the JMeter tool [14].

The test procedure consisted of a POST request using a 'FinancialTransaction' object. The object is the very same as the one presented in Figure 5, identified there as the 'testObj' variable. On the microservice side, the object is inserted into an in-memory database to avoid bottlenecks on the service side; the same object, with the ID attribute auto-generated by the service, is returned as the response.

6.3 Results
The results of the tests with the framework are presented in Table 1 in the order in which they were performed. Table 2 contains the mean for each case (WF and NF).

Table 1 – Test results for the four rounds

Test round | Avg. Response Time (seconds) | Avg. Throughput (per second)
1 – WF     | 117                          | 50.64
2 – NF     | 122                          | 48.66
3 – NF     | 119                          | 49.17
4 – WF     | 119                          | 50.07

Table 2 – Throughput and response time mean for each case

Case | Response Time Mean (seconds) | Throughput Mean (per second)
WF   | 118                          | 50.35
NF   | 120.5                        | 48.91

Based on the results, we cannot assert that the framework caused any impact on the application's performance. As we can see, the average response time with the framework was even lower than the one found in the application without the framework. From the implementation, these results were expected, because for requests that do not use OPTIONS we only forward the request to the service; no processing is done on the framework side other than the HTTP method check.

Additionally, the absence of impact on the service performance allows the framework to be deployed in any environment. The framework could be used, for example, to track down performance issues in a production environment. In addition, the service provider could use the framework to offer clients a means to run the tests themselves and verify that they are getting the minimum QoS parameters they contracted. This can be used to give guarantees and reliability to the service clients, as they can attest to it by themselves, with minimal effort, using the specification provided by the service.

Overall, the evaluation results are promising and show that the framework can be used without restrictions or side effects.

7. CONCLUSION
Microservices emerged as an alternative for developing applications and overcoming some of the challenges found in traditional monolithic applications. The decoupled nature of the services gives more flexibility in deployment, development, scalability and choice of technologies. This flexibility ends up increasing the complexity of some development activities, such as testing.

Focusing on performance tests in applications based on the microservices architecture, it is noticeable that for a large set of services (e.g. 100) the manual control of each test specification (generally, one for each service operation) becomes a hard task.

In order to simplify the execution of tests, we have presented in this paper an architectural model that allows automating the execution of performance tests in applications based on the microservices architecture.

Our approach was developed to be easily integrated, demanding only small changes in the application. The main feature of our proposal is to embed the test specification in each service. This way, we enforce that it will be consistent with the service implementation despite any changes the service may have undergone. We also employed the OPTIONS HTTP method to expose the specification and allow external testing applications to access it at runtime.

Along with the architecture, we have developed a framework that implements some of the planned features. We tested the framework in an evaluation study. The evaluation shows that the framework has no impact on service execution.

Overall, the result of this study is promising. The proposed architecture is simple and effective, providing a clean and easy way to automate performance tests on microservices. The developed framework is easy to use and integrate, and does not impact the performance of the service operations, which allows it to be used in any environment and to collect realistic performance parameters.

As a next step, we intend to evolve the framework, providing a client-side API to facilitate its integration with well-known performance testing tools such as JMeter. With this improvement, we can also develop a central repository to manage all performance tests in a single place. This repository can come with a web module that allows test runs to be triggered quickly. We can also provide a historical database to store test results. This would be interesting for analyzing the impact of changes over the lifetime of applications with large sets of microservices.

A good use of the framework is to attach a test execution library to the service. This would allow the execution of performance tests on the infrastructure where the service runs. To achieve this, Docker containers could be used to provide lightweight clients to send the test requests. The main advantage of this approach is to give a realistic performance result that depends only on the service implementation and the infrastructure, removing the network, which may cause disruptions in the performance values.

Another improvement is to increase the capabilities for functional tests. With this, we can provide a large set of validations that will allow the execution of functional tests using the same approach we have adopted for performance tests.

Finally, we also intend to implement the framework in other languages, such as Go and Python. This is important because one of the premises of a microservices architecture is to be polyglot in terms of programming languages.
8. REFERENCES
[1] Rahman, Mazedur, and Jerry Gao. A Reusable Automated Acceptance Testing Architecture for Microservices in Behavior-Driven Development. Service-Oriented System Engineering (SOSE), 2015 IEEE Symposium on. IEEE, 2015.
[2] Namiot, Dmitry, and Sneps-Sneppe, Manfred. On Micro-services Architecture. International Journal of Open Information Technologies, v. 2, n. 9, p. 24-27, 2014.
[3] HTTP OPTIONS Method. Last access: 2016-03-02. Available: https://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html.
[4] Fowler, Martin, and Lewis, James. Microservices. Last access: 2016-03-15. Available: http://martinfowler.com/articles/microservices.html.
[5] Building a Microservice Architecture with Spring Boot and Docker, Part I. Last access: 2016-03-25. Available: http://www.3pillarglobal.com/insights/building-a-microservice-architecture-with-spring-boot-and-docker-part-i.
[6] Microservices with Spring. Last access: 2016-04-25. Available: https://spring.io/blog/2015/07/14/microservices-with-spring.
[7] JSON-Schema Specification. Last access: 2016-04-04. Available: http://tools.ietf.org/html/draft-zyp-json-schema-04.
[8] Scaling Microservices at Gilt. Last access: 2016-04-12. Available: http://www.infoq.com/news/2015/04/scaling-microservices-gilt.
[9] Clark, Kim. Microservices, SOA, and APIs: Friends or enemies? Last access: 2016-05-15. Available: http://www.ibm.com/developerworks/websphere/library/techarticles/1601_clark-trs/1601_clark.html.
[10] What are microservices? Last access: 2016-03-19. Available: https://opensource.com/resources/what-are-microservices.
[11] Amaral, Marcelo, et al. Performance Evaluation of Microservices Architectures Using Containers. Network Computing and Applications (NCA), 2015 IEEE 14th International Symposium on. IEEE, 2015.
[12] Newman, Sam. Building Microservices. O'Reilly Media, Inc., 2015.
[13] Experience from Failing with Microservices. Last access: 2016-05-08. Available: http://www.infoq.com/news/2014/08/failing-microservices.
[14] Apache JMeter. Last access: 2016-03-19. Available: http://jmeter.apache.org/.
[15] Kao, Chia Hung, Chun Cheng Lin, and Juei-Nan Chen. Performance Testing Framework for REST-based Web Applications. Quality Software (QSIC), 2013 13th International Conference on. IEEE, 2013.
[16] Myers, Glenford J., Corey Sandler, and Tom Badgett. The Art of Software Testing. John Wiley & Sons, 2011.
[17] Types of Performance Testing. Last access: 2016-03-23. Available: https://msdn.microsoft.com/en-us/library/bb924357.aspx.
[18] 7 Types of Web Performance Tests and How They Fit Into Your Testing Cycle. Last access: 2016-03-23. Available: http://blog.smartbear.com/software-quality/7-types-of-web-performance-tests-and-how-they-fit-into-your-testing-cycle/.
[19] Thakare, Sheetal, Savita Chavan, and P. M. Chawan. Software Testing Strategies and Techniques. International Journal of Emerging Technology and Advanced Engineering, 2012.
[20] Sathawornwichit, Chaiwat, and Shigeru Hosono. Consistent Testing Framework for Continuous Service Development. Proceedings of the 2014 Asia-Pacific Services Computing Conference. IEEE Computer Society, 2014.
[21] Pahl, Claus, and Pooyan Jamshidi. Microservices: A Systematic Mapping Study.
