Performance Testing AJAX-based Applications
Performance Testing for AJAX-based Applications
Rajendra Gokhale
Aztecsoft itest
ABSTRACT
INTRODUCTION
AJAX APPLICATION VS NORMAL WEB APPLICATION
GOOGLE SUGGEST – A CASE STUDY
CHALLENGES IN PERFORMANCE TESTING AJAX APPLICATIONS
  Definition of Performance Goals and Metrics
  User Modeling
  Scripting and Load Simulation
CONCLUSION
REFERENCES
The AJAX model of development for Web applications has rapidly gained popularity because of its promise of bringing the richness and responsiveness of desktop applications to the web. AJAX implementations are fundamentally different from other web implementations in two respects: they make asynchronous requests for parts of the web page rather than synchronous requests for entire pages, and these requests are issued by a client-side engine running in the browser. Techniques routinely used for performance testing of traditional web applications need to be modified and enhanced to suit the needs of AJAX-based applications. Using Google's "Google Suggest" service as a case study, we examine the unique challenges of carrying out performance testing of AJAX-based applications and offer suggestions for overcoming them.
Considering that one of the key drivers for the rapid adoption of AJAX has been its promise of superior performance, it is surprising that there has not been much discussion of AJAX-specific performance testing. When we studied this in some detail, we found that AJAX applications indeed present some unique issues and challenges, which we discuss in this paper.
In contrast to traditional web applications, which issue a synchronous request and reload the entire page in response, AJAX applications make a number of asynchronous web requests for parts of the current webpage. These requests are issued by a piece of client-side code that is executed in the browser context. This client-side code is usually implemented in JavaScript and is called the AJAX engine.
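A minimal sketch of such an engine is shown below. The endpoint name ("/suggest"), the JSON response shape, and the injected `fetchFn` parameter are our own illustrative assumptions, not part of any real API; in a browser, `fetchFn` would simply be `window.fetch`.

```javascript
// Minimal sketch of an AJAX engine: client-side code that issues
// asynchronous requests for fragments of the current page instead of
// reloading it. Endpoint name and response shape are assumptions.
function createAjaxEngine(fetchFn) {
  return {
    // Request a page fragment asynchronously and hand the parsed
    // result to a callback that updates part of the page in place.
    fetchFragment(query, onUpdate) {
      return fetchFn("/suggest?q=" + encodeURIComponent(query))
        .then((response) => response.json())
        .then((data) => onUpdate(data));
    },
  };
}
```

Injecting the transport function rather than calling a global keeps the sketch usable both in the browser and inside a load-test harness.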
The following table summarizes the key differences between traditional and AJAX web applications:
Done right, the AJAX approach can yield a number of important advantages:
- Since the response does not contain the entire page, a smaller amount of data gets transferred across the network, thus resulting in better network utilization.
- Instead of reloading an entire page, AJAX applications update only parts of the page, thus improving the responsiveness of the application.
It is therefore imperative that an AJAX application be put through a thorough performance testing
cycle before it is released for general use. We have selected “Google Suggest” as a real-world
example so that our discussions retain a practical flavor.
“Google Suggest Dissected” [4] contains a detailed discussion of how this functionality is implemented, but we list below some key implementation details that are most relevant for our purpose, as we will see in Section ____:
- Asynchronous requests for phrase completions are sent to the server at regular time intervals. The time interval between these requests is determined dynamically and is a function of the latency observed by the specific client.
- No request is sent if the user has not typed anything since the last request was sent.
- The results obtained dynamically are cached. This comes in useful when the user erases something he has typed, since the cached results can be reused, thus saving some unnecessary requests to the server.
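The behaviors listed above can be sketched as a small client-side component. All names here are our own illustrative choices, not Google's actual implementation, and the request is shown synchronously for brevity where the real engine works asynchronously.

```javascript
// Sketch of the client-side behaviors described above: skipping a
// request when nothing new was typed, and caching results so erased
// text can be served locally without a round trip to the server.
function createSuggestClient(sendRequest) {
  const cache = new Map(); // prefix -> previously fetched completions
  let lastSent = null;     // last prefix for which a request went out

  return {
    // Called on each timer tick with the current contents of the box.
    tick(prefix) {
      if (prefix === lastSent) {
        return null; // nothing typed since the last request: stay quiet
      }
      if (cache.has(prefix)) {
        lastSent = prefix;
        return cache.get(prefix); // e.g. user erased text: reuse cache
      }
      lastSent = prefix;
      const completions = sendRequest(prefix); // asynchronous in reality
      cache.set(prefix, completions);
      return completions;
    },
  };
}
```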
The goals, and therefore the metrics, for the performance testing of AJAX applications are not the same as for traditional applications. Two of the most widely used traditional measures for web applications, “page views per unit time” and “clicks per minute”, are meaningless in an AJAX context. A user could theoretically be looking at the same page for hours on end without ever clicking any URL, submitting any form or refreshing the page. Although the user may have viewed just one page, he might have generated tens of thousands of requests and may actually have been glued to the screen during this entire period! One example of such an application might be a dashboard for monitoring a chemical plant. The dashboard could get updated at regular intervals without generating even a single page view over an extended period of time.
On the other hand, some performance goals (and metrics) that are relevant for AJAX applications are largely or completely inapplicable for non-AJAX applications.
Optimization of the AJAX engine: While traditional applications need to make sure that all components at the server are properly designed, tuned and configured, in an AJAX application the AJAX engine acts like an intermediate client-side server. When a Google Suggest user, for example, enters a string in the search box, the AJAX engine pre-fetches a certain number of words and phrases that are consistent with the text the user has already entered. This allows the user to select a search item from the options suggested by Google, thus saving typing effort and enhancing the user experience. It is a great idea, but it introduces a few problems. The AJAX engine could make too many requests and choke the network and/or the web server, or it could make too few requests and lag behind the user (especially if the user is a fast typist). It would therefore seem reasonable to vary the frequency of AJAX requests in “Google Suggest” as a function of the network speed and the user's typing abilities, in other words to optimize the AJAX engine. It must be noted that, while there are numerous dimensions that optimization may address (e.g. network utilization, number of computations on the client or server, responsiveness of the application, etc.), whatever criteria are chosen, optimization of the AJAX engine is an important goal for any performance testing effort for AJAX applications. In the context of “Google Suggest”, the performance tester could ask:
“What is the function that currently determines the frequency at which the AJAX engine makes an AJAX request, where the two inputs are (1) observed server response times and (2) observed user typing speeds?”
For the sake of optimization, the next question would be how to optimize that function. The answer of course would depend on the application.
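As a concrete illustration, such a scheduling function might look like the following sketch. The formula, weights and bounds are our own assumptions for illustration, not Google's actual algorithm.

```javascript
// Hypothetical scheduling function for an AJAX engine: derive the
// interval between suggestion requests from (1) observed server
// response time and (2) the user's observed typing speed.
// All constants below are illustrative assumptions.
function requestIntervalMs(avgLatencyMs, charsPerSecond) {
  // Time between keystrokes; guard against a zero typing speed.
  const perKeystrokeMs = 1000 / Math.max(charsPerSecond, 0.1);
  // Do not poll faster than the server can answer, but keep up with
  // a fast typist where the server allows it.
  const interval = Math.max(avgLatencyMs * 1.5, perKeystrokeMs);
  return Math.min(Math.max(interval, 50), 2000); // clamp to sane bounds
}
```

A performance test could then probe how request volume and perceived lag change as the two inputs are varied.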
Clearly, one needs to factor in many more variables when modeling the load on an AJAX application, which makes the process of load modeling much more complex. We feel that one important element of performance test planning is the method for simulating this behavior. Should one adopt a simplistic approach and generate (say) an AJAX request at statically determined intervals, or should one go all the way and try to accurately model a real-life scenario? This is not an easy question, depending as it does upon the nature of the application. This question also crops up in performance testing for normal applications but is much more acute here.
E.g., a typing speed distribution:
- 10% fast (x words/sec)
- 70% medium (y words/sec)
- 20% slow (z words/sec)
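A load script could draw each virtual user's typing speed from such a weighted mix, for example as follows; the band names, weights and speeds are placeholders for whatever the test plan specifies.

```javascript
// Draw a virtual user's typing speed from a weighted mix of bands.
// The weights must sum to 1; rand is injected so that load scripts
// can seed it deterministically for repeatable runs.
function sampleTypingSpeed(rand, mix) {
  let r = rand(); // uniform in [0, 1)
  for (const band of mix) {
    if (r < band.weight) {
      return band.wordsPerSec;
    }
    r -= band.weight;
  }
  // Guard against floating-point rounding at the top of the range.
  return mix[mix.length - 1].wordsPerSec;
}
```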
- Response time distribution: The AJAX engine dynamically determines the frequency of intermediate requests based on its observed response times. When response times are higher, it lowers the frequency of requests, thereby reducing the overall number of requests sent in a given duration. It is therefore important to simulate this response time distribution accurately. Note that this is different from the network topology modeling carried out during routine performance testing.
We attempted to gauge the degree of difficulty involved in creating test scripts for AJAX applications by actually creating performance testing scripts for both versions of the Google search engine using the load testing tool Jakarta JMeter [9].
CONCLUSION
Our overall conclusion is that, although performance testing for AJAX applications is significantly more
challenging than that for traditional applications, it is certainly practical to attempt. For this to be
successful there is a need for identifying the special challenges involved and incorporating solutions in
a well-designed methodology. We hope that some of the challenges highlighted in this paper may help
in these efforts.
REFERENCES
1. http://www.adaptivepath.com/publications/essays/archives/000385.php. The original article by Jesse James Garrett that popularized the term AJAX.
2. http://www.developer.com/java/other/article.php/3554271. A good discussion of the benefits of using AJAX, including the business case.
3. http://www.onlamp.com/pub/a/onlamp/2005/06/09/rails_ajax.html. Describes “Ruby on Rails”, a framework popular among AJAX developers.
4. http://serversideguy.blogspot.com/2004/12/google-suggest-dissected.html. “Google Suggest Dissected” - a good discussion of how “Google Suggest” is implemented (at the browser side).
5. http://www.baekdal.com/articles/Usability/XMLHttpRequest-guidelines - Thomas Baekdal’s usability guidelines for XMLHttpRequest.
6. http://www.webperformanceinc.com/library/reports/AjaxBandwidth/index.html - “Using AJAX to improve the Bandwidth Performance of Web Applications”, by Christopher L. Merrill of Web Performance, Inc., published January 15, 2006.
7. http://www.techweb.com/wire/showArticle.jhtml?articleID=165702733
8. http://hinchcliffe.org/archive/2005/08/18/1675.aspx
9. http://jakarta.apache.org/jmeter. The official site for the load testing tool Jakarta JMeter.