Automated Software Testing Magazine, July 2012
Automated Software Testing MAGAZINE
July 2012, $8.95
Register Now!
October 15-17, 2012, BWI Airport Marriott, Linthicum, MD
Tutorials
Concurrent Presentations
TABOK Certification Training & Exam
Discussion Forums & Networking
4th ATI Automation Honors
Keynotes
www.testkitconference.com
Automated Software Testing
July 2012, Volume 4, Issue 2
Contents
Flaws & Technology
By understanding current shifts and trends in technology, one can make reasonable assumptions about where testing and test automation are headed, and can thus chart a course to move in that direction. It is important, however, to move into the future without all of the same mistakes from the past. This issue focuses on moving into the realm of new technology without all of the same old flaws in our approaches.
Features
If repeating the same action while expecting different results is the definition of insanity, then automators are often insane! Read this article for help escaping the crazy cycle by avoiding common automation flaws. By Clinton Sprauve
Mobile offers an array of unique challenges for testing, and in addition, changes test automation from a nice-to-have to a must-have. Read this article to learn how to implement a winning mobile test automation strategy. By Yoram Mizrachi
I BLog To U 34
Go On A Retweet 36
Impenetrable Systems: Learn how to automate a simple penetration test in 3 steps.
Cloud Services Wrapped Around Open Source: A look at some up-and-coming Cloud services with a reliance on open source test tools.
Open Sourcery 10
www.automatedtestinginstitute.com
Editorial
Mobile, Virtualization and The Cloud... The rate of speed with which these words are becoming cliché is only being outpaced by the rate at which the technology represented by these words is rising in significance.

The indicator that probably received the most focus was the one that addressed technological shifts. Before actually discussing current shifts and trends, I invoked the Volume 1, Issue 2 article by Linda Hayes entitled "The Evolution of Automated Software Testing" in order to reveal how past technology trends provoked responses from the test automation discipline. I then discussed current technological trends and revealed evidence for what we can expect as a response from test automation. Even if you weren't at the presentation, I bet you could guess that the primary technological shifts discussed were Mobile, Virtualization and The Cloud. There is just no escaping these popular buzzwords. The rate of speed with which these words are becoming cliché is only being outpaced by the rate at which the technology represented by these words is rising in significance. You are therefore advised to learn as much about these technologies as possible and prepare yourself as testing and test automation become increasingly intertwined with them.

This issue of the AST Magazine is dedicated to aiding you in your pursuit of knowledge in the aforementioned areas. The first feature, entitled "Monitoring a Cloud-Based AJAX Web Application Using Selenium and Nagios" by Viktor Doulepov, describes one team's experience with integrating Selenium into a cloud-based production monitoring framework based on Nagios. Next, "Addressing the Flaws in Test Automation" is a featured article by Clinton Sprauve that discusses common flaws that repeatedly plague automation regardless of the technology involved. Understanding these issues will help us move into the future without all of the mistakes of the past. Finally, the "Planning a Mobile Test Automation Strategy That Works" article by Yoram Mizrachi tackles the mobile test automation challenge head on.
Help provide comprehensive, yet readily available resources that will aid people in becoming more knowledgeable and equipped to handle tasks related to testing and test automation
Offer training and events for participation by people in specific areas around the world
ATI's Local Chapter Program is established to help better facilitate the grassroots, global discussion around test automation. In addition, the chapter program seeks to provide a local base from which the needs of automation practitioners may be met.
Automated Software Testing
Managing Editor: Dion Johnson
Contributing Editors: Donna Vance, Edward Torrie
Director of Marketing and Events: Christine Johnson
A PUBLICATION OF THE AUTOMATED TESTING INSTITUTE
The Automated Software Testing (AST) Magazine is an Automated Testing Institute (ATI) publication. For more information regarding the magazine visit http://www.astmagazine.automatedtestinginstitute.com
TestKIT Tip
Impenetrable Systems
Automate a Penetration Test in 3 Simple Steps
TestKIT is the name used by ATI for describing one's testing toolkit. A TestKIT is filled with knowledge, information and tools that go with us wherever we go, allowing our projects and organizations to quickly reap the benefits of the practical elements that we've amassed. This section provides tips to add to your TestKIT.

Security concerns are paramount to software quality considerations these days, which has given rise to increased attention to security testing. One of the most common types of security testing is penetration testing, which is defined as a test method that simulates an attack on a computer system or application in order to identify vulnerabilities and how those vulnerabilities may be exploited. There are several certification programs that focus on information and systems security, including the CISSP (Certified Information Systems Security Professional) and the SANS GIAC (Global Information Assurance Certification) Certified Penetration Tester. In addition, there are several commercial and open source tools that may be employed to aid in penetration testing, including tools known as Metasploit, Wireshark and BackTrack. Testers who are heavily engaged in security testing typically use some specialized security testing tool, but it is also possible to use a functional test automation tool.
1. Construct SQL Fragment
2. Add to Field and Submit
3. Verify No Inappropriate System Access or Data Granted
So entering Michael_Jerome into the Username field and h_pass into the Password field would result in the construction of the following query:

SELECT userID FROM user_accounts WHERE userField = 'Michael_Jerome' AND passField = 'h_pass'

As long as a userID is returned by the query, the user is validated and allowed into the application. Otherwise, a login error is presented. If the application is not designed to handle special character inputs, including single quotes (') and dashes (-), a user is left with an open door to accessing the application without a valid login, by dynamically manipulating the backend query. Manipulation may occur by entering a SQL fragment similar to the following:

anytext' OR 'a'='a' --

The fragment may need to change based on the type of database being used by the application. For example, the -- characters represent what is known as a comment in some databases. Others may represent a comment using /*.

Entering this fragment into the Username field of an application that is vulnerable to SQL injection attacks will dynamically construct and submit the following query to the database:

SELECT userID FROM user_accounts WHERE userField = 'anytext' OR 'a'='a' --' AND passField = 'h_pass'

Note that the constructed query has been dynamically changed from its original form, to become a query that will always return data from the database, even without a valid username and password. The WHERE clause forces the query to always return the first record in the user_accounts table because, although there is probably no anytext value in the userField of the table, 'a'='a' is always true; the WHERE clause only requires one of the two conditions joined by OR to be true. What about the AND clause, you ask? Since comment characters (--) were introduced, the portion of the query following the comment characters, with the AND clause that checks for the password, is ignored and thus rendered irrelevant.
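The bypass described above, and the parameterized-query defense against it, can be reproduced in a few lines. The following is a minimal sketch (not from the article) using an in-memory SQLite database; the table and column names mirror the example above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_accounts (userID INTEGER, userField TEXT, passField TEXT)")
conn.execute("INSERT INTO user_accounts VALUES (1, 'Michael_Jerome', 'h_pass')")

def naive_login(user, password):
    # UNSAFE: builds the query by string formatting, as in the vulnerable app
    query = ("SELECT userID FROM user_accounts "
             "WHERE userField = '%s' AND passField = '%s'" % (user, password))
    return conn.execute(query).fetchone()

def safe_login(user, password):
    # SAFE: parameterized query; input is treated as data, never as SQL
    query = "SELECT userID FROM user_accounts WHERE userField = ? AND passField = ?"
    return conn.execute(query, (user, password)).fetchone()

injection = "anytext' OR 'a'='a' --"
print(naive_login(injection, "wrong_pass"))  # a row is returned: login bypassed
print(safe_login(injection, "wrong_pass"))   # None: the attack fails
```

A functional test tool performing step 3 above would submit the fragment and then assert that no session or data was granted, exactly as the `safe_login` assertion does here.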
Contribute Content Today
Community Comments Box
Automation Events
As a registered user you can submit content directly to the site, providing you with content control and the ability to network with like-minded individuals.
Open Sourcery
Everyone seems to have their heads in the Cloud these days, and open source is no exception. There have been many interesting things going on with respect to services built around open source test tools, so read on as ATI provides a peek into some of these occurrences.

BlazeMeter

BlazeMeter is a provider of a self-service, cloud-based load testing platform built on Apache JMeter. As a reader of the Automated Software Testing Magazine, you're probably also a follower of the ATI Automation Honors, and are well aware of JMeter's dominance in the Best Open Source Performance Test Tool categories over the years. BlazeMeter has sought to capitalize on the community's demand for JMeter through a cloud-based service aimed at simplifying the deployment and increasing the scalability of the tool. "JMeter is an excellent automation tool and has already had more than a million downloads this year, but it is challenging to deploy and is often limited in terms of scalability for the requirements of enterprise and high-traffic websites," says Girmonsky (1). BlazeMeter is fairly new, but has already been pretty busy making new announcements, such as the release of a module for quickly launching high volume load tests against Drupal websites, and the deployment of services to efficiently load test complex, rich Facebook applications.
Netflix
Once known for its postal service-based offering that centered around snail mailing DVDs to subscriber mailboxes, Netflix is now becoming more known for its cloud-based, on-demand streaming media service. Given the Netflix business model's heavy reliance on service availability and reliability, it was incumbent upon them to produce some way to ensure these quality attributes were rated highly. Automated tools to the rescue! Beginning with a tool called Chaos Monkey, Netflix has created an army of monkey tools that they've dubbed their "simian army." Chaos Monkey is a tool that randomly disables virtual machines to ensure the system as a whole can continue with no customer impact. In addition, there is a Latency Monkey, Conformity Monkey, Doctor Monkey, Janitor Monkey, Security Monkey and a 10-18 Monkey.

These monkeys have been used to great effect by Netflix, with the promise that there will be more to come. But more to come for who? Apparently, there's more to come for the world, given that Netflix now plans to open source these tools over the next few months. According to Adrian Cockcroft, the Director of Cloud Architecture at Netflix, they plan on releasing "pretty much all of our platform, including the Monkey infrastructure," over the rest of this year. (5)

References
1. http://www.networkcomputing.com/end-to-end-apm-tech-center/232300034
2. http://www.msnbc.msn.com/id/48197228/ns/business-press_releases/t/blazemeter-releases-holy-grail-cloud-testing-open-source-drupal/#.UAbsslIR4W0
3. http://www.marketwire.com/press-release/blazemeter-launches-self-service-performance-load-testing-for-facebook-applications-1662528.htm
4. http://techblog.netflix.com/2011/07/netflix-simian-army.html
5. http://www.wired.com/wiredenterprise/2012/04/netflix_monkeys/
www.automatedtestinginstitute.com
July 2012
Using Selenium and Nagios, we have built a stable, small-footprint and low-cost solution for monitoring the availability and basic functionality of an AJAX web application.
Background
Monitoring Basics
Typical monitoring solutions allow you to observe what is happening with the devices and hosts on your network. They also provide you with early warnings for critical parameters such as memory consumption, CPU utilization, free disk space, availability of specific ports on the nodes and their response times (e.g., pinging the nodes in question). Most of the available solutions allow you to perform SNMP checks over the managed devices within your network. However, out-of-the-box setups usually focus on low-level checks. Therefore, if you need to verify that the records in your production database are up-to-date, or that your running application allows users to log in and view their reports, you may need to spend considerable effort on customization.
You need to be sure your application is available to users worldwide, and not only within its cloud subnet.
First, you are most likely to pay for resources (i.e., memory/disk space/CPU) by volume, so you should thoroughly assess the performance and footprint of your solution. At the very beginning of your journey, you probably would not want a dedicated monitoring node, but rather would be satisfied with the service running on one of the production nodes. Second, you should consider a modular approach. Depending on available horsepower, you can either keep all of your tools on a single node, or spread them throughout your network. For example, the head of your monitoring solution can be on one node, the test drivers on another node, the test executors on yet another node, and so on. The modular approach is beneficial for situations where you want to evenly spread load amongst your nodes. Last but not least, you should try to keep your tests external to your solution. You need to be sure your application is available to users worldwide, and not only within its cloud subnet.
AJAX
The advent of AJAX applications introduced another issue: simple tools like wget, curl or even JMeter became inadequate for verifying web applications. A test run is no longer a mere sequence of HTTP requests that can be simply determined and programmed in advance. The checking tool must now be context-aware, that is, able to dynamically determine the presence of UI elements on the browser screen, and then to operate them according to your test scenarios. Luckily, you can achieve this with freely available or commercial testing tools. To name a few: Selenium; WATIR; HP QuickTest Professional; SmartBear Software's TestComplete; IBM's Rational Functional Tester. HtmlUnit currently also provides a certain level of AJAX support.
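The "context-aware" checking described above boils down to polling until an expected UI element appears, rather than firing a fixed request. The following is a minimal sketch of that idea; the `fake_find` page object is a hypothetical stand-in for a real element lookup (in practice the tools above provide their own wait/lookup calls).

```python
import time

def wait_for_element(find, locator, timeout=5.0, interval=0.1):
    """Poll find(locator) until it returns a truthy element, or raise on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        element = find(locator)
        if element:
            return element
        time.sleep(interval)
    raise TimeoutError(f"element {locator!r} did not appear within {timeout}s")

# Simulated AJAX page: the element "exists" only after a short delay,
# just as a real widget appears only after its asynchronous request returns.
start = time.monotonic()
def fake_find(locator):
    return locator if time.monotonic() - start > 0.3 else None

print(wait_for_element(fake_find, "report-table"))
```

The same loop, with the stub replaced by a real element lookup, is what lets a check drive an AJAX screen instead of merely asserting on a static HTTP response.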
Case Description
In our case, we had a GWT-based retail sales web console deployed in the cloud, and it needed regular monitoring for availability. A simplified deployment outline is provided in Figure 1 (the nodes are RHEL 5.5).
Monitoring AJAX
#!/bin/sh
# Runs a Selenium/JUnit smoke test provided in SEL_TEST
echo "Selenium/JUnit smoke test"

# Selenium test class
SEL_TEST=com.companyname.tests.selenium.smoketest
# Selenium home directory
SEL_HOME=/usr/local/selenium
# Selenium 2 JARs
SEL2_JARS=$SEL_HOME/selenium-2.0b1
# Selenium RC host
SEL_RC_HOST=10.162.42.12
# Remote cleanup call via NRPE (quoted so it can be invoked as a command):
CLEANUP_CMD="/opt/nagios/libexec/check_nrpe -H $SEL_RC_HOST -c selenium_clean"

# Call remote cleanup before starting tests
$CLEANUP_CMD

cd $SEL_HOME
# Test run with a Selenium client. Set proper Selenium JAR names
# and have Java in your path.
java -cp $SEL_HOME/junit-4.8.2.jar:$SEL2_JARS/selenium-java-2.0b1.jar:$SEL2_JARS/selenium-server-standalone-2.0b1.jar:. org.junit.runner.JUnitCore $SEL_TEST

# We do some housekeeping after the Selenium/JUnit test run completes,
# so let's save the JUnit exit code to a variable:
JUNITEXITCODE=$?
echo "JUnit exit code is $JUNITEXITCODE"

# Call remote cleanup after completing tests
$CLEANUP_CMD

# Now exit with the saved JUnit exit code.
# Nagios will use it to judge whether the Selenium/JUnit test run was OK.
exit $JUNITEXITCODE
The limitations for the monitoring solution were as follows: fast deployment and update cycle; minimal maintenance efforts; low footprint (disk, memory) and low CPU consumption; no code recompilation or redeployment for non-major UI changes.
Nagios was already set up for monitoring basic health parameters of the nodes. Based on our previous experience with Selenium, as well as its ability to wrap tests into JUnit tests, it was the obvious candidate for the test driver/test executor. Initially we were considering the possibility of delegating application availability monitoring to external paid services (as BrowserMob or Saucelabs - both are viable options if you want a purely separate monitoring solution running outside of your environment). In that case, however, we would still have to develop a complete Selenium test suite on our own. After estimating the advantages over in-house execution and the related costs, we dropped the idea.
Figure 2: Main shell script (Selenium test suite launched on core node)
#!/bin/sh
# Script for cleaning up immediately before/after a Selenium test run
# on a local Selenium RC server.
# Will kill Firefox processes and remove temporary dirs
# (Selenium RC often fails to do so).
# The user running this script should be given appropriate sudo permissions
# in your /etc/sudoers file.
sudo pkill -f firefox
echo "Kill firefox exit code: $?"
sudo rm -rf /tmp/customProfileDir* &
echo "Remove firefox customProfileDir dirs exit code: $?"
sudo rm -rf /tmp/seleniumSslSupport* &
echo "Remove seleniumSslSupport dirs exit code: $?"
echo "DONE..."
Figure 3: Remote cleanup shell script (selenium_clean, called through NRPE)

Second, the layout of the plugin implied that an embedded Selenium RC server is started along with the test suite each time the test runs. This was undesired due to load/performance issues.
instance of Firefox). Gradual disk space leakage due to multiplying profiles also occurs in this case. Thus, before starting another test run it was important to check for hanging browser instances, kill their processes and clean up obsolete temporary browser profiles.
The cleanup procedure was implemented as a separate NRPE command called from the main shell script running the test suite (immediately before and after the test run). Please refer to the code snippets in Figures 2 and 3.
Setting Up Selenium RC
To keep the test run times short and to increase the stability of the environment, we decided to set up the Selenium RC server as a dedicated service, the main reason being noticeable memory and CPU consumption by the browser under RC control. Selenium RC needs X11 for running in graphic mode (this is required to host the browser in which all operations on the application under test are performed). However, the cloud nodes were provided headless, with no X11 server by default. So we first added a virtual display using Xvfb. It was configured to run as a service; we did not need to start it manually each time.
We did not really need to accumulate performance statistics and other tricky features; a result in terms of passed/failed was sufficient. Hence, we would be safe even with a direct call like this as the Nagios command definition:

java -cp $PATH_TO_JUNIT_JAR:$PATH_TO_SELENIUM_CLIENT_JAR:$PATH_TO_SELENIUM_TESTSUITE_JAR org.junit.runner.JUnitCore $SELENIUM_TEST_CLASSNAME
We wrapped the execution of the Selenium test suite, which is actually a JUnit test run, into a shell script. This script performs some remote cleanup before and after the run, and passes the effective exit code of the test suite to Nagios (see Figure 2). The first line of the script's console output is used as the status message on the Nagios web summary page. The whole output is printed on the details page of the service check. This offers a convenient way of reviewing stack traces of exceptions in case any occurred during the test run. For a successful run, Nagios will report something similar to the chart shown in Figure 4. The illustrations in Figures 6 and 7 represent the final wiring of our monitoring solution.
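Wiring a wrapper script like this into Nagios takes a command definition plus a service that invokes it. The fragment below is a sketch of what those object definitions could look like; the script path, host name and service description are illustrative, not taken from the article.

```cfg
# Hypothetical Nagios object definitions for the Selenium smoke-test wrapper.
# Nagios treats any non-zero exit code from the script as a failed check.
define command {
    command_name    check_selenium_smoke
    command_line    /usr/local/selenium/run_smoke_test.sh
}

define service {
    use                     generic-service
    host_name               core-node
    service_description     AJAX console Selenium smoke test
    check_command           check_selenium_smoke
    normal_check_interval   15
}
```

Because the wrapper exits with the JUnit exit code, no custom Nagios plugin logic is needed beyond these definitions.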
Summary
Using Selenium and Nagios, we have built a stable, small-footprint and low-cost solution for monitoring the availability and basic functionality of an AJAX web application. The maintenance efforts are minimized thanks to self-cleaning and the externalization of the frequently changed test suite parameters in a text configuration file.
References
1. Nagios - http://www.nagios.org/
2. Nagios plugins - http://nagiosplugins.org/
3. NRPE (Nagios Remote Plugin Executor) - http://exchange.nagios.org/directory/Addons/Monitoring-Agents/NRPE-2D-Nagios-Remote-Plugin
It's often said that repeating the same action over and over and expecting different results is the definition of insanity. Yet oddly, this seems to have become the standard for many organizations implementing software test automation.
Addressing the Flaws in Test Automation
Single Point of Failure
By Clinton Sprauve
Wrong Framework
understand or easily pick up. It is imperative that your framework is simple enough to maintain and well documented, so that the company and team don't lose the efficiencies created by the framework in the first place.
Keyword-Driven Testing
Some companies have explored the idea of Keyword-Driven Testing (KDT), which involves building a code library of table-based functions/action words so anyone can help automate application testing. Now the entire team can help us automate! It's a nice thought, but let's explore the pros and cons of this approach. First, KDT requires less technical expertise to create test automation and involves Business Analysts (BAs) and Subject Matter Experts (SMEs) in the test automation process, while still allowing the automation engineers to do the heavy lifting. It also simplifies the link between testing and requirements specifications. On the flip side, KDT can actually increase the amount of maintenance for test automation efforts, rather than reduce it. For example, imagine that someone from payroll is brought in to test the new accounting application. The employee has been trained on the framework and is presumably ready to go. While testing the app, an error occurs: "Object xyz failed to initialize. Shutting down." The test automation guru must then get involved, further complicating a process that a properly trained professional could have handled without assistance. In this sense, KDT can involve SMEs, BAs and testers in the wrong way. The intentions are good, but it's setting a trap for failure. All that said, KDT is not necessarily a bad thing. The problem is not how it is implemented, but for whom it is implemented. Again, this goes back to the previous assertion that most people think test automation is so simple that anyone can do it. The entire team does not need to be involved in the test automation process. A better approach is to utilize those on the team that have the technical expertise to develop, maintain, and execute a keyword-driven framework.
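The table-based mechanism behind KDT is straightforward to illustrate: test cases are rows of keywords and arguments, interpreted against a library of keyword implementations. The sketch below is a hypothetical miniature (the keywords and app names are invented), showing both the appeal and the fragility: a mistyped keyword or a failing object produces exactly the kind of error a non-technical user cannot diagnose.

```python
# Keyword library: each keyword maps to a function that acts on shared state.
def open_app(state, name):
    state["app"] = name

def enter_text(state, field, value):
    state.setdefault("fields", {})[field] = value

def verify_field(state, field, expected):
    actual = state.get("fields", {}).get(field)
    assert actual == expected, f"{field}: expected {expected!r}, got {actual!r}"

KEYWORDS = {"OpenApp": open_app, "EnterText": enter_text, "VerifyField": verify_field}

def run_table(table):
    """Interpret a table of (keyword, *args) rows against the keyword library."""
    state = {}
    for keyword, *args in table:
        KEYWORDS[keyword](state, *args)  # unknown keyword raises KeyError
    return state

test_case = [
    ("OpenApp", "AccountingApp"),
    ("EnterText", "Username", "Michael_Jerome"),
    ("VerifyField", "Username", "Michael_Jerome"),
]
print(run_table(test_case)["app"])  # -> AccountingApp
```

The table rows are simple enough for a BA to write, but when a keyword implementation fails internally, only the engineer who maintains `KEYWORDS` can fix it, which is precisely the maintenance trap described above.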
Now the entire team can help us automate! It's a nice thought, but let's explore the pros and cons of this approach.
Remember, test automation is software development, and it is not easy. Building efficiencies into the development process is a difficult undertaking in itself. However, repeating the same mistakes will keep test automation on the crazy cycle of software development. Don't look for the ultimate panacea for test automation; rather, look for a practical, realistic approach to building a robust and reusable automation library that will deliver true ROI.
Schedule At A Glance
TestKIT Conference
(Sign up today - www.testkitconference.com)
Tracks
Agile Testing (AG)
Performance Testing & Security (PS)
Automated Tools & Implementation (AT)
Mobile, Virtualization & The Cloud (MC)
Frameworks & Methodologies (FM)
Test Management, Teams & Communications (TM)
12:00pm - 1:00pm Lunch 1:00pm - 2:00pm Vendor Exhibition 2:00pm - 5:00pm Tutorial Afternoon Sessions
TUT1: Agile Functional Test Automation (Afternoon Session), Linda Hayes, Worksoft, Inc. TUT4: Preparing for the CISSP, James Hanson, Helm Point Solutions, Inc. TUT5: Transitioning to Agile Testing - The Mind of the Agile Tester, Bob Galen, iContact TBK: Test Automation Body of Knowledge Training - Day 1 (Afternoon Session)
11:45am - 1:15pm Lunch and Keynote Presentation: Keynote, Linda Hayes, Worksoft, Inc. 1:30pm - 2:30pm Breakout Session Group 3
TA3: Test Automation Patterns, Seretta Gamba, Steria Mummert ISS GmbH AG1: Agile Testing: Facing the Challenges Beyond the Easy Context, Bob Galen, iContact
Visit http://www.testkitconference.com for available speaker bios and session descriptions.
Schedule At A Glance
PS1: Production Performance Testing in the Cloud, Dan Bartow, SOASTA, Inc. MC1: Open Source or Commercial Mobile Platform: Which is Right For My Testing Team?, Patrick Quilter, Quilmont TBK: Test Automation Body of Knowledge Training - Day 2 (Afternoon Session)
6:00pm - 8:00pm Dinner Reception: 4th Annual ATI Automation Honors Awards Ceremony
11:45am - 12:45pm Lunch 1:15pm - 2:15pm Discussion Forum 2:15pm - 2:30pm Afternoon Break 2:30pm - 3:30pm Breakout Session Group 8
AT5: Tcl/Tk For Testing, Robert Wimsatt, Sotera Defense Solutions, Inc. MC4: Accelerate Parallel Development with Service Virtualization, Wayne Ariola, Parasoft FM5: Getting It Right The First Time, Nick Olivo, SmartBear TBK: Test Automation Body of Knowledge Exam (Afternoon)
4:30pm - 5:00pm General Session: TestKIT Closeout
Planning A Mobile Test Automation Strategy That Works

By Yoram Mizrachi

Customers expect their banks to be accessible from their mobile devices. They use their mobiles to book flights, shop and perform most of the actions traditionally associated with desktops. To remain competitive, enterprises are mobilizing their systems and providing instant, reliable access to their services. This is true across enterprise institutions (insurance and banking), health, retail, and travel-service providers, in a mobile market projected to grow from 1.1 billion in 2012 to 9.9 billion in 2016, nearly a ten-fold increase. Without a mobile presence, it is difficult to stay relevant; in order to keep up with market needs, enterprises are rushed into the mobile industry without appropriate planning and quality assurance. This in turn results in poorly developed applications that lack proper quality and support.
Here today, gone tomorrow is probably the best way to characterize the pace of change in the mobile market. It's safe to say that at least 30 percent of the popular handsets and tablets today will become outdated and irrelevant in the next few months. The mobile market is extremely dynamic, unpredictable and fragmented. The numerous operating systems and multitude of platform versions, networks, hardware, and form factors make it challenging to maintain mobile application quality. Consider OS versions: new devices contain the latest or near-latest OS versions, and usually automatically upgrade to the newest available OS version, replacing the older one. There are no guarantees that an application developed against an older OS version will function properly with a newly introduced OS version; enterprises have no choice but to conform to this pace, and continuously develop and test version updates for their applications. Figure 3 provides a mere 6-month timeframe of the Android OS version updates. During January 2011, the Android 2.2 OS version led approximately half of the Android mobile market. A few months later, Android 2.3.3 took its place. In March 2012, Android 2.3.3 reached over half of the Android mobile market, and is projected to eventually take over the market. This is only one example of the many competing platforms that are available in the market. For a better understanding of the market dynamics, this example should be multiplied by the number of available platforms, including iOS, Android, BlackBerry, and Windows Phone. Taken from StatCounter Global Stats, Figure 4 shows the usage growth rates of the top eight mobile operating systems in North America. Here it can be seen how the mobile market unexpectedly
fluctuates with no defined leader or standards. To ensure the success of an application, all relevant platforms should be covered. Mainly from a performance point of view, mobile networks should also be included in testing.
Mobile application testing simply cannot be served by the traditional development/QA cycle. As stressed previously, the market is extremely dynamic and unpredictable. A tremendous number of customers will instantly adopt newly released mobile devices and OS versions, and connect right away to applications and websites. Although an organization may not be prepared to introduce an updated application version, users expect nothing less than a flawless user experience. In the mobile market, the risk accumulated between product releases is much greater than with traditional software. This leaves no choice but to accelerate the release cycle in order to limit risk exposure. In conclusion, when it comes to mobile, a shorter development cycle is needed, as well as the ability to test an application continuously.
Figure 4: Top Mobile OSs in North America (Feb 2010 to Feb 2012)
Source: StatCounter Global Stats
Keep in mind that with all these difficult challenges, mobile is one of the most exciting technological advancements available today.
Figure 6: Software QA vs. Mobile QA (simplified)
Source: Perfecto Mobile
Mobile applications undergo a porting process, which creates several different versions of the application, one for each device. When developing and testing for mobile, short development cycles and continuous QA make it possible to accommodate rapid market changes. The two factors measuring the market gap are what users want (such as new features and functionalities) and what the market offers (such as devices, browsers, and processing power). Increasing the timeframe between versions widens the gap between the application and market needs; shortening the release cycle allows a quicker reaction to those needs. Because of the constant stream of changes and newly introduced technologies and platforms, closing this gap requires a shorter development cycle and more frequent application updates. This message is a very powerful one for successful mobile applications. Iterative and agile mobile methodologies are more aggressive. Choosing not to quickly release update versions will make an application irrelevant. See the poor application example in Figure 7, with its ratings and user comments.
Automation is an enabler for success, and not a cost reduction tool in testing.
The simplified illustration in Figure 6 depicts the shift from traditional to mobile development. The orange highlights show all of the areas that have been affected by mobile. In short, all of the traditional development activities have remained, with some changes, and new activities such as Interoperability and Compatibility (porting) have been added.
Interoperability is a completely new mobile phenomenon. Browsers do not receive phone calls, but phones do. Similarly, device events such as an SMS can occur at any moment, causing unexpected interrupts to a running application or transaction. For example, an online purchase can be interrupted by an incoming call or dropped because of Wi-Fi issues. Traditional software developers have the luxury of assuming all PCs are basically the same, regardless of the manufacturer, CPU and memory. In mobile this is impossible. The differences between devices are too great and cannot be ignored. Compatibility, also known as porting, is therefore added to the development cycle, as a response to the array of available devices, to validate application/content compatibility, performance and user experience across devices.
version is released. However, when releasing a mobile application version, continuous testing is needed because of the ever growing stream of devices and versions.
traditional software release cycle, once an application has been tested in the QA phase, it is released for production. Production is updated when the next
has turned from nice to have into mandatory, particularly when using agile or iterative development methodologies. By automating the functional and 29
July 2012
www.automatedtestinginstitute.com
Best practices for mobile testing indicate the need to access between 30 and 40 fully functional devices. To keep up with the market dynamics, an estimated 10 devices will have to be replaced each quarter. The number of supported devices will grow significantly within the first year of introducing the mobile application to the market. Managing the logistics of these handsets within the different geographical locations is a challenging task. Utilizing a cloud-based solution will allow an enterprise to avoid the hassle and costs of procuring and managing new devices.
regression testing of mobile applications, it is possible to shorten the timeframe and provide an accurate application state snapshot. Automation allows testing on more devices in less time and reduces the requirements gap. The result is a shortened and systematic ALM cycle that allows for continuous QA, better coverage, easier re-creation of problems and substantial cost savings. Do not to underestimate the complexity of launching mobile applications. It is common to experience a cycle of over-optimism following development on the first platform (OS), followed by disillusionment resulting from difficulties once the second platform and associated devices are added to the mix. The realities of an extremely dynamic market require a well-planned and methodical approach. In light of this fragmentation, it is highly recommended to adopt a deviceagnostic testing approach that allows writing test scripts once, and then reusing them on multiple platforms. Script automation should support low-level functionalities, such as key and screen press as well as, logical abstraction which enable the execution of virtual functions, such as a login, that are not dependent on a particular device or 30 Automated Software Testing Magazine
platform. When planning QA, automation is a must. As opposed to some beliefs, automation is an enabler for success and not a cost-reduction tool in testing. Automating testing enables the use of a single testing script across many devices. An example of this is the Perfecto Mobile patented ScriptOnce technology, a comprehensive mobile testing automation solution.
Purchasing devices and having them on a developers desk poses security issues as these devices tend to disappear and limits access to only one physical location. To meet enterprise security standards, it is essential that mobile application testing is performed on secure devices that can only be accessed by the organization. A dedicated cloud of devices ensures that the required devices are always available for testing, and that applications in the development process are always secured. This private cloud should also be configurable to comply with the organizations security policies, including firewall requirements
www.automatedtestinginstitute.com
July 2012
and other needs. As a whole, a cloud-based approach: Enables globally distributed teams to share devices during live testing; Meets enterprise security measures; Targets network availability; Is logistics free.
organization. Companies around the world invest more than $50 billion per year on applications testing and quality assurance, according to Pierre Audoin Consultants (PAC). Rather than reinventing the wheel, it is significantly more effective to utilize existing ALM processes such as a management console, business-logic and high level scripts, and scripting languages. Using a system that extends rather than just integrates with the existing platform is a cost-efficient and timesaving solution. There are existing solutions available. An example solution is the Perfecto Mobile MobileCloud for QTP. Since this solution is a QTP extension, it allows the user to connect to the MobileCloud and execute scripts within QTP, using traditional QTP scripting elements. This leverages existing assets extended to mobile, with a hidden debugging support component. Mobile developers and testers can use the MobileCloud extension to log into specific devices remotely. The MobileCloud for QTP was developed with close collaboration between the Perfecto Mobile and HP development teams. This concept should not be confused with available integrated applications that require working with multiple environments. This is a single environment within a single application, extending the existing ALM to include Mobile. There is no exchange of data items between the two applications. The MobileCloud works within QTP, access
and manipulations are not performed on a separate system. Additionally, this integration goes beyond script writing. It is possible to leverage it to include the full range of HPs ALM tools including Quality Center, LoadRunner and BSM.
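The device-agnostic idea described above, low-level primitives implemented per platform plus logical "virtual functions" such as a login written once, can be sketched as follows. This is a minimal illustration of the concept only, not Perfecto Mobile's actual ScriptOnce API; all class and method names here are hypothetical.

```python
# Sketch of a device-agnostic test layer: low-level primitives
# (key press, text entry) are implemented per platform, while logical
# "virtual functions" such as login are written once and reused.

class Device:
    """Abstract low-level driver; one subclass per platform."""
    def press(self, key): raise NotImplementedError
    def type_text(self, field, text): raise NotImplementedError
    def read_screen(self): raise NotImplementedError

class AndroidDevice(Device):
    def __init__(self): self.screen = "home"
    def press(self, key): self.screen = key
    def type_text(self, field, text): self.screen = f"{field}:{text}"
    def read_screen(self): return self.screen

class IOSDevice(AndroidDevice):
    # Same simulated behavior for this sketch; a real driver would differ.
    pass

def login(device, user, password):
    """Logical abstraction: the identical script runs on every platform."""
    device.type_text("username", user)
    device.type_text("password", password)
    device.press("submit")
    return device.read_screen() == "submit"

# The same test runs unchanged across the whole device pool.
results = {d.__class__.__name__: login(d, "alice", "s3cret")
           for d in (AndroidDevice(), IOSDevice())}
print(results)  # {'AndroidDevice': True, 'IOSDevice': True}
```

The point of the abstraction is that only the `Device` subclasses know about platform details; the `login` script never changes as devices are added or replaced.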
To Summarize
Offering an attractive application that remains relevant and available across devices is a challenge, but one that can be overcome with a combination of good methodology and tools. The methodology needs to embrace the mobile timeframe, apply a quick and continuous lifecycle, and utilize existing ALM tools. Automating testing is a must: it enables a quick pace. A cloud-based approach helps enable collaboration and removes logistical challenges in the effort to keep up with the market pace. It is recommended to use the following ACE selection criteria:
Automation - Mobile test automation enables a shortened ALM cycle, increases coverage, facilitates re-creation of problems and saves costs.
Cloud-based platform - Cloud-based access to REAL handsets located in live networks helps avoid the hassle and costs of procuring and managing new devices, while facilitating collaboration among distributed teams.
Use Existing ALM Resources - Leverage existing tools, processes and knowledge by extending the current ALM framework to support mobile testing.
To address market dynamics, mobile applications need to be developed and tested on multiple platforms. For this reason, it is important to identify between six and eight must-have devices on which to run rigorous sanity and regression testing nightly. To achieve a better representation of the market, it is recommended to extend testing to cover approximately 12 major devices during the QA phase; the bulk of the functional and regression testing is performed against these devices. Automation in both of these phases is critical in order to allow the release of new applications and functionality to the market in a timely fashion. Regardless of the phase, approximately 30 percent of the devices will need to be replaced each quarter to account for new devices introduced to the market.
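The staged device matrix above can be sketched as a small script: a "must" set gets nightly sanity runs, a broader set is covered in QA, and roughly 30 percent of the pool rotates out each quarter. The device names and the stub test runner here are made up purely for illustration.

```python
# Sketch of the device-matrix numbers described in the text:
# 6-8 "must" devices tested nightly, ~12 devices in QA, ~30% quarterly churn.
import math

must_devices = ["device-A", "device-B", "device-C",
                "device-D", "device-E", "device-F"]          # nightly "must" set
qa_devices = must_devices + ["qa-device-%d" % i for i in range(1, 7)]  # ~12 in QA

def nightly_sanity(devices, run_test):
    """Run the sanity suite against every must-have device each night."""
    return {d: run_test(d) for d in devices}

def quarterly_replacements(pool_size, churn=0.30):
    """Number of devices to swap out per quarter at ~30% churn."""
    return math.ceil(pool_size * churn)

results = nightly_sanity(must_devices, lambda d: True)  # stub test runner
print(len(qa_devices), quarterly_replacements(len(qa_devices)))  # 12 4
```

At the article's numbers, a 12-device QA pool means budgeting for about four replacement handsets every quarter, which is one reason the cloud-based pool discussed earlier is attractive.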
Keep in mind that with all these difficult challenges, mobile is one of the most exciting technological advancements available today. Although this is the eye of the storm, it is also the center of technology.
July 2012
www.automatedtestinginstitute.com
I BLog To U
Automation blogs are one of the greatest sources of up-to-date test automation information, so the Automated Testing Institute has decided to keep you up-to-date with some of the latest blog posts from around the web. Read below for some interesting posts, and keep an eye out, because you never know when your post will be spotlighted.
Blog Name: Narendra Parihar's Blog Post Date: March 29, 2012 Post Title: Test Automation Failures Author: QualitySpreader
Blog Name: Software Quality Matters Post Date: June 22, 2012 Post Title: HTML5 Test Automation for Beginners Author: Goran Begic
Every now and then we keep seeing automation failures. Most testers have been part of these failure stories as actors, audience or directors :-) I am sharing the top 3 reasons for automation failure in this post, which is slightly modified from my post on blogspot @ http://infominesoftware.blogspot.com/#!/2010/10/why-does-test-automation-fail-everynow.html
Everything you type into browser windows and Web page forms, all the buttons you click, and the pages you open are remembered, together with the order in which you interact with the Application Under Test (AUT). These sequences can be played back as tests; they can also be reviewed, updated, or turned into test scripts. The benefit of this approach is that the automation tool can do everything you can do when testing manually, and you don't have to script pre-conditions and other setup.
Read More at: http://blog.smartbear.com/software-quality/bid/174155/HTML5-TestAutomation-for-Beginners
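The capture/replay approach Begic describes, recording each interaction in order and playing the sequence back as a test, can be illustrated with a toy recorder. This is a generic sketch of the technique, not the tool discussed in the post; the class names and the stand-in application are hypothetical.

```python
# Toy capture/replay: each interaction with the AUT is recorded as an
# (action, target, value) step, preserving order, and can later be
# replayed as an automated test or exported as a script.

class Recorder:
    def __init__(self):
        self.steps = []

    def record(self, action, target, value=None):
        self.steps.append((action, target, value))

    def replay(self, app):
        # Drive the application with the recorded steps, in order.
        for action, target, value in self.steps:
            getattr(app, action)(target, value)

    def to_script(self):
        # Recorded steps can also be turned into an editable test script.
        return "\n".join(f"{a}({t!r}, {v!r})" for a, t, v in self.steps)

class FakeApp:
    """Stand-in AUT that logs what was done to it."""
    def __init__(self): self.log = []
    def click(self, target, _): self.log.append(f"click {target}")
    def type(self, target, value): self.log.append(f"type {value} into {target}")

rec = Recorder()
rec.record("type", "search-box", "automation")
rec.record("click", "search-button")
app = FakeApp()
rec.replay(app)
print(app.log)  # ['type automation into search-box', 'click search-button']
```

As the post notes, the appeal is that the recording already contains the setup and ordering a manual tester performed, so nothing has to be scripted up front; the trade-off is that recorded steps still need review and maintenance as the AUT changes.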
Blogosphere
Blog Name: Test This Blog Post Date: April 12, 2012 Post Title: Test Automation Scrum Meeting Ambiguity Author: Eric Jacobson
Blog Name: 3Qi Labs Post Date: April 18, 2012 Post Title: Automation Best Practices: Building From Scratch Author: Admin
The goal of writing automated checks is to interrogate the system under test (SUT), right? The goal is not just to have a bunch of automated checks. See the difference? Although your team may be interested in your progress creating the automated checks, they are probably more interested in what the automated checks have helped you discover about the SUT.
This is perhaps the most critical aspect of a good test automation implementation. The decisions you make during the Build phase of the implementation will impact you throughout your automation life-cycle. This means the initial building-out phase of a proper test automation implementation requires a number of things, which we will be covering in this section of our blog series: Best Practices for Achieving Automated Regression Testing Within the Enterprise
Read More at:
http://3qilabs.com/2012/04/best-practices-for-achieving-automated-regressiontesting-within-the-enterprise-building-your-test-automation-from-scratch-section-1/
Go On A Retweet
Paying a Visit To The Microblogs
Microblogging is a form of communication, based on the concept of blogging (also known as web logging), that allows subscribers of a microblogging service to broadcast brief messages to other subscribers of the service. The main difference between microblogging and blogging is that microblog posts are much shorter, with most services restricting messages to about 140 to 200 characters. Popularized by Twitter, microblogging is also offered by numerous other services, including Plurk, Jaiku, Pownce and Tumblr, and the list goes on and on. It is a powerful tool for relaying an assortment of information, a power that has definitely not been lost on the test automation community. Let's retreat into the world of microblogs for a moment and see how automators are using their 140 characters.
Chrome uses way more memory than Firefox, Opera or Internet Explorer http://prsm.tc/oCUyvb well done to @opera
Twitter Name: TechWell Post Date/Time: May 7 Topic: Testing Mobile Apps
Wherever You Go: Testing Mobile Applications, Part 2 In part 1 of this interview with Jonathan Kohl on mobile test... http://ow.ly/1jwaWt
Cartoon Tester: A bug is a bug http://bit.ly/JNWArx
Twitter Name: alanpage Post Date/Time: Jul 9 Topic: Whining About Testing
Re: my testers whining tweet. 1) There are better methods of communication. 2) Testers also seem to whine about *testing* a lot. #imo #ymmv
tech debt is code which a reasonable engineer, in the present, wishes was different http://bit.ly/IpXuL5
ATI Europe
Test Automation Day 2012 was a full-day event focused on test automation, organized by CKC Seminars and held on Thursday, June 21, 2012 at the WTC in Rotterdam.
>> Help meet the needs of the community on a more localized and personal level.
>> Offer training and events for participation by people in specific areas around the world.
>> Help provide comprehensive, yet readily available resources that will aid people in becoming more knowledgeable and equipped to handle tasks related to testing and test automation.
>> Assist in making professional certifications more readily available.
ATI Europe will help achieve these goals in Europe and around the world. You can register as a member of ATI Europe from the general ATI Registration site. In addition, ATI Europe is planning a training event towards the end of 2012, so stay tuned for more.
Keynotes Dion Johnson and Scott Barber Talk at Test Automation Day 2012
TABOK in Japan
Greetings to the ATI Community in Japan! Over the past year, demand for the Test Automation (Continued on page 40)
Crowdamation
Crowdsourced Test Automation
It allows teams that operate out of different locations to address the challenges of the different platforms introduced by mobile and other technologies, all while maintaining and building a cohesive, standards-driven automated test implementation that is meant to last.
Newsflash!
ATI welcomes ATI Europe to the ATI family! Led by Andre Boeters, this latest chapter has been established to address the local needs and concerns of the European automation community. Test automation training is being organized and planned by ATI Europe in conjunction with ATI so stay tuned!
Where in the World is ATI?
(Continued from page 38) Body of Knowledge (TABOK) and the TABOK Guidebook has dramatically increased in Japan. The TABOK is a tool-neutral skill set designed to help software test automation professionals address the automation challenges present in the world of software testing. Japan's embrace of the TABOK is not necessarily surprising, given the way that W. Edwards Deming, often credited as being the father of modern-day quality, was embraced in Japan during the last century. Japan clearly understands the importance of standardization and the identification of critical skills for effectiveness.
TestKIT 2012 Conference International Presence
The TestKIT Testing and Test Automation Conference, being held on October 15-17, 2012 at the BWI Airport Marriott in Linthicum, MD, provides a platform for attendees to learn strategies, techniques and best practices from peers and leaders in their field relative to security, testing techniques and methodologies, the cloud, test automation, mobile test automation, test tool implementations, open source solutions and more. Although held in the US, this event has a strong international presence. Not only are attendees signing up from abroad, but several speakers hail from outside the States. In addition to speakers from the US, we will welcome speakers from Buenos Aires, The Netherlands, India, Germany, Russia, Denmark and England. Come network, learn and exchange ideas with like-minded professionals from around the world in an environment that will allow you to build your testkits with concrete takeaways and information that you'll use to move your testing and test automation efforts forward.
As a registered user you can submit content directly to the site, giving you content control and the ability to network with like-minded individuals.
>> Community Comments Box - This comments box, available on the home page of the site, provides an opportunity for users to post micro comments in real time. >> Announcements & Blog Posts - If you have interesting tool announcements, or you have a concept that you'd like to blog about, submit a post directly to the ATI Online Reference today. At ATI, you have a community of individuals who would love to hear what you have to say. Your site profile will include a list of your submitted articles. >> Automation Events - Do you know about a cool automated testing meetup, webinar or conference? Let the rest of us know about it by posting it on the ATI site. Add the date, time and venue so people will know where to go and when to be there.
If you thought ATI's 2011 event was good, wait until you see 2012.
http://www.testkitconference.com