A Very Good Wi-Fi APs Stress Test Report
http://WirelessLANProfessionals.com
Executive Summary
Conclusion First
I'm sure my Communications teacher from my MBA program would be so proud of me. She taught us we should always think of our readers' time first, and put the conclusions at the very front of any document, then follow with the supporting materials.
The purpose of this test was to take Access Points to their breaking point. In that goal we succeeded
spectacularly. Not a single Access Point was able to support more than 25 iPads streaming video at the same
time, let alone with FTP data transfers going on simultaneously. So on that point the test was successful.
The second goal was to see whether Access Points, many of which share the very same Wi-Fi chipsets, performed differently, or whether they were all fairly much the same. Again, this test was able to spread the field and show some major differences between different vendors' access points.
What We Learned
Cable Trumps Wireless
This is a big "Duh" for most of you. In our testing, the baseline wired data transfers were 3-4 times faster than the best of the wireless tests. Not only are wired connections much faster, they are more reliable. Most of all, wired connections go through a switch, where you don't have to share a contention domain or collision domain like in wireless situations. This is especially important for devices that need to be consistently on the network. I'm thinking especially of things like printers, servers, and Apple TVs.
Our spectrum analyzer and AirMagnet Wi-Fi Analyzer Pro gave us an extra layer of insight into what was happening at Layer 1 and Layer 2 during the actual tests. We also used a Fluke AirCheck as a quick backstop to the larger tools, letting us quickly double-check that the test Access Points were operating within the test procedures.
In your own Wireless LAN practice, you need tools that give you the same insight into what is happening at Layer 1 and Layer 2. Most network tools operate at Layer 3 and above and sometimes aren't the best at helping you with Wireless LAN issues.
One final thing of note that we learned in the development and execution of this Wi-Fi Stress Test is:
Vendor Independence
I've personally been frustrated in the past whenever I've asked vendor QA or Engineering departments to confirm they have run these types of simple tests under load: they hem and haw and say the tests are not real-world, or they don't have the equipment, or don't have the time. My customers and clients are asking for some level of proof or confirmation that their investment will meet certain goals.
So instead of complaining, we just went out and purchased all the equipment for this test on our own. Nothing was donated or gifted or paid for by any vendor. (By the way, now that Wireless LAN Professionals owns this testing lab, we'd be very grateful if any readers know of anyone who would like to rent said lab equipment.)
This test was conceived, developed, and tuned by Wireless LAN Professionals. We tried to be as fair as possible to all vendors. I'm sure we failed at many levels. The design was to minimize any one vendor's advantages.
Some vendors brought their Access Points to be tested, then
took them with them when they left.
In order to make the test as vendor-independent as we could, we adapted the test to minimize any single vendor's advantages. As an example, we moved the test Access Point location so it was not centered over the iPads, so that Ruckus's beamforming and interference rejection wouldn't give them an unfair advantage. We had Xirrus turn off their other radios so they'd compete with only two radios like everyone else. These are just two examples, but we did try to make the test environment as vendor-agnostic as possible.
One little note here: you might have noticed all the client devices were made by Apple. In fact, multiple people asked if Apple was a sponsor. Sorry, no. Though it would have saved us a bunch of money if they had been. All the Apple devices were purchased from the Apple online store, just like anyone can.
To help with this independence, we invited any and all who were interested to come help volunteer with the test procedures or just observe the process. We were grateful to have over 30 different volunteers throughout the testing week come and help. These people represented 5 school districts, 2 universities, 7 WLAN vendors, and 4 different competing resellers.
A side benefit of this was the opportunity to rub shoulders and hang out talking tech with some very experienced and cool WLAN Professionals! In fact, it was so enjoyable to have WLAN Professionals from competing companies all working together that many people mentioned we should do this type of thing more often, perhaps once a quarter or so. I agree!
OK, the marketing guys weren't nearly as happy to be with competitors as the techies were. We all just liked hanging out and doing cool technical stuff together. There aren't many chances for that to happen.
Repeatability
Another goal was for the test to be easily repeatable by anyone who wanted to replicate it. So we opted for free or open-source software for our testing. Though I'm sure IXIA and Veriwave make some great stuff for lab work, we wanted our choice of tools to be accessible to anyone.
We chose FileZilla for FTP uploads/downloads, jPerf as a front end for iPerf testing, and Zapper on the iPad for Zap testing. The video player was written in HTML5 specifically for this test. (If anyone wants a copy, I'll check with the HTML coding house we purchased it from to see what the licensing issues might be.)
Answers to Questions
Over the years, those of us working as Wireless LAN Professionals have been bombarded by questions about Wi-Fi and Access Points. Here are just a few samples:
Why does my Wi-Fi at home work better than here?
Are we ready for a 1:1 initiative?
Can we handle 30 iPads in a single room streaming unique videos?
Just how much traffic can one Access Point handle?
Aren't all Access Points the same?
Why are we spending all this money on Enterprise Access Points? Can't we just buy one down at Best Buy?
We set out to help answer some of these questions.
Of course, in this simple, single-AP test, we aren't going to be able to answer all of them, but we wanted to take a good crack at it.
This test is NOT the best possible way to evaluate access points. We ignored many of the pressing issues behind why someone might buy one vendor's Access Point over another vendor's.
This is a simple test, only comparing throughput combined with video going to multiple iPads at the same time. There are many, many things we never addressed. Again: a very simple throughput/video test.
The goal was to hold all things constant, changing only the access point. Things like outside interference, changes in bags of water (people) moving between the Access Point and the iPads, etc. made for slight changes in the environment. On balance, we think we did an admirable job of holding things as constant as possible between tests.
Why are you breaking our Access Point? Well, this IS a Wi-Fi Stress Test, and, like the bridge, we wanted to push the envelope and see if we could get all Access Points to fail within our allotted resources (before we ran out of iPads).
This isn't reflective of the real world. How many places want to stream 30 videos AND transfer huge chunks of data at the same time? Again, refer to the "Stress Test" in our description. We were trying to take Access Points to the breaking point. In our defense, we have had customers ask specifically for this scenario.
In most environments you will have multiple Access Points covering any given area. That is quite true, and we might look at doing multiple-Access-Point tests in the future. This was a simple, single-Access-Point test.
Why didn't you use Android, Windows, or fill-in-the-blank? Basically, since Wireless LAN Professionals purchased all the testing equipment, we didn't want our money to be spent on non-Apple devices. No underlying conspiracy. We like Apple devices, and since we'll have to be using these on future projects, we opted to buy what we liked. I'm sure if we had a different client mix, with more Android and Windows, we might have seen different outcomes. (Plus, the MacBook Pros have 3x3:3 Wi-Fi NICs.)
You didn't test ______________. Fill in the blank with Architecture, Manageability, Security, Firewall, Layer 7, GUI, Price, Ease of Installation, or a myriad of other things we did not test. This is true. We didn't test any of those very important things. We had to set the boundaries somewhere, or this would have taken even longer than it has.
What code did each vendor use? We asked each Access Point vendor for not only the code version and the changes they made in their configurations, but also a URL so anyone wanting to repeat the test could download said code. This also ensured each participant didn't use any custom code, but a shipping version of their firmware. Many updated the firmware to the latest revision right in the testing lab. By the way, in subsequent documents we'll be listing each access point and these details so others can replicate this test.
Test Environment
The test classroom was graciously donated by Canyons School District. It was a 32' x 28' portable classroom with a measured 3 dB wall attenuation.
The test Access Point was located in a room next door, 8' from the wall, mounted on a T-rail on the ceiling.
All four MacBook Pros were color-coded and placed in the same location between tests, and their Wi-Fi was reset between tests as well. One was used as an FTP client, one as an iPerf client. The two others were spares in case we needed a replacement during the tests. The MacBook Pros had the latest OS X operating system installed and were updated prior to the tests.
There were 30 iPads; all were v4 and supported both 20 MHz and 40 MHz channels, but only with a 1x1:1 radio chain. Each was updated to the latest iOS software before the set of tests. Each iPad had a "Reset Network Settings" in between each test. We used the Safari browser and opened an HTML5 video player from the video server.
Each iPad was placed within a taped location: Portrait, Landscape, Stand to Front, Stand to Back, and Standing Vertical. This was to reflect that people don't always hold an iPad the same way, but we wanted test-to-test reliability. So each iPad stayed in its assigned location during the tests.
Test Monitoring
During the tests, there were a couple of projectors running so all could observe the test from different points
of view.
Management Station
The center projector showed the management station, running whichever test Access Point's management interface was in use. This allowed all to see the configuration, as well as track band-steering and band-balancing in real time. OK, sometimes we had to manually point at the screen and count how many iPads were on each band. This might be a good bit of feedback for the Access Point vendors to incorporate in their next revision.
Note: we also planned on using iPad minis coupled with Apple TVs to represent nearby classrooms where teachers were using their Apple TVs. During our tuning phase we found this added too much load to the channels and caused Access Points to fail even sooner. So we opted to put the Apple TVs back into the test only after the 30-iPad limit was reached. Since no Access Point met the 30-iPad limit, we never needed the Apple TVs.
Testing Process
Since the goal was to take the Access Points to their breaking limit, we didn't want to cause undue strain by having each iPad negotiate the association process in the middle of the test. So we went through the following steps in our testing process.
1. Reset all wireless clients: iPads had a "Reset Network Settings," and MacBook Pros did a Wi-Fi off/on.
2. The new test Access Point was installed, and we used the Fluke AirCheck to confirm the SSID and channel settings.
3. All iPads and MacBook Pros were associated to the new test SSID, a Safari browser was started, and the video player's web page was refreshed. Then they were put to sleep.
4. After all devices were asleep on the Wireless LAN, we fired up the FTP client and started an FTP download from the wired FTP server, wirelessly, to the MacBook Pro. Sometimes this was band-steered to 2.4 GHz channel 11, but most of the time it went to 5 GHz channel 36. The MBPs have a 3x3:3 Wi-Fi NIC and could support up to a 450 Mbps connection. This varied based on the target test Access Point.
5. A 600 MB file was transferred from the FTP server to the MBP and the time was captured. Then the same file was uploaded back to the FTP server and that time was recorded.
6. Then, on a different MacBook Pro at the front of the classroom, we started an iPerf session of 20 seconds using default TCP settings (no super-sized frames or anything special set on iPerf). This data throughput amount and rate was also recorded. This was the No Load test.
7. The total bytes transferred, divided by the total time, nets the Aggregate Data Rate used in subsequent graphs and charts (see the sketch after this list).
8. We then fired up the first five iPads connected via Wi-Fi and started the video playing. We continued to watch and track errors on the iPads as another set of FTP download, FTP upload, and iPerf runs was completed.
9. Then the next five iPads were started, and the process was repeated.
10. We continued with this process until more than 50% of the iPads were counted as dead, or the FTP time estimate exceeded 20 minutes.
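To make step 7 concrete, here is a minimal Python sketch of the Aggregate Data Rate calculation. The byte counts and times below are hypothetical placeholders, not measurements from our tests; substitute the values from your own tracking sheet.

# Aggregate Data Rate (step 7): total bytes moved in one test pass,
# divided by the total transfer time. All measurement values here are
# hypothetical stand-ins for the numbers recorded during a real pass.

FTP_FILE_BYTES = 600 * 1024 * 1024   # the 600 MB test file

ftp_download_secs = 95.0             # hypothetical: time to pull the file down
ftp_upload_secs = 110.0              # hypothetical: time to push it back up
iperf_bytes = 180 * 1024 * 1024      # hypothetical: bytes iPerf reported moving...
iperf_secs = 20.0                    # ...during its 20-second TCP run

total_bytes = 2 * FTP_FILE_BYTES + iperf_bytes   # FTP down + FTP up + iPerf
total_secs = ftp_download_secs + ftp_upload_secs + iperf_secs

aggregate_mbps = (total_bytes * 8) / total_secs / 1_000_000
print(f"Aggregate Data Rate: {aggregate_mbps:.1f} Mbps")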
We had a set of criteria to track whether or not an iPad's video issues should be counted as an error, thus getting a tick in our tracking sheet.
Test Video
We also had many questions concerning the test video process, so we'll address them right now. We were using a custom HTML5 video player hosted on a Windows Server 2008 IIS server with 16 GB of RAM and the file hosted on an SSD.
To test the video bandwidth used, we first used one of the MacBook Pro laptops, with an internal 3x3:3 Wi-Fi NIC connected at 450 Mbps to a test Access Point, loading the same video the iPads used. During this test, the MacBook Pro downloaded the video file at 120+ Mbps until the entire video was in its buffer.
Next we tried opening multiple video players to see how much traffic load we could generate through the Access Point and onto the video server. With over 30 video players running, the interface from the switch to the video server was processing over 240 Mbps.
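If you want to repeat the single-client video throughput check described above, a small script like the following is one way to do it. We used the browser-based HTML5 player for the actual tests, so this sketch only approximates that behavior by measuring the raw HTTP download rate; the URL is a placeholder for your own video server.

import time
import urllib.request

# Placeholder URL: point this at the 1080p test clip on your video server.
VIDEO_URL = "http://video-server.example/test-video-1080p.mp4"
CHUNK = 64 * 1024  # read in 64 KB chunks

start = time.time()
total_bytes = 0
with urllib.request.urlopen(VIDEO_URL) as resp:
    while True:
        chunk = resp.read(CHUNK)
        if not chunk:
            break
        total_bytes += len(chunk)
elapsed = time.time() - start

mbps = total_bytes * 8 / elapsed / 1_000_000
print(f"{total_bytes / 1_000_000:.1f} MB in {elapsed:.1f} s = {mbps:.1f} Mbps average")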
This is just to show we were NOT running a multicast video and NOT running a YouTube-compressed video, but a full 1080p video file sent to the HTML5 player in each iPad's Safari browser.
The video server was NOT the bottleneck; we could see on the spectrum analyzer and on the AirMagnet Wi-Fi Analyzer Pro screens that RF frequency saturation was causing the errors and the slow-down in throughput.
Using packet analysis during and after the tests, we could see the various traffic loads going to each iPad: sometimes between 2 and 3 Mbps, other times much slower.
We specifically chose this method and size of movie clip to cause as much load as possible on the Wi-Fi
equipment.
In the real world, by allowing 40 MHz channels, using smaller videos, etc., you could achieve better results than we saw in our lab tests.
Note the second Access Point has nearly the same retry rate of 13%, but its average data rate is only 49 Mbps, which burns quite a bit more airtime for the same amount of data transferred.
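A rough back-of-the-envelope calculation shows why: at the same retry percentage, a lower average data rate means every byte occupies the channel longer. In the sketch below, the 49 Mbps and 13% figures come from the example above, while the 100 Mbps comparison rate is assumed purely for illustration.

# Rough airtime estimate: time on air grows as the average data rate
# falls, and retries resend frames, inflating airtime further.
# 49 Mbps and 13% come from the text; 100 Mbps is an assumed comparison.

def airtime_secs(payload_mb, avg_rate_mbps, retry_pct):
    bits = payload_mb * 8_000_000
    return bits / (avg_rate_mbps * 1_000_000) * (1 + retry_pct / 100)

for rate in (100, 49):
    t = airtime_secs(100, rate, 13)   # 100 MB of traffic, 13% retries
    print(f"{rate:3d} Mbps average rate -> {t:5.1f} s of airtime")

# Output: roughly 9 s of airtime at 100 Mbps vs. 18 s at 49 Mbps, so the
# slower AP spends about twice as long on air moving the same data.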
These differences in how Access Points use the same classroom environment and the same iPad devices are something vendor engineering teams can start to look at as they refine their algorithms to be more effective.
Another example of these differences between Access Points can be seen in the spectrum views. Here are two screenshots of two different Access Points, both with 10 iPads plus FTP running on the same frequency.
Note the first Access Point is using far more RF time.
Note the second Access Point has more open time and availability in the airtime, allowing for more efficient use of the frequency.
The Results
Phew! After this long-winded diatribe on all the things we learned from the tests, many of you just skipped ahead to find these rankings. Here we go.
Again, these are just a one-time-slice view of a single Access Point test using an FTP load on top of multiple iPads streaming video. These rankings do NOT signify which Access Point is better in any one situation. They merely show the results of this set of tests.
First, we'll review the throughput and iPad video errors as groups compared to averages. Next we'll show the rankings for throughput, iPad video errors, and then an aggregate overall ranking.
After the rankings, there will be individual Access Point results comparing each Access Point to the test averages. Rather than spending your time on the complicated graphs with all Access Points compared against each other, it might be easier to compare each to the average. The scales of all graphs have been equalized, so things should be as consistent as possible.
As a side note: during the testing procedures, many of the volunteers could feel that one Access Point was performing better than the others. I too had those feelings: sometimes that a given iPad was especially problematic, or that the Linksys did amazingly well. But after the tests were complete and the numbers were all tallied and analyzed, we found many of those feelings to be unjustified by the actual data collected.
Note the wide ranges, especially at the very beginning, when all Access Points were under a No Load situation. There is quite a large spread, from 21 Mbps to 48 Mbps. Astounding, since most were running the same type of silicon. This goes to show the differences each vendor applies with antennas, tweaks to the radios and amplifiers, as well as logic and algorithms in the firmware.
Here is the same type of graph, but with all tested Access Points together. This one is very hard to read and comprehend. Obviously they all start out better on the left, under No Load, but get progressively worse as we added more iPads. Some died earlier than others. It is better to be up and to the right.
Don't worry; we'll follow up with individual graphs later in the document that are easier to read.
Here is a graph showing all Access Points together. This can be a bit confusing with all the lines muddled together. On a later page, we'll show each Access Point individually against the group averages for easier analysis.
Down and to the right is better.
Throughput Ranking
This ranking shows the Access Points that had the best Aggregate Throughput scores. Again, this is the total number of bytes transferred (both upload and download for FTP, added to the iPerf download) divided by the total time.
1. Ruckus 7982
2. Cisco 2602i
3. HP 430
4. Xirrus 4820
5. Cisco 3602i
6. Aerohive AP330
7. Aruba 135
8. Juniper 532
9. HP 460
10. Aruba 105
11. Meraki MR24
12. Aerohive AP121
13. Ruckus 7363
14. Meraki MR16
15. Ubiquiti UniFi Pro
16. Linksys EA4500
We didn't touch on any of the more difficult and harder-to-quantify issues like WLAN architecture, manageability, services at the edge, security, forwarding, dynamic VLANs, etc.
We also didn't touch on issues that happen with more than one Access Point. Almost all scenarios will have more than one AP, and we didn't address this at all.
We also mostly tested only the iPads, which support just 1x1:1 spatial streams. There are many more types of Wi-Fi devices that are better equipped to handle multiple-spatial-stream traffic.
Not to mention comparing price and features and scalability. Phew!
We'd like to encourage feedback and ideas for further tests. Please contact me at [email protected] with your thoughts.
Keith Parsons
Managing Director
Wireless LAN Professionals, Inc.
281 South Vineyard Road - #104
Orem, UT 84058
Aerohive AP121
This Access Point was configured by an Aerohive Engineer.
Aerohive AP330
This 3x3:3 Access Point was configured by an Aerohive Engineer.
Aruba 105
This Aruba was using Aruba Instant with default configurations.
Aruba 135
This was using Aruba Instant and default configurations.
Cisco 2602i
This was configured with help from the local SE and some volunteers who run Cisco networks in large
University settings.
Cisco 3602i
This was configured with help from the local SE and some volunteers who run Cisco networks in large
University settings.
HP 430
This was configured by an HP engineer.
HP 460
Configured by an HP engineer.
Juniper 532
Configured by a Juniper SE.
Linksys EA4500
Chosen because it was supposedly the best SOHO Access Point on the market; we wanted to test SOHO vs. Enterprise Access Points. It was configured with its standard web interface and set to the test parameters. It did not have band-steering, so it was only using one 5 GHz radio.
Meraki MR16
This device was configured and managed remotely by a Meraki SE during the test. Meraki felt like they
needed more warning before this test in order to better prepare. Their concerns are noted here for
completeness.
Meraki MR24
This device was configured and managed remotely by a Meraki SE during the test. Meraki felt like they
needed more warning before this test in order to better prepare. Their concerns are noted here for
completeness.
Ruckus 7363
This 2x2:2 Access Point from Ruckus was configured by a local Ruckus SE.
Ruckus 7982
This 3x3:3 Access Point was configured by a local Ruckus SE.
Xirrus 4820
This 8-radio array was re-configured to only use two radios, one on channel 11 and the other on channel 36.
This Access Point was configured by a local Xirrus SE.
Define
Work with your team to make sure everyone agrees on just what they want from their Wireless LAN.
Design
Using state-of-the-art techniques and technologies to meet the design requirements within the design constraints.
Implement
Work with local contractors to install cabling, backhaul, and on-site Access Points.
Validate
Post-installation validation surveys are critical and must be done for each installation: how else do you know that it meets your design goals?
Evaluate
Full project analysis from end to end to confirm the customer received the very best possible Wi-Fi available within their budget.
Tools
We use professional Wi-Fi tools to meet most customers' needs quickly and efficiently. Not only do we use these tools, but we can teach your team how to best implement them in your own Wireless LAN.
Testing
As you've seen in this report, we have the equipment and skills to do Wi-Fi Stress Testing or other Wi-Fi tests. Some of these tests have been for specific customers, and we can customize test processes for your needs too.
Web Resources
Also feel free to check out our website for more information, white papers, downloads, and of course you can listen to over 40 episodes of our podcast, Wireless LAN Weekly.
http://WirelessLANProfessionals.com - http://WLANPros.com