Broadband Speed Study
Table of Contents
EXECUTIVE SUMMARY
METHODOLOGY
SUMMARY OF FINDINGS
PERFORMANCE VARIATION BY ISP AND SERVICE TIER
PERFORMANCE VARIATION BY ACCESS TECHNOLOGY
PERFORMANCE VARIATION BY SERVICE TIER
Download Peak Period Throughput
Upload Peak Period Throughput
Burst Versus Sustained Download Throughput
Latency
PERFORMANCE VARIATION BY TIME OF DAY
ACTUAL VERSUS ADVERTISED SPEEDS
ACKNOWLEDGEMENTS
Executive Summary
To make informed choices about purchasing and using broadband, consumers need to have
access to basic information about broadband performance. Will a particular offering allow me
to browse the web quickly and easily? Will it enable me to use new applications that help me
maintain my health, search for a job, or take courses online? What should I look for in a
provider if I want to watch high definition online video or play online video games? Does a
given speed tier have sufficient upload capacity to enable video conferencing? Will a higher
speed, higher priced service improve my Internet experience? Can I get by with a lower priced
service? And does the speed a provider advertises match the actual speed I will receive at my
home? To help answer these questions, this Report presents the results of the first rigorous,
nationwide study of actual home broadband performance in the United States.
Currently, information that would allow consumers to answer these questions is not readily
available in a consistent and easily understandable form, and studies show that consumers’
awareness of their broadband service and its characteristics is limited. A recent FCC survey
found that 80 percent of consumers did not know what speed they purchased from their
Internet Service Provider (ISP),1 and during the course of the study outlined below, we found
that a more modest but still sizable 49 percent of consumer volunteers inaccurately reported
the advertised broadband speed they believed they had purchased from their ISP. Another
study conducted in 2010 found that 13 percent of consumers who have broadband in the
home do not know whether they purchased a basic or premium service.2
This lack of consumer awareness of basic elements of broadband performance led to the
recommendation in the National Broadband Plan (NBP), released last year, that the
Commission undertake several initiatives to help improve the availability of information for
consumers.3
As part of the NBP, in March of 2010 the FCC made available a consumer-initiated online test
of broadband speed.4 The purpose of the Consumer Broadband Test is to give consumers
additional information about the quality of their broadband connections across their chosen
ISPs’ networks and to increase awareness about the importance of broadband quality in
accessing content and services over the Internet. The Consumer Broadband Test has gathered
data about how well the Internet is functioning, both generally and for specific ISPs at specific
times. But the results of the software-based Consumer Broadband Test do not always capture
the baseline connection quality provided by the consumer’s broadband service: the core
connectivity between an ISP and its subscribers, rather than between the rest of the Internet
and those subscribers. For instance, results of software-based tests can vary depending on the
end user’s computer, the type of connection between the end user’s computer and the ISP’s
network (e.g., the use of an in-home WiFi router may affect test results), the number of end
user devices connected to a broadband service, and the physical distance of the end user from
the testing server. Additionally, there is no standard testing methodology for software-based
broadband performance tests, and the Consumer Broadband Test therefore uses two
alternative testing methodologies, which also affects the results.5 In order to assess the speed
claims made by ISPs, and to see how particular activities – such as browsing the web or
watching streaming video – are impacted by different speeds, we decided to complement the
more general Consumer Broadband Test with more consistent tests of the speed of broadband
delivered to American homes.
The Commission has opened a public inquiry into the availability of information regarding
broadband performance. Specifically, the Commission has issued a Notice of Inquiry on the
topic of general consumer information and disclosure requirements,6 which sought comment
on the types of information that consumers need to make informed choices. In April 2011,
the Commission also issued a Public Notice seeking input on the particular types of
information that are most useful to consumers in assessing which broadband services to
purchase, and in particular which technical parameters have the most significant effects on
common consumer uses for broadband.7
This Report responds to another NBP recommendation: that the Commission obtain and
publicly release detailed and accurate measurements of consumer broadband performance on a
national level.8 Such measurements can help inform consumers and create a mechanism for
checking ISP broadband performance claims and for comparing ISPs in meaningful ways.
This Report presents results of the first nationwide performance study of residential wireline
(or “fixed,” as opposed to mobile) broadband service in the United States using measurement
technology deployed in the consumer’s home, focusing on three technologies—digital
subscriber line (DSL), cable, and fiber-to-the-home.9 The study examined service offerings
from 13 of the largest broadband providers10—which collectively account for approximately 86
percent of all U.S. wireline broadband connections—using automated, direct measurements of
broadband performance delivered to the homes of thousands of volunteer broadband
subscribers during March 2011.11 This Report focuses on major findings of this study, while a
separate Appendix provides a detailed description of the process by which the measurements
were made and describes each test that was performed. In addition, the Commission is making
available the following resources: electronic copies of the charts included in the Report; data
sets for each of the charts in the Report; resources regarding the underlying methodology by
which the data was collected and calculated; tabular results for each test performed and data
sets for recorded data for March 2011; and the complete raw bulk data set for all tests run
during the testing period.12
The results contained in this Report will enable consumers to compare the actual performance
of different broadband offerings with a new level of detail and accuracy. In addition, the
methodology developed in this study can serve as a tool to help broadband providers,
including those that did not participate in this process, measure and disclose accurate
information regarding the performance of their broadband services. The Appendix and the
complete raw bulk data set will be useful to the research community in examining performance
characteristics of broadband services in the United States, and in encouraging the development
of new broadband performance testing methodologies in the future. We hope that
independent investigation of this data set will provide additional insights into consumer
broadband services.
Unless explicitly stated otherwise, all of the findings in this Report reflect performance during
the peak consumer usage hours of weekdays from 7:00 pm - 11:00 pm local time. We focus on
this period of time since it is during such “busy periods” that consumer usage of broadband
services is greatest and it is also during this period that the greatest performance degradation
occurs.
Throughout this Report we use the term “advertised speed” to refer to the speed ISPs use to
advertise and market a particular broadband service, e.g., “1.5 Mbps13 DSL” versus “7 Mbps
DSL.” Generally ISPs do not expressly guarantee advertised speeds, but rather may describe
an advertised speed as an “up to” speed, suggesting that consumers can expect to experience
performance up to the advertised speed, with actual performance varying based upon network
conditions and other factors.
We also use the term “sustained speed” throughout this Report. Broadband Internet access
service is “bursty” in nature. On a short time scale, broadband speeds or information rates
may vary widely, at times approaching or even exceeding advertised speeds and at other
times—due to network congestion—slowing to rates which may be well below advertised
speeds. In this Report, to provide an estimate of long-term average broadband performance,
we define sustained speed as speed averaged over a period of several seconds (note that
sustained speed does not necessarily mean that actual speed stays above the sustained speed
average for the entire period).14
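The sustained-speed definition above can be sketched in a few lines of arithmetic. In the sketch below, the per-second throughput samples and the 7 Mbps advertised tier are hypothetical values for illustration only, not measurements from the study:

```python
# Sustained speed: throughput averaged over a multi-second window.
# Sample data below is hypothetical, for illustration only.

def sustained_speed(samples_mbps):
    """Average a list of per-second throughput samples (Mbps)."""
    return sum(samples_mbps) / len(samples_mbps)

def percent_of_advertised(actual_mbps, advertised_mbps):
    """Actual speed as a percentage of the advertised 'up to' speed."""
    return 100.0 * actual_mbps / advertised_mbps

# A bursty connection on a hypothetical "7 Mbps" tier: brief peaks
# above the advertised rate, congestion dips well below it.
samples = [7.4, 7.1, 5.2, 4.8, 6.0, 5.5, 6.3, 5.9, 5.6, 6.2]
avg = sustained_speed(samples)  # ~6.0 Mbps
print(f"{avg:.1f} Mbps = {percent_of_advertised(avg, 7.0):.0f}% of advertised")
```

Note that individual samples exceed 7 Mbps while others fall well below it; only the multi-second average is reported as the sustained speed.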
Based on the foregoing, the major findings of this study include the following:
· Actual versus advertised speeds. For most participating broadband providers,
actual download speeds are substantially closer to advertised speeds than was
found in data from early 2009 and discussed in a subsequent FCC white paper,
though performance can vary significantly by technology and specific provider.15
· Sustained download speeds. The average16 actual sustained download speed
during the peak period was calculated as a percentage of the ISP’s advertised
speed. This calculation was done for different speed tiers offered by each ISP.
o Results by technology:
§ On average, during peak periods DSL-based services delivered
download speeds that were 82 percent of advertised speeds, cable-
based services delivered 93 percent of advertised speeds, and
fiber-to-the-home services delivered 114 percent of advertised
speeds.17
§ Peak period speeds decreased from 24-hour average speeds18 by 0.4
percent for fiber-to-the-home services, 5.5 percent for DSL-based
services, and 7.3 percent for cable-based services.
o Results by ISP. Peak period download speeds varied from a high of 114
percent of advertised speed to a low of 54 percent of advertised speed.
§ Only three ISPs had speed decreases of 10 percent or greater
during the peak period (as compared to 24-hour average speeds).
· Sustained upload speeds. Peak period performance results for upload speeds
were similar to or better than those for download speeds.
o Upload speeds were not significantly affected during peak periods,
showing an average decrease of only 0.7 percent from the 24-hour average
speed.
§ Results by technology: On average, DSL-based services delivered 95
percent of advertised upload speeds, cable-based services
delivered 108 percent, and fiber-to-the-home services delivered
112 percent.
§ Results by ISP: Upload speeds among ISPs ranged from a low of 85
percent of advertised speed to a high of 125 percent of advertised
speed.
· Latency. Latency is the time it takes for a packet of data to travel from one
designated point to another in a network. Since many communication protocols
depend upon an acknowledgement that packets were received successfully, or
otherwise involve transmission of data packets back and forth along a path in the
network, latency is often measured by round-trip time. Round-trip time is the
time it takes a packet to travel from one end point to another, and for an
acknowledgement of successful transit to be received back. In our tests, latency is
defined as the round-trip time from the consumer’s home to the closest server19
used for speed measurement within the provider’s network.
o During peak periods, latency increased across all technologies by 6.5
percent, which represents a modest drop in performance.
§ Results by technology.
· Latency was lowest in fiber-to-the-home services, and this
finding was true across all fiber-to-the-home speed tiers.
· Fiber-to-the-home services provided 17 milliseconds (ms)
round-trip latency on average, while cable-based services
averaged 28 ms, and DSL-based services averaged 44 ms.
§ Results by ISP. The highest average round-trip latency among ISPs
was 75 ms, while the lowest average latency was 14 ms.
· Effect of burst speed techniques. Some cable-based services offer burst speed
techniques, marketed under names such as “PowerBoost,” which temporarily
allocate more bandwidth to a consumer’s service. The effect of PowerBoost is
temporary—it typically lasts no more than 15 to 20 seconds—and may be reduced by
other broadband activities occurring within the consumer household.20 Burst
speed is not equivalent to sustained speed. Sustained speed is a measure of long-
term performance. Activities such as large file transfers, video streaming, and
video chat require the transfer of large amounts of information over long periods
of time. Sustained speed is a better measure of how well such activities may be
supported. However, other activities such as web browsing or gaming often
require the transfer of moderate amounts of information in a short interval of
time. For example, a transfer of a web page typically begins with a consumer
clicking on the page reference and ceases when the page is fully downloaded.
Such services may benefit from burst speed techniques, which for a period of
seconds will increase the transfer speed. The actual effect of burst speed depends
on a number of factors explained more fully below.
o Burst speed techniques increased short-term download performance by as
much as 52 percent during peak periods for some offerings, and as little as
6 percent for other offerings.
· Web Browsing, Voice over Internet Protocol (VoIP), and Streaming Video
o Web browsing. In specific tests designed to mimic basic web browsing—
accessing a series of web pages, but not streaming video or using video
chat sites or applications—performance increased with higher speeds, but
only up to about 10 Mbps. Latency and other factors limited
performance at the highest speed tiers, where consumers are unlikely to
experience much, if any, improvement in basic web browsing from increased
speed (e.g., moving from a 10 Mbps broadband offering to a 25 Mbps
offering).
o VoIP. VoIP services, which can be used with a data rate as low as 100
kilobits per second (kbps) but require relatively low latency, were
adequately supported by all of the service tiers discussed in this Report.
However, VoIP quality may suffer during times when household
bandwidth is shared by other services. The VoIP measurements utilized
for this Report were not designed to detect such effects.
o Streaming Video. Test results suggest that video streaming should work
well across all technologies tested, provided that the consumer has
selected a broadband service tier that matches the quality of streaming
video desired. For example, standard video is currently commonly
transmitted at speeds below 1 Mbps, while high quality streamed video
might require 2 Mbps or more. Consumers should understand the
requirements of the streaming video they want to use and ensure that their
chosen broadband tier can sustain the necessary speed, particularly during
peak periods.
This Report and associated supporting material can be found online as listed below:
Online Resources
· Report and Appendix: http://www.fcc.gov/measuring-broadband-america
· Electronic copies of charts included in Report: http://www.fcc.gov/measuring-
broadband-america/charts
· Data sets for each of the charts in the Report: http://www.fcc.gov/validated-
march-data-2011
· Resources regarding the underlying methodology by which the data was collected
and calculated: http://www.fcc.gov/measuring-broadband-america/methodology
· Tabular results for each test performed and data sets for recorded data for March
2011: http://www.data.fcc.gov/download/measuring-broadband-
america/statistical-averages-2011.xls
· Complete raw bulk data for all tests run during the testing period:
http://www.fcc.gov/measuring-broadband-america/raw-bulk-data-2011
Methodology
The study detailed in this Report, which took place from February through June of 2011,
represents the first comprehensive analysis of wireline broadband performance in the United
States. The techniques used in the study, which are described in more detail below and in the
Appendix, were developed through a collaborative process involving 13 major ISPs, academics
and other researchers, consultants, and consumer organizations.
It is important to note some limitations in this effort. Only the most popular service tiers
within an ISP’s offerings were tested, even though some service providers may offer additional
tiers.21 In addition, the data collected is only analyzed at the national level, and does not permit
meaningful conclusions about broadband performance at the local level.22 The results only
include measurement of the data path from content source to the consumer, and any
bandwidth limitations or delays incurred in the consumer’s home or in segments of the
Internet outside an ISP’s network are not reflected in the results.
For practical reasons, certain consumer broadband technologies are not analyzed in this
Report. Mobile broadband services, which are increasingly important to consumers, were not
included in this study due to the special challenges inherent in measuring the actual
performance of mobile networks. The FCC has issued a Request for Information on
measurement approaches for mobile broadband, as well as a Public Notice on this topic,23 and
is undertaking additional efforts to collect performance data on mobile data services. Due to
the small number of consumer volunteers for satellite and fixed wireless services in the current
study, limited data was collected on these technologies, and consequently these results are not
included in this study. However, the data captured for both of these technologies is included
in the raw bulk data set.
At the outset of this study, the Commission launched an open competition for entities that
could assist with the design and management of a study of broadband performance. The FCC
ultimately selected SamKnows to administer the FCC’s broadband performance testing
initiative. SamKnows is an analytics company that had recently completed a similar study of
broadband performance for Ofcom, the United Kingdom’s telecommunications regulatory
agency. In July 2010 Commission staff held an open meeting to announce the start of the
project and to seek input from interested parties. At the inaugural and subsequent meetings,
industry, consumer groups, and academic attendees expressed an interest in participating in the
study.
Overall, 22 stakeholders contributed to this project, including 13 wireline ISPs; academic
researchers from MIT and Georgia Tech; technology vendors and consumer groups; and other
industry representatives.24 Most stakeholders, including all participating ISPs, signed a Code of
Conduct, included in the Appendix to this Report, which helped ensure the integrity of the
study and its results. Participants contributed significantly to this project by, among other
things: creating and agreeing on a standard methodology for testing broadband performance;
collaborating on the parameters for these tests; providing proposals for how to analyze the
data; validating the panelist information to ensure that the test results were properly correlated
to the correct service tier; and developing strategies to maintain the privacy of the panelists and
the integrity of the testing. Commission staff also held a number of open meetings where the
public was able to express views on the study.
The basic objective of the study was to measure broadband service performance as delivered
by an ISP to the home of a consumer. As illustrated below, many factors contribute to end-to-
end consumer broadband performance.
Figure 1: Network diagram
Not all elements of broadband performance are under the control of the consumer’s ISP, and
there are factors that affect a consumer’s broadband experience that were not measured in this
study. For example, broadband performance experienced by the consumer may be affected by
the capabilities or limitations of the consumer’s own computer or local area network (LAN)
devices such as home WiFi routers, or by the performance of the content and application
providers the consumer is accessing, such as a search engine or video streaming site. In these
instances, the broadband provider is controlling only a portion of the chain that determines the
overall performance experienced by the consumer. There are other aspects of broadband
performance that are technically outside the ISP’s network, but that can be affected by the
ISP’s behavior, such as peering arrangements, which are the policies by which an ISP exchanges
traffic with another ISP. In future performance measurements, it will be important to keep in
mind that these arrangements by ISPs—which were not measured in the present study—can
affect broadband performance. The ultimate goal is to know how well consumer broadband
is working in actual use conditions.
This study focused on those elements of the Internet pathway under the direct or indirect
control of a consumer’s ISP on that ISP’s own network: from the consumer gateway—the
modem used by the consumer to access the Internet—to a nearby major Internet gateway
point (from the modem to the Internet gateway in Figure 1, above). This focus aligns with the
broadband service advertised to consumers and allows a direct comparison across broadband
providers of actual performance delivered to the household.
More than 78,000 consumers volunteered to participate in this study and a total of
approximately 9,000 consumers were selected as potential participants and were supplied with
specially configured routers. The data in this Report is based on a statistically selected subset
of those consumers—approximately 6,800 individuals—and the measurements taken in their
homes during March 2011. The participants in the volunteer consumer panel were recruited
with the goal of covering ISPs within the U.S. across all broadband technologies, although only
results from three major technologies—DSL, cable, and fiber-to-the-home—are reflected in
this Report.25 To account for network variances across the United States, volunteers were
recruited from the four Census Regions: Northeast, Midwest, South, and West. Within each
Census Region, consumers were selected to represent broadband performance in three typical
speed ranges: less than 3 Mbps, between 3 and 10 Mbps, and greater than 10 Mbps.26
The testing methodology itself required innovation on both the consumer, or “client,” side and
on the ISP, or “server,” side. The server-side infrastructure, which comprised reference
measurement points that were distributed geographically across nine different U.S. locations,
was made available to SamKnows for the project by M-Lab, a non-profit organization that
supports Internet research activities.27 Each consumer participant’s broadband performance
was measured from a hardware gateway in his or her household to the off-net test node that
had the lowest latency to the consumer’s address.
On the “client” side of the test, consumers self-installed a measurement gateway that was
provided by SamKnows. These gateways, or “Whiteboxes,” were installed between the
consumer’s computer and Internet gateway and came pre-loaded with custom testing software.
The “Whitebox” software was programmed to automatically perform a periodic suite of
broadband measurements while excluding the effects of consumer equipment and household
broadband activity. This approach permitted a direct measure of the broadband service an ISP
delivered to a consumer’s household.
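One common way for a measurement gateway to exclude the effects of household activity is to sample its traffic counters and defer a test when cross-traffic exceeds a threshold. The sketch below illustrates that idea only; the 64 kbps threshold and the counter interface are assumptions for illustration, not the study's actual implementation:

```python
# Hypothetical sketch of a cross-traffic guard such as a measurement
# gateway might use. The threshold and byte-counter interface are
# illustrative assumptions, not the study's actual implementation.

def should_run_test(bytes_before, bytes_after, interval_s,
                    threshold_kbps=64):
    """Defer a measurement if household traffic through the gateway
    exceeded `threshold_kbps` during the sampling interval."""
    kbps = (bytes_after - bytes_before) * 8 / 1000 / interval_s
    return kbps <= threshold_kbps

# 10 KB in 10 s = 8 kbps of background chatter: OK to run a test.
print(should_run_test(0, 10_000, 10))      # True
# 1 MB in 10 s = 800 kbps: a household download is active, so defer.
print(should_run_test(0, 1_000_000, 10))   # False
```

Deferring in this way keeps the measured throughput attributable to the ISP's delivered service rather than to contention inside the home.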
The participating ISPs also volunteered to establish two kinds of additional reference
measurement points within their own networks. Some ISPs installed a measurement reference
point within their networks at a major peering facility, which represented the mirror image of
the SamKnows peering reference points. These reference points served as a validity check and
verified that the SamKnows measurements were not significantly affected by peering
relationships or other network degradations. Some ISPs also installed measurement points at
various ISP interior network points that did not correspond to the M-Lab peering locations.
These reference points were principally intended to test for performance degradations caused
by bandwidth limitations in “middle mile” segments,28 or for effects caused by the specific
design of the network. Test results demonstrated that measurements from all ISP-installed
reference points, regardless of location, agreed closely with the results from the M-Lab peering
reference measurements, which strengthens confidence in the results. The general
correspondence between results taken from the M-Lab and the independent ISP reference
points also suggests that among the ISPs tested, broadband performance that falls short of
expectations is caused primarily by the segment of an ISP’s network from the consumer
gateway to the ISP’s core network.29 The results contained in this Report are based on the
measurements obtained from the M-Lab peering reference points only, while the raw bulk data
set contains all results, including those from the ISP-only reference points.
In the back-and-forth communications found in computing applications, latency is
also additive: each round trip in a series of exchanges contributes its delay to
the time it takes to complete a computing process. Thus, latency can have a
significant effect on the performance of
applications running across a computer network. As service speeds increase, the impact of
network latency can become more noticeable, and have a more significant impact on overall
performance.
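The additive effect of latency can be illustrated with a simple model: a transaction needing several sequential round trips pays the full round-trip time for each, on top of raw transmission time. The 500 KB page size, the eight round trips, and the 25 Mbps rate below are hypothetical assumptions; the 17, 28, and 44 ms averages are the per-technology latencies reported in this study:

```python
# Illustrative arithmetic only: total delay from sequential round
# trips plus transmission time, using this study's average latencies.

def transfer_time_ms(round_trips, rtt_ms, bytes_total, mbps):
    """Sequential round-trip delay plus raw transmission time (ms)."""
    return round_trips * rtt_ms + bytes_total * 8 / (mbps * 1000)

# Hypothetical: fetching a 500 KB page that needs 8 sequential round
# trips over a 25 Mbps service. Transmission alone takes 160 ms; the
# rest of the difference between technologies is pure latency.
for tech, rtt in [("fiber", 17), ("cable", 28), ("DSL", 44)]:
    print(f"{tech}: {transfer_time_ms(8, rtt, 500_000, 25):.0f} ms")
```

As the service speed rises, the transmission term shrinks while the round-trip term stays fixed, which is why latency's share of total delay grows at higher speed tiers.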
One of the key factors that affects all aspects of broadband performance is the time of day. At
peak hours, designated for the purpose of this study as between 7:00 pm and 11:00 pm local
time on weeknights, more people are attempting to use the Internet simultaneously, giving rise
to a greater potential for congestion and degraded user performance. Unless otherwise noted,
this Report concentrates on performance during peak hours as the period of highest interest to
the consumer, while results for 24-hour averages and weekend performance are included in the
Appendix.
This Report highlights the results of the following tests of broadband speed and latency, as
measured on a national basis across DSL-, cable-, and fiber-to-the-home technologies:
· Sustained download speed: throughput in Mbps utilizing three concurrent
TCP connections measured at the 25-30 second interval of a sustained transfer
· Sustained upload speed: throughput in Mbps utilizing three concurrent TCP
connections measured at the 25-30 second interval of a sustained transfer
· Burst download speed: throughput in Mbps utilizing three concurrent TCP
connections measured at the 0-5 second interval of a sustained transfer
· Burst upload speed: throughput in Mbps utilizing three concurrent TCP
connections measured at the 0-5 second interval of a sustained transfer
· UDP latency: average round trip time for a series of randomly transmitted
user datagram protocol (UDP) packets distributed over a long timeframe
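The burst (0-5 second) and sustained (25-30 second) windows above can be illustrated by averaging per-second byte counts over each interval. The traffic profile below is hypothetical, shaped like a PowerBoost-style transfer that starts fast and settles to a steady rate:

```python
# Hypothetical per-second byte counts summed across three concurrent
# TCP connections during a 30-second transfer (illustrative only).

def window_mbps(bytes_per_sec, start, end):
    """Average throughput in Mbps over seconds [start, end)."""
    total = sum(bytes_per_sec[start:end])
    return total * 8 / (end - start) / 1_000_000

# Fast first five seconds, then a steady sustained rate.
counts = [2_500_000] * 5 + [1_250_000] * 25   # bytes per second
burst = window_mbps(counts, 0, 5)        # 0-5 s window  -> 20.0 Mbps
sustained = window_mbps(counts, 25, 30)  # 25-30 s window -> 10.0 Mbps
print(burst, sustained)
```

Measuring the sustained figure at the 25-30 second interval ensures any transient boost has expired, so the two windows separate burst from steady-state performance.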
Summary of Findings
We present the summary of our findings below. As noted earlier, the full results of all 13 tests
from March 2011 are available at http://www.data.fcc.gov/download/measuring-broadband-
america/statistical-averages-2011.xls. The Commission is separately releasing a validated33 set
of the data on which this Report was based, together with a non-validated data set covering the
period from February 2011 through June 2011. The results below are reported by
performance variation by ISP and by technology (DSL, cable, and fiber-to-the-home), for the
most popular service tiers offered by each ISP. We focus on peak periods since this is the
period of time when service performance is likely to suffer and it is also the period of highest
utilization by the average consumer. As a final note, the results presented below represent
average34 measured performance across a range of consumers, and while these results are
useful for comparison purposes, they should not be taken as an indicator of performance for
any specific consumer.
Chart 1 shows average download performance over a 24-hour period and during peak periods
across all ISPs. Most ISPs delivered actual download speeds within 20% of advertised speeds,
with modest performance declines during peak periods.35
As shown in Chart 2, upload performance is much less affected than download performance
during peak periods. Almost all ISPs reach 90 percent or above of their advertised rate, even
during peak periods.
Chart 1: Average peak period and 24-hour sustained download speeds as a percentage of
advertised, by provider
[Chart image: bars per provider showing Actual Download / Advertised Download Speed (%), 0-140%, comparing 24-hr Mon-Sun averages with 7pm-11pm Mon-Fri peak averages.]
Chart 2: Average peak period and 24-hour sustained upload speeds as a percentage of
advertised, by provider
[Chart image: bars per provider showing Actual Upload / Advertised Upload Speed (%), 0-140%, comparing 24-hr Mon-Sun averages with 7pm-11pm Mon-Fri peak averages.]
In general, we found that even during peak periods, the majority of ISPs delivered
actual speeds of at least 80 percent of advertised rates, though there was
considerable variation among the ISPs tested, as shown in Chart 3. As noted previously,
performance was also found to vary by technology. Results from a particular company may
include different technology platforms (e.g., results for Cox include both their DOCSIS 2.0 and
DOCSIS 3.0 cable technologies; results for AT&T include both DSL and U-Verse36).
Chart 3: Average peak period sustained download and upload speeds as a percentage of
advertised, by provider
[Chart image: bars per provider showing Actual / Advertised Speed (%), 0-140%, for sustained download versus sustained upload.]
Chart 4: Average peak period sustained download and upload speeds as a percentage of
advertised, by technology
[Chart image: Actual / Advertised Speed (%), 0-140%, for sustained download and upload by technology: Cable, DSL, Fiber.]
[Chart image: Actual / Advertised Speed (%), 40-160%, per provider (AT&T, Cablevision, CenturyLink, Charter, Comcast, Cox, Frontier, Insight, Mediacom, Qwest, TimeWarner, Verizon (DSL), Verizon (Fiber), Windstream) across advertised download speed tiers from 0.77 to 35 Mbps. Note: all speed tiers on the x-axis are represented as categories, not continuous variables; throughout the report, technologies are denoted by different shapes (DSL, Cable, Fiber).]
[Chart image: Actual / Advertised Speed (%), 40-160%, per provider across advertised upload speed tiers from 0.128 to 35 Mbps.]
[Chart image: Actual / Advertised Speed (%), 40-140%, per provider across advertised download speed tiers from 0.77 to 35 Mbps.]
Chart 8: Average peak period burst upload speed as a percentage of advertised speed, by
provider
[Chart 8 image omitted. Vertical axis: Actual / Advertised Speed (%), 40% to 240%. Horizontal axis: Advertised Speed (Mbps), with tiers from 0.128 to 35 Mbps. One series per ISP.]
The use of transient performance boosting features such as PowerBoost is less prevalent for
upstream connections. The test results showed marked improvement in burst upload speeds on
some but not all service tiers, suggesting that at least some ISPs apply PowerBoost to
upstream traffic. For example, in Chart 8, Cox and Comcast achieve average burst rates of 130
percent to over 200 percent of advertised speed on service tiers ranging from 1 Mbps to 2
Mbps.
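The burst-versus-sustained distinction above can be illustrated with a minimal sketch. The function, the three-second burst window, and the sample trace are illustrative assumptions, not the study's actual test code:

```python
def burst_and_sustained(samples, burst_window=3):
    """Split a per-second throughput trace (Mbps) into a burst phase
    (the first few seconds, where features like PowerBoost may apply)
    and a sustained phase, returning the average of each.
    The three-second window is an assumption for illustration."""
    burst = samples[:burst_window]
    sustained = samples[burst_window:]
    return (sum(burst) / len(burst), sum(sustained) / len(sustained))

# Invented trace for a 2 Mbps upload tier: boosted early, then steady.
trace = [4.2, 4.0, 3.8, 2.1, 2.0, 2.0, 1.9, 2.0]
burst, sustained = burst_and_sustained(trace)
advertised = 2.0
print(f"burst: {burst / advertised:.0%} of advertised")        # burst: 200% of advertised
print(f"sustained: {sustained / advertised:.0%} of advertised")  # sustained: 100% of advertised
```

A burst-to-advertised ratio well above the sustained ratio, as here, is the signature the Report attributes to transient boosting.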
Latency
As can be seen from Chart 9,39 latency varies by technology and by service tier.40 Fiber-to-the-
home had the lowest latency, averaging 17 ms during the peak period; cable averaged 28 ms;
and DSL averaged 44 ms, ranging as high as approximately 75 ms.41 Although the test results
found variance in latencies among technologies, all of the latencies measured here should be
adequate for common Internet applications such as VoIP.
[Chart 9 image omitted. Vertical axis: latency in milliseconds, 0 to 70. Horizontal axis: Advertised Speed (Mbps), with tiers from 0.77 to 50 Mbps. Series: individual DSL, Cable, and Fiber results, plus an average line for each technology.]
Chart 10 displays average web page loading42 time by speed tier. Web pages load much faster
as broadband speed increases, but beyond 10 Mbps, performance increases for basic web
browsing are slight. The data indicate that a consumer subscribing to a 10 Mbps speed tier is
unlikely to experience a significant performance increase in basic web browsing—i.e., accessing
web pages, but not streaming video or using other high-bandwidth applications such as video
chat—by moving to a higher speed tier.
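The diminishing returns described above follow from a simple back-of-the-envelope model: page load time is roughly a fixed per-page overhead (DNS lookup, connection setup, server response, rendering) plus the transfer time for the page payload. A sketch under assumed, illustrative numbers, not the study's measured values:

```python
def load_time_ms(speed_mbps, page_kb=750, overhead_ms=500):
    """Rough model of web page load time: fixed overhead plus payload
    transfer time. page_kb and overhead_ms are illustrative assumptions."""
    transfer_ms = (page_kb * 8) / (speed_mbps * 1000) * 1000  # kb / kbps -> s -> ms
    return overhead_ms + transfer_ms

for mbps in [1, 5, 10, 20, 35]:
    print(f"{mbps:>2} Mbps: {load_time_ms(mbps):.0f} ms")
```

With these assumed numbers, moving from 1 Mbps to 10 Mbps saves roughly 5.4 seconds per page, while moving from 10 Mbps to 35 Mbps saves less than half a second: the fixed overhead dominates, mirroring the flattening the Report observes beyond 10 Mbps.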
[Chart 10 image omitted. Vertical axis: average web page loading time in milliseconds, 0 to 9,000. Horizontal axis: Advertised Speed (Mbps), with tiers from 0.77 to 35 Mbps. One series per ISP, including Verizon (FiOS).]
Chart 11 shows that performance during the day is not consistent for most technologies.
During idle periods there is more capacity available for the consumer, while at peak periods,
with many consumers online, available capacity per consumer diminishes. As shown in Chart
12, on average DSL and cable provide similar performance over this period, but results differ
significantly among different ISPs.
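The time-of-day comparisons in Charts 11 and 12 amount to bucketing each measurement by its local hour and averaging the actual-to-advertised ratio within each bucket. A minimal sketch; the two-hour bins mirror the charts' axes, and the sample data are invented:

```python
from collections import defaultdict

def hourly_ratio(measurements, bin_hours=2):
    """Average actual/advertised ratio per time-of-day bin.
    measurements: list of (hour_of_day, actual_mbps, advertised_mbps)."""
    bins = defaultdict(list)
    for hour, actual, advertised in measurements:
        bins[(hour // bin_hours) * bin_hours].append(actual / advertised)
    return {start: sum(r) / len(r) for start, r in sorted(bins.items())}

# Invented samples for a 10 Mbps tier, slower in the 8-10 PM peak.
data = [(3, 10.5, 10), (9, 10.2, 10), (15, 9.8, 10), (20, 8.4, 10), (21, 8.6, 10)]
print(hourly_ratio(data))
```

Plotting these per-bin averages over the 24-hour day yields curves of the kind shown in Charts 11 and 12.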
Chart 11: Average sustained download speeds as a percentage of advertised over a 24-hour
period, by provider
[Chart 11 image omitted. Vertical axis: Actual Download / Advertised Download Speed (%), 50% to 120%. Horizontal axis: time of day, in two-hour intervals across a 24-hour period.]
Chart 12: Hourly average sustained download speeds as a percentage of advertised speed, by
technology
[Chart 12 image omitted. Vertical axis: Actual / Advertised (%), 80% to 120%. Series: Fiber, Cable, DSL. Horizontal axis: time of day, in two-hour intervals across a 24-hour period.]
The table below lists the advertised speed tiers included in this study, and compares this with
the actual average peak performance results from March 2011. As before, we note that the
actual sustained download speeds here were based on national averages, and should not be
taken to represent the performance experienced by any one consumer in any specific market
for these ISPs.
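The ratio reported in the table below is straightforward: pool the sustained measurements for a tier nationally, average them, and divide by the advertised rate. A sketch with invented provider names and numbers, illustrative only:

```python
def tier_summary(tiers):
    """tiers: mapping of (provider, advertised_mbps) -> list of sustained
    throughput measurements in Mbps, pooled nationally. Returns rows of
    (provider, advertised, actual_avg, actual as % of advertised)."""
    rows = []
    for (provider, advertised), samples in tiers.items():
        actual = sum(samples) / len(samples)
        rows.append((provider, advertised, actual, 100 * actual / advertised))
    return rows

# Invented example: a 10 Mbps tier delivering 85% of its advertised rate.
rows = tier_summary({("ExampleISP", 10): [8.2, 8.6, 8.7]})
provider, advertised, actual, pct = rows[0]
print(f"{provider} {advertised} Mbps tier: {actual:.1f} Mbps ({pct:.0f}%)")
```

Because the samples are pooled nationally, the resulting figure is an average across markets, which is exactly why it should not be read as any one consumer's expected performance.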
[Table omitted. Columns: Advertised Speed Tier (Mbps); Provider; Actual Sustained Speed (Mbps); Actual Sustained Speed / Advertised Speed Tier.]
Acknowledgements
This Report benefited from the voluntary participation of a number of parties. The
contribution of their expertise to the development of the methodologies employed in this
Report materially increased its quality. We would like to extend our thanks to the
following entities:
· Adtran
· AT&T
· Cablevision Systems Corporation
· CenturyLink
· Charter Communications
· Comcast
· Corning
· Cox Communications
· Frontier Communications Company
· Georgia Institute of Technology
· Insight Communications
· Intel
· Mediacom Communications Corporation
· Massachusetts Institute of Technology
· M-Lab
· National Cable & Telecommunications Association
· New America Foundation
· Qwest Communications
· Time Warner Cable
· US Telecom Association
· Verizon
· Windstream Communications
Finally, we would like to thank SamKnows for their extraordinary performance during
this endeavor. Their experience and hard work on this project were critical to its success.
ENDNOTES
http://www.broadband.gov/plan/ (“NBP”).
4 See
http://www.broadband.gov/qualitytest/?TB_iframe=true&height=500&width=4
70&qt=true (last accessed July 8, 2011).
5 The Consumer Broadband Test allows consumers to choose between testing
performed by one of two entities: Ookla and M-Lab. However, these tests use
different upload and download criteria, and so can return different results for the
same broadband connection.
6 Consumer Information and Disclosure; Truth-in-Billing and Billing Format; IP-Enabled
FiOS. Other services use fiber optic technology and may be marketed as “fiber.”
See, e.g., note 36 infra.
10 Participating ISPs were: AT&T (DSL); Cablevision (cable); CenturyLink (DSL);
Charter (cable); Comcast (cable); Cox (cable); Frontier (DSL); Mediacom (cable);
Insight (cable); Qwest (DSL); TimeWarner (cable); Verizon (DSL and fiber-to-the-
home); and Windstream (DSL).
11 As described more fully in the Appendix, this study allowed for a target
deployment in up to 10,000 homes across the United States, and the final volunteer
pool was created from over 75,000 initial volunteer broadband subscribers.
Although test results were taken from 7,500 households over the course of the
study, the results that are analyzed in this Report reflect broadband performance
to 6,800 homes during the month of March 2011.
12 In addition to the various data sets, the actual software code that was used for
the testing will be made available for academic and other researchers for non-
commercial purposes. See infra note 43.
13 ISPs typically quote speeds or information rates in units of Megabits (millions of
bits) per second, known as Mbps. A bit is the basic unit of information in
computing.
14 Sustained speeds are described in the Appendix and are averaged over five
second intervals across the high and low rates that might dynamically occur in
very short time interval measurements.
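An averaging scheme of the kind described in note 14 can be sketched as follows. The function and trace are illustrative assumptions; the study's actual test software differs:

```python
def five_second_averages(per_second_mbps, window=5):
    """Collapse a per-second throughput trace into non-overlapping
    five-second averages, smoothing out the instantaneous highs and
    lows that occur in very short measurement intervals."""
    return [
        sum(per_second_mbps[i:i + window]) / window
        for i in range(0, len(per_second_mbps) - window + 1, window)
    ]

# Invented trace: noisy per-second samples around a 10 Mbps sustained rate.
trace = [9.0, 11.0, 10.0, 10.5, 9.5, 10.0, 10.0, 9.0, 11.0, 10.0]
print(five_second_averages(trace))  # [10.0, 10.0]
```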
15 Those earlier reports of broadband speeds in the United States and the United
somewhat slower and more subject to latency than cable and fiber-to-the-home,
DSL is generally less expensive than either of the other technologies discussed in this
Report, which could be a considerable benefit to some consumers, and a significant
factor in their choice of broadband provider.
18 A 24-hour average was computed each day and then averaged over Monday
through Sunday.
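The two-stage average described in note 18 (a 24-hour average within each day, then an average of the seven daily figures) can be sketched as follows, with invented data:

```python
def weekly_average(daily_samples):
    """daily_samples: seven lists of measurements, one per day Monday
    through Sunday. Average within each day first, then average the
    seven daily means, so days with more samples do not dominate."""
    daily_means = [sum(day) / len(day) for day in daily_samples]
    return sum(daily_means) / len(daily_means)

# Invented week of throughput samples (Mbps), uneven counts per day.
week = [[9.0, 10.0], [10.0], [10.5, 9.5, 10.0], [10.0],
        [9.0, 11.0], [10.0], [10.5, 9.5]]
print(f"{weekly_average(week):.2f} Mbps")
```

Averaging per day first is a deliberate choice: a flat average over all raw samples would weight Wednesday (three samples) more heavily than Tuesday (one).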
19 In this context, the closest server is the measurement server providing minimum
round-trip time.
20 For example, downloading a large file while browsing the web would limit the
effectiveness of PowerBoost.
21 ISPs typically advertise a smaller number of speed tiers but must support legacy
tiers—tiers promoted at one time but no longer offered for new subscription—until
they are migrated to higher speeds. During deliberations with ISPs for this trial,
some noted that they maintain a larger number of service tiers than they currently
promote and advertise and that they may support as many as ten service tiers at a
given time.
22 This was a result of the limited number of white boxes—approximately 9,000—
that could be deployed over the course of the project. Region-specific data would
have required an order of magnitude or greater deployment of equipment, at a
corresponding increase in cost.
Lab, ADTRAN, Corning, Fiber to the Home Council, Georgia Tech, Intel, MIT,
Motorola, National Cable Television Association, the New America Foundation,
and the US Telecom Association.
25 An initial goal for the project included measurements for satellite and fixed
categorized in the “Form 477” reports that the Commission uses as its primary tool
for collecting data about broadband networks and services. See Modernizing the
FCC Form 477 Data Program, Notice of Proposed Rulemaking, 26 FCC Rcd 1508,
1512 n.27 (2011), citing Development of Nationwide Broadband Data to Evaluate
Reasonable and Timely Deployment of Advanced Services to All Americans, Improvement
of Wireless Broadband Subscribership Data, and Development of Data on Interconnected
Voice over Internet Protocol (VoIP) Subscribership, Report and Order and Further
Notice of Proposed Rulemaking, 23 FCC Rcd 9691, 9700-01 (2008).
27 M-Lab is a non-profit corporation supporting research on broadband networks
data communications from the central office, cable headend, or wireless switching
station to an Internet point of presence. In contrast, “last mile” refers to the
connection between a consumer’s home and that consumer’s broadband service
provider’s first aggregation point.
29 A simple view of an ISP’s network from the consumer perspective is that it
tool used to measure the latency. The measurement methodology used in this
Report differs slightly from that tool, but the results should be essentially the same.
32 See International Telecommunication Union (ITU), Series G: Transmission Systems
and Media, Digital Systems and Networks; International Telephone Connections and
Circuits—General Recommendations on the Transmission Quality for an Entire
International Telephone Connection, G.114 (May 2003).
33 The March 2011 data set was validated to remove anomalies which would have
produced errors in the Report. This data validation process is described in the
Appendix.
34 For a discussion of how averages were calculated for the purposes of this Report,
Qwest. These two entities completed a merger on April 1, 2011; however, during
the testing in March 2011, they were separate companies.
36 U-Verse is a service mark offering of AT&T supporting a bundled service
unweighted arithmetic averages of the relevant data sets. However, the sample
plan was based on market share data for all ISPs. Comparison of unweighted
averages with averages weighted by market share showed close agreement.
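The comparison described in note 37 amounts to computing an unweighted arithmetic mean across ISPs alongside a mean weighted by each ISP's market share. A sketch with invented shares and ratios:

```python
def unweighted_mean(ratios):
    """Plain arithmetic mean of per-ISP actual/advertised ratios."""
    return sum(ratios.values()) / len(ratios)

def weighted_mean(ratios, shares):
    """Mean of per-ISP ratios weighted by market share; shares need
    not sum to 1 because they are normalized here."""
    total = sum(shares[isp] for isp in ratios)
    return sum(ratios[isp] * shares[isp] for isp in ratios) / total

# Invented figures: three ISPs with differing shares and ratios.
ratios = {"ISP-A": 0.92, "ISP-B": 0.85, "ISP-C": 0.99}
shares = {"ISP-A": 0.50, "ISP-B": 0.30, "ISP-C": 0.20}
print(f"unweighted: {unweighted_mean(ratios):.3f}")
print(f"weighted:   {weighted_mean(ratios, shares):.3f}")
```

When the two figures agree closely, as the note reports they did, the unweighted average is a reasonable summary despite unequal market shares.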
38 Only 10 out of 53 service tiers tested in this study returned less than 80 percent of
provide results for individual ISPs. This is a result of filtering for low sample
counts. While a 50 Mbps service tier is offered by some of the ISPs included in the
study, our survey did not obtain enough samples to include this service tier in
results for individual ISPs. However, when aggregated by technology, the number
of independent samples for 50 Mbps exceeded our threshold criteria for the study.
40 We provide latency figures for peak periods. As noted earlier, latency during
peak periods was seen to increase by about 7.6% across all technologies. Latencies
america/statistical-averages.xls.
42 For a definition of web loading time, see Appendix at 24-25.
43 To apply for non-commercial review of the code, interested parties may contact