
VIAVI Solutions

White Paper

Benchmarking O-RAN
Performance

Introduction to O-RAN
2020 was a good year for O-RAN. We saw a number of collaborations between telcos and vendors to develop
O-RAN technologies in 4G and 5G. The O-RAN ALLIANCE (in which VIAVI plays an active role) and the Open
RAN Policy Coalition both welcomed new members. The latter also rounded off the year with a roadmap
‘outlining the steps that governments and policymakers can take to accelerate adoption of open standards in
the RAN and promote a network ecosystem that is more diverse, competitive and secure’.

We at VIAVI also made great strides. We took part in the Joint European O-RAN and TIP Plugfest 2020, where
together with EANTC, we demonstrated automated end-to-end (E2E) O-RAN tests at the Open Test and
Integration Center (OTIC). This included testing multi-vendor pairings.

Also in 2020, a number of major vendors came out and publicly backed O-RAN, something that would have
been unheard of a few years ago. O-RAN may require an adaptation of vendors’ traditional business models,
but it won’t impact the fundamental relationship between vendors and CSPs. Vendors have always leveraged
the very tight integration of network components to deliver high performance. Now, O-RAN is creating
new paths for mixing and matching best-in-breed components to achieve each CSP’s network objectives and
ensure that traditional vendors remain a hugely valuable element of the ecosystem. It’s clear that the whole
telco industry has now begun to acknowledge the benefits of O-RAN.

These include enabling an open, multi-vendor, interoperable ecosystem and new opportunities for innovation;
helping to drive down the cost of RAN equipment; enabling automation, which can reduce deployment and
management costs; and providing scale and agility, where network components implemented as software functions can
be scaled to meet network capability and capacity demands.

Finally, 2020 saw the rapid acceleration of the digitalization of all industries on a global scale – something
which will continue this year as the world settles into a new(ish) and unpredictable ‘normal’. There remains a
real need for organizations to be able to work together virtually, highlighting our continued reliance on digital
services. This includes the need for reliable and efficient mobile wireless networks. They must support an
increasingly diverse range of use cases, including those associated with critical communications infrastructure.
O-RAN may be capable of delivering all of this, but we’re not quite there yet. Given its novelty, and given
that multiple vendors are involved, testing and maintaining the resilience and performance of the entire network
chain is crucial.

O-RAN made substantial progress in 2020, and during 2021 has the potential to become a mainstream technology.
There’s already been a lot of positive chatter in the media and the telecoms community. In the first weeks of
January, for example, Deutsche Telekom, Orange, Telefonica, and Vodafone signed a memorandum of understanding
outlining their commitment to create a framework for the creation of a European Open RAN ecosystem.

The move to O-RAN now seems inevitable, meaning a ‘do or die’ mentality is gaining momentum across the
industry. Business models will be re-defined and new opportunities will be unlocked – for CSPs and vendors. This
year, we’ll see O-RAN network developments and rollouts as well as a continuation of the great work achieved at
last year’s Plugfest.

VIAVI has been enabling NEMs and operators to test and validate new O-RAN equipment and networks, meeting
varied customer requirements. Our leading position in testing this new RAN architecture has been achieved
thanks to our history and expertise working with NEMs and operators, and in defining benchmarks for how high-
performance networks must operate. There is much scope to re-use benchmarking tests developed for 5G and 4G
RAN that would benefit O-RAN benchmarking scenarios that involve 4G-5G interactions.

Benchmarking, and learning from experience


NEMs have spent years (and a lot of money) developing traditional RAN infrastructure. A large part of their
business has focused on developing base stations in which all components – the CU, DU, etc. – are in the same
piece of hardware. Applications and use cases were broadly similar, so testing was done against the benchmark of
these traditional pieces of kit.

Now, the situation is changing. The market is opening up, architectures are being virtualized, and there’s an ongoing
need to incorporate new 3GPP features. It is understandable that many NEMs will find it challenging to transition
to this new and more open ecosystem.

Disaggregation and opening the market up to more vendors requires integrating elements and testing with
different players in the ecosystem. This is very different from the traditional RAN testing scenario, which involved
just one vendor that could quickly and easily run test cases through a carefully managed software delivery process.

With network components disaggregated and connected with open interfaces, it can now be a lot more difficult to
get everything tested and working together, end-to-end. The number of players involved in the process creates a
further difficulty: that of accountability. If an issue is identified during testing, whose fault is it? Playing the blame
game will not advance O-RAN and virtualization – quite the opposite.

It is therefore key for systems integrators (SIs), and all others involved in end-to-end testing, to provide visibility
into the disaggregated networks and devise means of resolving issues. This will help ensure performance KPIs are
met, and that individual network elements meet the same standard or better than pre-O-RAN architectures.



Understanding the complexity of O-RAN networks
As mobile data traffic continues to grow, multiple points of aggregation including the O-CU, O-DU and
cascaded O-RU performance need to be validated. This is critical for operators to meet 5G service promises.

[Figure: O-RAN network architecture and the VIAVI test portfolio – functional, performance and capacity testing,
troubleshooting and assurance across fronthaul (Option 7.2x, O-RAN/eCPRI), midhaul (F1) and backhaul (Xn, X2,
S1, N2) connecting the 5G O-RU, O-DU and O-CU to the EPC/NGC core, with the SMO/RIC via the O1, O2 and E2
interfaces. Lab: TM500 and TeraVM family subsystem emulators (end-user, O-RU, O-DU, O-CU, RIC, RAN, core and
application emulation). Field: 3Z RF Vision Antenna Motion Processor (AMP), CellAdvisor 5G, T-BERD/MTS-5800,
OneAdvisor-800. Assurance: NITRO Mobile, NITRO Transport, Fusion.]

According to O-RAN specifications, the gNodeB is disaggregated into three distinct and standardized elements –
the O-RU, O-DU and O-CU – from potentially different manufacturers, with the F1 interface between the O-DU
and O-CU. Service providers and NEMs will need to validate that these components work together and meet
both O-RAN and 5G performance requirements, so interoperability testing needs to be an essential part of any
test strategy.

It’s one thing to ensure network interoperability, yet it’s another to obtain the O-DU performance and guarantee
throughput, latency and quality measures at the transport layer and the application layer, at scale.

For this, using pre-O-RAN testing as a reference point – a benchmark – can help. When we test O-RAN
components end-to-end, it is possible to prove that the throughput, latency, and other KPIs are the same as, or
better than, conventional aggregated RAN performance. With over 20 years of working with all the NEMs,
VIAVI has accumulated massive experience in this area. There’s no need to totally reinvent the wheel: instead,
we can re-use test cases, meaning new entrants don’t have to start from scratch, and they can learn from and
leverage established benchmarks.
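As a rough illustration, the end-to-end comparison against a pre-O-RAN baseline can be expressed as a simple pass/fail check per KPI. The KPI names, baseline values, and pass criteria below are illustrative assumptions, not VIAVI product APIs:

```python
# Hypothetical sketch: compare measured end-to-end O-RAN KPIs against a
# pre-O-RAN (aggregated RAN) baseline. All names and values are illustrative.

# Baseline KPIs recorded from a traditional RAN test campaign.
baseline = {"dl_throughput_mbps": 950.0, "ul_throughput_mbps": 110.0,
            "latency_ms": 8.0}

# Whether a higher or lower value is better, per KPI.
higher_is_better = {"dl_throughput_mbps": True, "ul_throughput_mbps": True,
                    "latency_ms": False}

def benchmark(measured, baseline, higher_is_better):
    """Return per-KPI pass/fail: O-RAN must match or beat the baseline."""
    results = {}
    for kpi, ref in baseline.items():
        value = measured[kpi]
        results[kpi] = value >= ref if higher_is_better[kpi] else value <= ref
    return results

measured = {"dl_throughput_mbps": 960.0, "ul_throughput_mbps": 112.0,
            "latency_ms": 8.5}
print(benchmark(measured, baseline, higher_is_better))
# latency_ms fails here: 8.5 ms is worse than the 8.0 ms baseline
```

A real campaign would of course feed this from measured traces rather than literals, but the "same or better" criterion is the essence of the benchmark.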

Since the O-RU is a network entity, it is necessary to ensure that performance is acceptable at both the
transport and application layers. RF performance and interoperability performance with the O-DU should
remain aligned with the cost and performance requirements of the network. Synchronization and timing issues
at the O-RAN fronthaul layer should not affect any applications, especially latency-sensitive applications such
as voice over NR. Again, it’s understandable that NEMs may be very sensitive about performance as they have
invested much in optimizing the RU performance and this would serve as the baseline for evaluating the O-RU.

Another aspect that NEMs and operators must consider is conformance testing. This involves performing
specific tests according to O-RAN specifications, e.g. conformance testing of the O-RU at the transport layer.
However, conformance testing alone is not enough. Think of it like testing white goods to check that they
conform to industry standards. You do the necessary checks, the product passes, and you stamp it with a ‘CE’
mark. All good? Not quite. You then try to switch it on and, although it meets the required specs, the product
doesn’t actually work.

It’s a similar story with conformance testing. Infrastructure might meet certain standards when elements are
tested independently, but when it comes to interoperability, there’s no way of knowing whether what you’re
testing will function as part of an end-to-end architecture, alongside other elements. Different NEMs and
operators will have different requirements in terms of the functionality of the O-DU, e.g. different frequency
bands, MIMO schemes, etc., so infrastructure must be tested to ensure interoperability with these requirements.

While conformance testing is important, NEMs and operators must go beyond that. Equipment needs to be
tested end-to-end, plugged into different O-DUs, and with different vendor equipment (and different vendor
requirements).

For NEMs, the greater scope of components means that it is important to select scalable test tools that can
support not just conformance testing but can also be upgraded to run performance tests at the higher layers,
e.g. the 5G stack. This will support a range of real-world test scenarios including large numbers of UEs, mobility,
mixed traffic types and maximum data rates.

Testing different customer requirements


Another objective of benchmarking is evaluating the sheer number of combinations and permutations of
different customer requirements. These include testing at different frequency bands and MIMO schemes to
ensure that the performance from a throughput perspective is the same or better than pre-O-RAN. Performance
compromises can affect the user QoS if the throughput is not what is expected. When you add mobility, things
get even more complex as you start scaling the number of users in a system.

End-to-end benchmark testing at scale ensures the overall throughput for different capacity and mobility use
cases is as good or better than with traditional RAN.
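As a sketch of the scale involved, these combinations can be enumerated as a test matrix. The band names, MIMO schemes, mobility profiles, and UE counts below are example values, not a definitive test plan:

```python
# Illustrative sketch: enumerate the combinations of customer requirements
# that a benchmark campaign must cover. All values are examples.
from itertools import product

bands = ["n78", "n41", "n258"]
mimo_schemes = ["2x2", "4x4"]
mobility = ["stationary", "pedestrian", "vehicular"]
ue_counts = [1, 64, 512]

test_matrix = list(product(bands, mimo_schemes, mobility, ue_counts))
print(len(test_matrix))  # 3 * 2 * 3 * 3 = 54 benchmark scenarios
for band, mimo, mob, ues in test_matrix[:2]:
    print(f"band={band} mimo={mimo} mobility={mob} ues={ues}")
```

Even this small example yields 54 scenarios; adding legacy interactions and traffic mixes multiplies it further, which is why automated, re-usable test cases matter.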

Interaction with legacy technologies


The benchmark testing should include interactions with legacy technologies such as 4G/4.5G, which will be a
reality when users move into a poor 5G coverage area where 4G is stronger. Adding new 4G UE types to this
mix, and scaling it up, ensures that 5G users get the best QoS and that there is no major impact on quality
of service for 4G users.

To get this efficiency at scale – and to benefit from reduced costs – it’s key that SIs and others involved
in doing end-to-end testing, can identify the cause of issues resulting from software updates in any of
the network elements. Upgrading the eNB and gNB software and providing bug fixes present operational
challenges – especially in such a dynamic environment with multiple vendors at play. As such, it is essential for
operators to be confident that they’re continuing to deliver the best QoS for their subscribers.

Network disaggregation
O-RAN is introducing standardized open interfaces, enabling disaggregated network elements and the use of
COTS hardware. As a result, you may have to benchmark and test each individual component independently.
The benefit of doing this individual element testing properly is that it ensures that when everything is joined
up, everything works without compromising the performance. The alternative is discovering issues only later on,
when they are more difficult to isolate, and potentially more expensive to address.

Latency is one such example. It may be quite tedious to pinpoint latency issues originating in the O-CU, and
these may only become apparent when you ramp up the number of C-plane or U-plane messages. You may also
find that for a 4x4 MIMO scheme, the throughput appears similar to a 2x2 MIMO scheme because of an increased
number of retransmissions, which may not be straightforward to isolate.
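A back-of-the-envelope sketch shows how retransmissions can erase the expected 4x4 gain; the peak rates and retransmission rates here are purely illustrative, not measurements:

```python
# Sketch of why 4x4 MIMO can look like 2x2: HARQ retransmissions
# eat the spatial-multiplexing gain. All figures are illustrative.

def effective_throughput(peak_mbps, retransmission_rate):
    """Goodput after discounting retransmitted transport blocks."""
    return peak_mbps * (1.0 - retransmission_rate)

peak_2x2 = 500.0    # illustrative peak rate for 2x2
peak_4x4 = 1000.0   # illustrative peak rate for 4x4

# A clean 2x2 link vs a 4x4 link suffering heavy retransmissions:
print(effective_throughput(peak_2x2, 0.02))  # a little under 500 Mbps
print(effective_throughput(peak_4x4, 0.51))  # about the same as 2x2
```

The point is that an end-to-end throughput number alone cannot distinguish the two cases; the retransmission counters must be examined to isolate the cause.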
Preserving or improving on KPIs
Since the network elements are from different vendors, it is important to ensure KPIs are preserved at both an
element level and performance level when they all interoperate. The Open Test and Integration Center (OTIC)
labs will help this cause, but what’s really needed is experience in testing traditional base stations, as many of
the performance measures will still apply and therefore serve as important benchmarks. The test cases should be
designed using current KPI measures as a reference. Again, this is hardly re-inventing the wheel; instead it ensures
efficiency in re-using existing test cases particularly for end-to-end testing.

Supporting vendors with a narrower scope of experience


O-RAN testing may present challenges which are more specific to new providers. Although many will be coming
from adjacent spaces such as DAS and small cell, the broader set of network use cases and expected KPIs involved in
this new market may still cause initial difficulties. New players may also have a narrower scope of expertise.

However, as they will be working together with other vendors, it’s unlikely that new entrants will be at a significant
disadvantage. In addition – and conversely – they may also bring with them new, innovative and more efficient
software and hardware architectures that promote greater flexibility and address the issue of cost and scale. Ensuring
that these new innovations do not negatively impact key KPIs is key to successful benchmarking.

To help O-RAN equipment interoperability, the OTIC labs need to validate interactions between disaggregated
5G access infrastructure providers. However, it’s not just a case of testing the interoperability of distinct vendor
technology as part of an O-RAN set-up. It is in the best interest of operators to see how this technology interacts
with legacy 4G equipment in the network, and how it responds to different UE environments e.g. handover between
4G and 5G.

Operators must also be able to identify, isolate and resolve network performance issues before network deployment.
They need a test portfolio that offers emulation of many components of the network including UE, RAN and Core,
and wrap-around of major functions over multiple interfaces. There are a number of ways to test in the lab, including:

• Multi-vendor interoperability testing, which ensures that the mix of components will work functionally, and tests
for performance, resilience and stability. When new features are added, benchmarking ensures that, although they
may require more processing power, the benchmarking KPI measures are preserved or improved.

• Subsystem (wrap-around) testing: this could be highly valuable for specialist vendors that are targeting a niche
market for their offering but don’t have end-to-end capabilities in-house.

• Holistic evaluation of multiple RAN deployment options (RAN disaggregation topologies for eMBB, URLLC, etc.).

• Performance monitoring of open interfaces and protocols to ensure optimum operation.

• Benchmarking to ensure consistent performance at the transport and application layers. Monitoring allows
you to see where the peaks are, where there are bottlenecks, and particularly where the KPIs being measured are
getting very close to their thresholds. This can inform where to do optimizations, e.g. on the user plane
or control plane.
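The threshold-proximity idea in the last point can be sketched as a simple monitor; the KPI names, limits, and 10% margin are illustrative assumptions, not values from any specification:

```python
# Minimal sketch of threshold-proximity monitoring: flag KPIs whose measured
# value is within a margin of their limit, so optimization effort can be aimed
# at the user plane or control plane before a hard breach occurs.
# KPI names, limits, and the 10% margin are illustrative.

def near_threshold(samples, limits, margin=0.10):
    """Return KPIs whose value is within `margin` of the limit (or past it)."""
    flagged = []
    for kpi, value in samples.items():
        if value >= limits[kpi] * (1.0 - margin):
            flagged.append(kpi)
    return flagged

limits = {"fronthaul_latency_us": 100.0, "cpu_load_pct": 80.0}
samples = {"fronthaul_latency_us": 95.0, "cpu_load_pct": 40.0}
print(near_threshold(samples, limits))  # ['fronthaul_latency_us']
```

In practice the samples would stream from the monitoring system, but the principle is the same: act on KPIs approaching their limits, not only on outright failures.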



TeraVM and TM500 for O-RAN testing
The TeraVM O-CU Test DU Sim Generator is compliant with the 3GPP F1 application protocol and capable of
emulating hundreds of Gbps, thousands of DUs, and millions of devices for meaningful functional and load testing
of the O-CU. Based on one of the first mobile network test platforms to harness the benefits of virtualization, the
F1 Load Generator is a software-based test tool housed on x86 hardware. To increase flexibility and cover a wider
set of customer use cases, additional optional elements of the test suite are available, including 5G Standalone/
Non-Standalone Core Emulator for use cases where a real Core Network is absent, or X2 vRAN for 5G NSA test
use cases.

TeraVM enables NEMs and service providers to efficiently test mobile RAN and core elements, validating that the
equipment works according to O-RAN and 3GPP standards, interoperates with other 5G elements, and performs
optimally when fully loaded with complex mobile traffic profiles. TeraVM is part of the VIAVI Lab To Field network
testing and assurance portfolio as well as its Test Suite for O-RAN Specifications.



The VIAVI TM500 enables operators to test O-RAN network configurations in a lab environment before rolling
them out in a live network. Importantly, operators can stress-test new O-RAN technology alongside legacy 4G
technology, helping them understand how the introduction of the technology will impact the entire network.
This can directly help support the aims of OTIC and help ensure that new O-RAN compliant infrastructure can be
effectively deployed into networks and inter-operate with legacy equipment.

In order to instil that all-important confidence with operators, VIAVI works to simplify the complexity of testing
different O-RAN interfaces across the network. Our end-to-end solutions span the full lifecycle and include fully
virtualized first-to-market NFV tools that can run in the cloud or on commercial off-the-shelf hardware. We also
provide core test emulation, field-portable instruments, and assurance products that give operators visibility
once their network is live and starting to scale.

VIAVI plays an active role in contributing to the development of O-RAN standards. We were also part of the first
global Plugfest, an event conducted to foster adoption of open and interoperable 5G and 4G RAN. We participate
in a number of standards bodies and workgroups which are setting the agenda for the future of O-RAN, including
ITU-T, 3GPP, ONAP, and IEEE. Finally, we’re committed and will continue to invest in multi-vendor interoperability
test efforts, as part of O-RAN Alliance-led initiatives.

Testing – an ongoing process


O-RAN was a major theme for the telecoms sector in 2020, and momentum has continued into 2021. It’ll be
important to remember this year that O-RAN is not a single, fixed architecture, and options for deployment will
differ depending on operators’ strategies, business objectives, the services they want to deploy, the technology
they’re looking at, and their level of maturity.

A better approach would be to view O-RAN as a cultural change and a movement toward openness. We’re only at
the early stages of this evolution, and technology, attitudes and approaches are still to mature. In order to continue
on the O-RAN trajectory – and for operators to realize all those hoped-for benefits – confidence, commitment and
cooperation from and between operators is needed.

Cooperation is a crucial element here. Testing will involve integrating different components from different vendors.
Compatibility, as well as consistency and quality of service, must be assured, and while doing so we must avoid
slipping into a ‘blame game’ culture. Plugfest, for example, was a great example of this. The event, which took
place in December and in which VIAVI played a key role, showcased “the power of the community to document and
develop truly open interface specifications with the potential to accelerate new 5G advanced services that will be
multi-operator, multi-vendor, and create opportunities for the entire ecosystem” (Andre Fuetsch, Chairman of the
O-RAN Alliance and Chief Technology Officer of AT&T).

O-RAN development and testing is an ongoing process that will change and develop in line with new features.
VIAVI has a history of supporting NEM and operator partners through decades of network and telecoms evolutions.
By collaborating with leading O-RAN authorities and pioneering the most advanced test equipment, we’re well
suited to support operators in getting the most out of this next phase of network evolution.

Contact Us +1 844 GO VIAVI © 2021 VIAVI Solutions Inc.


(+1 844 468 4284) Product specifications and descriptions in this
document are subject to change without notice.
To reach the VIAVI office nearest you, benchmarkoran-wp-wir-nse-ae
visit viavisolutions.com/contact 30192993 900 0321

viavisolutions.com
