The Metaverse and Web3
Looking past the hype and critique, Web3 and the metaverse are shaping a new application layer for the
internet. How can leaders better understand this evolution and what it means for businesses,
organizations, and society?
Moving further into a world that blends the physical and digital may require greater
integration, more modern standards and protocols, and capabilities that give people
more control of their digital selves—their identity and representation, what they
own, and who has access to the data they create. Just as we use standardized
protocols and devices to interact with digital experiences on the internet, the
promises of Web3 could help enable consistency and interoperability across
metaverse experiences—uniting disparate, disconnected metaverses into a single
coherent platform. Similar to the web, there can be walled gardens and open
commons and any number of creative innovations, accessed by browsers, mobile
apps, AR glasses, VR headsets, and more.
Amidst the hype and criticism of Web3 and the metaverse, leaders should work to cut
through the noise by exploring and experimenting with the underlying solutions. If
they wait too long, the landscape could shift underneath them. With a new wave of
metaverse and Web3-native disruptors, the future of the internet could be shaped by a
grassroots movement that directly challenges Web2-era business models. As
Deloitte’s Eamonn Kelly and Jason Girzadas have shown, technologies like Web3 and
the metaverse are poised to drive revolutionary advancement and technological
breakthroughs which could shape new forms of communication, innovation,
prototyping, and community formation – and new business opportunities.2 The
metaverse and Web3 are powerful ‘winds of change’ that will likely propel us into
new and challenging future ‘worlds’ that business leaders will navigate. To reap the
benefits of this new internet platform, businesses should collaborate to build its
foundation through metaverse and Web3 initiatives.
The metaverse is emerging from our
digital behaviors
Today, people often socialize through global networks, modifying their appearance
with augmented reality filters that can see and respond to faces, wrapping themselves
in avatars, buying virtual clothes, and attending concerts in massive game
worlds.3 Businesses hold face-to-face meetings remotely and virtually, and can don
VR and AR headsets to train, visualize, and collaborate. Generation Z ranks playing
video games as their favorite form of media and entertainment.4 Even before the
nonfungible token (NFT) boom, some game worlds were running out of virtual real
estate to meet the demand of players.5 Remarkably, one poll found that over half of
people in nine markets prefer to spend their time online rather than in the real
world.6 Several generations have become accustomed to digital interfaces, software
and interactivity, and global connectivity. We immerse ourselves in the digital and
increasingly draw it out across the physical world. Amidst the hype and critique, it is
these behaviors and uses that are giving rise to Web3 and the metaverse.
Many technology, media, and telecom (TMT) companies have enabled and responded
to these accelerating behaviors, laying the foundation for an interoperable metaverse.
Telecommunications providers have extended advanced connectivity into businesses,
households, and hands, enabling more interaction, immersion, and collaboration.
Technology providers have established hyperscale platforms that have greatly
empowered innovation and operations, while delivering new generations of hardware
that continue to support and amplify larger and more complex tasks. Media and
entertainment companies have leveraged technology and telecommunications to
advance content and storytelling, delivering richer, more interactive, and more social
experiences to audiences around the world.
Many of these capabilities need more room to grow, could be rearchitected to better
meet our evolving uses and enable next-generation capabilities, and require
addressing the side-effects that have emerged with scale. There is a desire to enable
much larger shared immersive experiences; to enable an economy of digital goods and
ownership; and to establish digital identity solutions that give more control to users. In
essence, Web3 and the metaverse could bring these capabilities to the internet.
Critically, this layer extends into the digital internet while enabling digital
information and content to exist in, and be drawn from, the physical world.7 And it is
being architected with learnings from 40 years of connectivity and digital interactions.
Enabling seamless movement across
the metaverse
One of the more common promises of the metaverse vision is the portability and
interoperability of identity, data, and digital assets. For example, if a consumer buys
an exclusive digital item for their avatar from one service, like a piece of virtual
clothing, they can go to a second service that will recognize their ownership of the
item and render it effectively. For the enterprise, a similar use case might involve
being able to invite users across a partner ecosystem into a shared immersive
collaboration. This might mean reviewing the 3D assembly of a prototype vehicle or
inspecting a digital twin of a factory for performance enhancements.
This capability may require new ways of organizing identity, ownership, and even
storage. Digital identity is often fragmented across services, with multiple logins and
passcodes. Likewise, digital assets such as personal avatars and virtual clothing are
designed to work in the service providing them, not for portability elsewhere.
Currently, only the service knows you “own” the asset, and only the service can load
and render that asset. This concentration of user identity, data, and ownership within a
given service is what is referred to as “centralization”.
Web3 can enable blockchain registries that bind a user’s identity to the things they
own. NFTs offer this ability.8 Instead of being scattered across services, identity could
become a persistent element of the blockchain internet: physically decentralized
across the many computers mirroring and managing the blockchain but logically
centralized as a registry.9 This would enable any service to read the “state” of the user
and the goods they own. Identity, data, avatars, and 3D objects could become portable
across services.
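To make the pattern concrete, the sketch below is a deliberately simplified model in Python, not an actual blockchain or NFT contract: a single ownership registry that any service can read, so a purchase made in one service is recognized in another. All identifiers, asset names, and service names are hypothetical.

```python
# A minimal conceptual sketch (plain Python, not a real blockchain): one shared
# registry of ownership that several services consult instead of keeping silos.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class OwnershipRegistry:
    """Logically centralized record of which identity owns which digital asset."""
    owners: Dict[str, str] = field(default_factory=dict)  # asset_id -> identity_id

    def register(self, asset_id: str, identity_id: str) -> None:
        self.owners[asset_id] = identity_id

    def owner_of(self, asset_id: str) -> Optional[str]:
        return self.owners.get(asset_id)


# Hypothetical example: two independent services read the same registry, so the
# user's purchase is recognized outside the service that sold it.
registry = OwnershipRegistry()
registry.register("virtual-sneakers-042", "did:example:alice")


def render_if_owned(service: str, asset_id: str, identity_id: str) -> str:
    if registry.owner_of(asset_id) == identity_id:
        return f"{service}: rendering {asset_id} for {identity_id}"
    return f"{service}: {identity_id} does not own {asset_id}"


print(render_if_owned("GameWorldA", "virtual-sneakers-042", "did:example:alice"))
print(render_if_owned("SocialSpaceB", "virtual-sneakers-042", "did:example:alice"))
```

In a Web3 implementation, the registry role would be played by an on-chain contract rather than an in-memory object, but the read pattern any service follows is the same.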
Interoperability of 3D goods is another challenge. There are many 3D modeling
solutions that use different formats with diverse ways of specifying objects and
materials and their behaviors.10 An object that works in one will not typically work in
another. While there are very successful game world marketplaces selling virtual
goods, for example, there is little to no interoperability between them. Enabling
consumer interoperability of digital goods will be critical for the evolution of retail in
the metaverse.11 Users that buy virtual branded sneakers, for example, will likely want
to wear them across different immersive experiences. Enabling this could also require
secure, peer-to-peer storage solutions.12 Although identity and records of ownership
can be secured on a blockchain, the objects themselves—those new virtual sneakers—
are still often stored in common databases.13
Enabling portability and interoperability in the metaverse will likely require
significant effort and collaboration among providers. There are technical challenges,
like increasing the rate of blockchain transactions and reducing energy
requirements.14 It may be harder to shift business models built on user data, but such
efforts could add value to users, their data, content, and digital goods by making them
more persistent and transferable across platforms. Leading businesses have been very
successful and may not see immediate incentives to share control of key assets like
identity and user data. Such considerations tug at the interplay between centralized
and decentralized solutions.
Shifting the balance of power:
Understanding centralization and
decentralization
Web2 has been marked by hyperscale platforms and two-sided marketplaces that have
aggregated very large numbers of users and built their businesses using identity and
user data. Web3 protocols were developed specifically to challenge this control.15
Increasingly, these winners may feel burdened by managing and securing identity and
data, contending with customers and adversarial third parties that have learned to
exploit their services, and reckoning with regulators concerned with market
dominance and consumer protection. A recent Deloitte study found that data and
security were the largest issues causing anxiety for technology executives in the
United States, and that many expect regulation to become much more disruptive over
the next three-to-five years.16 If business leaders are to guide the historic shift to
Web3 and the metaverse, they may need to shed some of their Web2 norms and
embrace more decentralization.
For the metaverse, decentralization refers to the premise that internet users should
control their identity, their data, their digital goods, and more, and that these can move
with them between services and experiences. With Web3, a user’s avatar and digital
goods, for example, can become more easily portable across different services. Or
users can specify different properties for different services. For example, one’s
personal avatar may be very different from one’s avatar for work—just as one’s dress
style may be. Users could use smart contracts that determine how third parties may or
may not interact with their data. In this model, businesses can still own and control
metaverse experiences, but they would negotiate for access to users.
This condition has some interesting nuances. Businesses may give up direct control of
user identity and data but could issue tokens that incentivize users, offer fractional
micropayments in exchange for contributions to the service, and give them a stake in
the success of the business. Users could be similarly incentivized to share their data
and accept advertising. Consumer and enterprise businesses could potentially
implement rules to enforce appearance and behaviors within shared immersive spaces:
a form of virtual-geofencing. Overall trust in digital systems could become much
more robust with decentralized identifiers (DID), peer-to-peer storage systems, and
secure and compliant third-party data trusts.17
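As a rough illustration of the user-controlled data access described above, the sketch below expresses a user-defined rule in plain Python standing in for what a smart contract or data trust might encode; the data categories, purposes, and payment flag are invented for illustration.

```python
# Hypothetical sketch: a user-defined access rule, standing in for a smart contract
# or data-trust policy. All categories and purposes are invented for illustration.
from dataclasses import dataclass
from typing import FrozenSet


@dataclass(frozen=True)
class AccessPolicy:
    data_category: str                  # e.g. "purchase_history"
    allowed_purposes: FrozenSet[str]    # purposes the user consents to
    requires_payment: bool              # whether the user expects a micropayment


def may_access(policy: AccessPolicy, purpose: str, offers_payment: bool) -> bool:
    """A third party may read the data only on the user's terms."""
    if purpose not in policy.allowed_purposes:
        return False
    if policy.requires_payment and not offers_payment:
        return False
    return True


policy = AccessPolicy("purchase_history", frozenset({"recommendations"}), True)
print(may_access(policy, "recommendations", offers_payment=True))   # True
print(may_access(policy, "advertising", offers_payment=True))       # False
```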
Today’s metaverse and Web3
disruptors
While leading platforms and service providers increasingly bend their strategies
towards the metaverse, more nimble disruptors are drawing large amounts of capital
to build the next generation of Web3-native metaverse experiences. These young
businesses use Web3 protocols to create strong networks of users and owners all
incentivized toward shared outcomes. They’re issuing tokens, generating funds and
membership through NFT virtual goods, and building immersive and interactive
virtual worlds that offer entertainment, goods, and equity. Many are built on the
Ethereum blockchain, enabling portability and interoperability across services.
Arguably, the early advancement of the metaverse has been hampered by too much
hype and critique, unclear definitions, and a tendency to insist on VR and AR as a
precondition. Like the web, the interface to the metaverse should be device-agnostic.
With extensive use cases on mobile devices, for example, augmented reality has
advanced without widespread adoption of AR glasses.
Leaders should examine their businesses and customers, looking for areas where
Web3 protocols can create unique and compelling experiences, enable greater
efficiency, and address regulatory pressures. Some may be able to productize virtual
goods, extend their brands, or offer enterprise services through metaverse experiences.
Some may need to build out their networks, cloud, compute, and storage capacity,
while adding the talent necessary to execute on these next-generation capabilities.30 If
data is being generated exponentially by metaverse interactions, businesses may need
new solutions in place to operate on that data continuously and effectively, while also
adhering to evolving regulatory and compliance regimes. Additionally, businesses
could leverage cryptocurrency and smart contracts to manage finances with much
greater velocity, essentially automating capital and making their money
programmable.31
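A minimal sketch of what "programmable money" could look like, with ordinary Python standing in for an on-chain smart contract; the payee, amount, and delivery condition are hypothetical.

```python
# Hypothetical sketch of "programmable money": a payment that releases itself once
# an agreed condition holds, with plain Python standing in for a smart contract.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ConditionalPayment:
    payee: str
    amount_usd: float
    condition: Callable[[], bool]   # e.g. "goods delivered", "milestone reached"
    released: bool = False

    def settle(self) -> bool:
        """Release the funds automatically if the agreed condition now holds."""
        if not self.released and self.condition():
            self.released = True
        return self.released


delivery = {"confirmed": False}
payment = ConditionalPayment("supplier-123", 5_000.0, lambda: delivery["confirmed"])

print(payment.settle())        # False: condition not yet met, nothing released
delivery["confirmed"] = True
print(payment.settle())        # True: funds released automatically
```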
Businesses may also be burdened by managing and securing user identity, reckoning
with the complexity of so much data, and struggling to turn it into value. As
threatening as it may seem, many leading businesses could be freed up by letting go of
old approaches and working with partner ecosystems to build data management in a
more holistic and agile way. To do so, business leaders will likely need to align more
on standardization and interoperability that support entire markets—and communities
—beyond the existing market leaders.
However, leaders should also carefully consider the pace of adoption and growth,
especially for immersive experiences. Some things may move quickly, and others will
take time. Business leaders should seek to understand how their capabilities and
mission enable them to build in the near term, plan for the midterm, and prepare for
longer horizons. Experimenting with Web3 and metaverse solutions for today’s
problems can lay the foundations for new business models.
There is much more to understand. What changes will come with regulation, and how
will use cases impact networks, semiconductors, software, and consumer devices?
How might media and entertainment evolve? What is the role of artificial intelligence
in amplifying these capabilities, and what is the future of risk and cybersecurity?
Ultimately, this big shift responds to the demands of people, business, and technology
to establish the next foundation for progress. However, such tectonic changes should
be approached thoughtfully, with more societal considerations beyond the goals of
business and the inertia of technology.
A new dawn for European
chips
Europe ramps up its semiconductor industry to
become more self-sufficient
Foreword
The European Union is mobilising over €43 billion to make itself more self-sufficient in semiconductors
this decade.1 This is a critical objective, but there are multiple paths to greater resilience, each of which
involves significant trade-offs. On which semiconductor technologies should Europe focus? Which parts
of the value chain make the most sense for Europe to develop? If factories are built, where will demand
and talent come from?
Our aim with this report is to educate and inform semiconductor buyers, as well as
investors, governments and regulators. The study explains the multiple types of
players and semiconductors that make up this complex and critical industry, Europe’s
current role in it as well as the one it could have in the years to come. The European
Union’s bold goal is to double its share of global production capacity to 20 per cent by
2030 from ten per cent in 2021.2 As the worldwide semiconductor industry is
expected to double its output by 2030, if the EU were to double its share, it would
need to quadruple its semiconductor output.
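The arithmetic behind that statement, using only the rounded shares quoted above:

```python
# Rough arithmetic behind the "quadruple" statement, using the shares quoted above.
eu_share_2021 = 0.10     # EU share of global production capacity in 2021
eu_share_2030 = 0.20     # EU target share by 2030
global_growth = 2.0      # worldwide output expected to roughly double by 2030

required_growth = (eu_share_2030 * global_growth) / eu_share_2021
print(f"EU output would need to grow roughly {required_growth:.0f}x")   # ~4x
```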
Deloitte will look at the possible paths for Europe within the scenario-based
framework developed in 2022 and first published in The Future of the Tech Sector in
Europe.3 The scenarios span from the optimistic but possible, such as European tech
companies achieving trillion-dollar valuations, to the unlikely but not impossible
scenario of failing to create or deploy technology effectively.
One of Europe’s big choices is deciding which generation of semiconductor
technology to focus on. Deloitte believes leading-edge semiconductors will be
important in the future, but chips made through older processes will remain critical to
multiple core European industries. These include transportation, especially car
manufacturing, health care and factories in general.
A second big choice is to determine which parts of the industry Europe should
prioritise, given that no individual country or region can become fully self-sufficient
in all types of semiconductors and parts of the supply chain by 2030.
Finally, Europe will need to find a balance between supply chain localisation and
supply chain diversification. Not everything needs to be in Europe. Places like Japan,
Singapore or the US are trusted and friendly alternatives. However, although this
diversifies supply outside the current over-concentration in China, South Korea and
Taiwan, they are still far away from Europe. This could have profound supply chain
implications.
Most of our focus in this report will be on the EU and the EU Chips Act, which
includes direct funding from the EU and member states (€11 billion in the Chips
for Europe Initiative), with the balance of €32 billion coming from public-private
partnerships.4 However, there are critical non-EU states still within the European
zone: for the tech and semiconductor ecosystems these are notably Norway, Turkey,
Switzerland, Ukraine and the UK.
The future of the European semiconductor
industry: four possible scenarios
European decision-makers must make many choices to optimise the continent’s
semiconductor industry. Decisions will affect both the suppliers of semis and the
major buyers – among them, key industries such as the automotive sector.
Deloitte has created four scenarios for the wider European technology sector (figure
1).
Were this scenario to persist until 2030, that is, with a few specialist companies
concentrated in a handful of Europe’s 44 countries, it may imply that other countries
were unable to nurture companies of a similar scale. Given the time and resources
required to, for example, create, staff and power a fabrication plant, or to develop a
specialist design capability at scale, this scenario is probable in Europe as of the end
of the decade.
In the ‘Cowardly cash cow’ scenario, European companies are major buyers and users
of technology created outside the region. As this report mentions, Europe is a net
importer of semis: it consumes about 20 per cent of the global chip supply but
manufactures only about nine per cent.5 Given the scale of growth in the semis
industry and the pace at which new factories can be built, Europe could remain a net
importer of semis as of 2030, despite the best efforts of the EU Chips Act. All inputs
enable value to be created, and as has been demonstrated over the last couple of years,
semis are fundamental to the automotive sector. European car manufacturers need
these components, but supply may not need to come from European plants, as
discussed in this report.
The final scenario, the ‘Tech desert’, describes an unlikely scenario where there is
little technology supply and constraints on its application, principally because of
regulation. This scenario is not currently applicable anywhere in Europe and is
unlikely as of 2030, particularly given the growing recognition of the strategic
importance of semiconductors. Although European regulators are looking closely at
privacy, artificial intelligence (AI) and smartphone technology, they have not
proposed higher levels of regulation for chips or chipmaking.
The chip shortage caused an estimated global sales shortfall of US$210 billion in 2021 for
the automotive industry alone.11 In the EU, new car sales in 2021 were 9.7 million
vehicles, the lowest level since 1990 and 3.3 million lower than in 2019.12
As of March 2022, chip lead times averaged 26.6 weeks,13 with some being over 52
weeks.14 In May 2022, Intel’s CEO Pat Gelsinger predicted that the shortage would
last well into 2024, making this among the longest chip shortages in history.15
New capacity is being built (see below), but it takes time to develop and scale up.
That said, a combination of new plants and possibly weakening demand in the short
term from higher interest rates, inflation and falling consumer confidence should bring
the market back into balance.
General context
Chips come in many flavours
Some chips are used for complex digital processing or memory: these tend to have
billions of transistors, rely on the latest and most advanced semiconductor-
manufacturing technologies and are fundamental to computers, smartphones and data
centres. Other chips need to handle more power, use different materials, or have circuits
that are less binary: neither simply on nor off, but somewhere in between (analogue or
mixed-signal chips). These work better on older and less advanced manufacturing
technologies. They are found in audio and video equipment, autos, medical devices,
radio and communications and industrial process controllers to control factory
machines.
In 2021, the average chip cost US$0.48, with a volume of 1.15 trillion chips sold and
total revenues of US$556 billion.16 However, there is a massive variation in price per
chip, from over US$1,000 for innovative chips in high-performance computers to a
few cents for a commodity chip with limited functions. Assuming 2026 EU vehicle
sales of about 14 million and average semiconductor content of US$700 per
vehicle,17 EU automotive demand for chips alone may generate nearly US$10 billion
annually.
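That estimate follows directly from the stated assumptions of about 14 million vehicles and US$700 of semiconductor content per vehicle:

```python
# Back-of-envelope check of the automotive demand figure quoted above.
eu_vehicle_sales_2026 = 14_000_000     # assumed EU vehicle sales in 2026
chip_content_per_vehicle_usd = 700     # assumed semiconductor content per vehicle

annual_demand_usd = eu_vehicle_sales_2026 * chip_content_per_vehicle_usd
print(f"EU automotive chip demand: ~US${annual_demand_usd / 1e9:.1f} billion")   # ~9.8
```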
So, multiple generations of chips are produced at any given time, and each has its
specific applications: the newest chips are called advanced node; the many chips made
using older technology are called trailing node; and those made on processes in
between are called intermediate node.
What constitutes advanced and trailing nodes changes over time. Presently, 10
nanometres (nm) and under is generally considered advanced; 65 nm and above is
trailing and 14–45 nm is intermediate. These nodes do not correspond exactly to the
physical size of the chips’ features, but they are currently just agreed-upon industry
shorthand terms for describing manufacturing processes.
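Those shorthand bands can be expressed as a small classifier; the thresholds are simply the rounded figures quoted above, and nodes that fall between the quoted bands are left unclassified.

```python
def classify_node(node_nm: float) -> str:
    """Rough classification using the industry shorthand thresholds quoted above."""
    if node_nm <= 10:
        return "advanced"
    if node_nm >= 65:
        return "trailing"
    if 14 <= node_nm <= 45:
        return "intermediate"
    return "unclassified"   # the quoted bands leave gaps (10-14 nm, 45-65 nm)


for node in (5, 28, 65, 90):
    print(node, "nm ->", classify_node(node))
```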
As well as there being multiple generations of chips, there are also multiple types of
semiconductors. A high-level list includes analogue, connectivity, discrete, DRAM
and NAND memory, general purpose logic, microcontrollers (MCUs),
microprocessors and more. Analogue ICs, MCUs or power management chips are
almost exclusively made at trailing node processes, and smaller isn’t necessarily
better.
Many of the chips that European car manufacturers currently lack are trailing nodes
rather than the logic and memory chips used in computers, data centres and
smartphones. Trailing edge node chips are also used in health care devices and
household appliances. Factories and manufacturing primarily need trailing node chips
as well.
Apple is a fabless chip company that licenses chip architecture from UK-headquartered
ARM. Some companies with fabs both design and build chips and act as a foundry.
Samsung is an example, and Intel is moving toward this hybrid model.
So, the current wave of incentives and investments initiated by governments is more
about preparing for the next shortage, or maybe even the one after that, and remaining
competitive. We don’t know what the next trigger event will be, but there will
undoubtedly be one.
Current context
Geopolitical risks to manufacturing clusters
The manufacturing of chips is highly concentrated in East Asia. In 2020, 73 per cent
of all chip manufacturing was done in four East Asian countries (China, Japan, South
Korea and Taiwan).22 Furthermore, 81 per cent of third-party semiconductor wafer
manufacturing (foundry) was done in just two countries (South Korea and Taiwan).
Taiwan alone accounted for 63 per cent of global foundry capacity.23
The current chip shortage shows how susceptible Europe is to supply disruption. The
concentration of semiconductor manufacturing in Japan, South Korea and Taiwan
poses a geopolitical risk to Europe due to the potential for conflict.
Even North Korea’s short-range Nodong and Puksokgong missiles can easily reach all
Japanese and South Korean semiconductor clusters. Mainland China also continues to
assert territorial claims over Taiwan and has threatened to retake the island by military
force.
The supply impacts of any conflict in the region could include damage (short term or
long term) to or the destruction of fabrication, testing, assembly and warehousing
facilities. Even if the infrastructure were undamaged, a successful Chinese takeover of
Taiwan would almost certainly lead to embargoes or other restrictions on chips
coming from the island.
Europe’s reliance on East Asian chip imports poses a specific geopolitical risk
regarding military capabilities and vulnerabilities, as the Western response to the
Russian invasion of Ukraine illustrates. As part of the sanctions imposed by the West
on Russia, there has been a severe embargo on semiconductors since February, with
Russian imports down by over 90 per cent.24
Semiconductors are vital for a modern military, and Russia has been reduced to using
chips intended for appliances in weapons.25 European armies would be equally
vulnerable to losing access to chips. Further, specific chips (high-power and radiation-
hardened semiconductors) are critical for military applications.
Something must be done!
The European Union has responded to the situation by announcing a €43 billion EU
Chips Act in February of 2022, which is now going through the EU approval process.
But Europe is not the only region responding.
In the US, the US$52 billion US CHIPS Act was introduced in January 2021 and
passed by Congress in July 2022; it received executive approval in August
2022.26 China’s spending is substantially higher. It has been on a journey for over a
decade towards greater semiconductor self-sufficiency and continues to build
domestic capacity. It plans to spend US$1.4 trillion in 2020–2025 on a range of
advanced technologies, including at least $150 billion on semiconductors.27
For the EU, the goal is to become more self-sufficient, as complete self-sufficiency is
unattainable. President of the European Commission, Ursula von der Leyen, said in
her remarks introducing the EU Chips Act: “It should be clear that no country – and
even no continent – can be entirely self-sufficient.”28
This is because of the diversity of semiconductors made by multiple processes – from
multiple semiconducting materials, relying on numerous other inputs (such as
specialised epoxies) and a vast array of manufacturing, testing and assembly
equipment. Sometimes, there is only a single manufacturer or source for a critical part.
Any given plant or cluster can be shut down by drought, earthquake, fire, flood,
military conflict, pandemic, power shortage or typhoon.
Handling China
China’s role in chip production poses different geopolitical risks to Europe. The
country is a significant source of industry concentration in the manufacturing of less
advanced chips. It is trying to become more self-sufficient and catch up to other
regions in making more advanced chips.
Given advanced node chips’ significantly higher value per unit, the EU’s focus on the most
advanced technology is understandable. Some commentators argue that being able to
play at 2 nm is a strategic necessity for Europe.30
Every approach inevitably has risks.31 Advanced node chip fabs cost up to US$20
billion to build. Operating costs are over US$1 billion per annum, and further billions
in ongoing investment over a plant’s lifetime are required to remain state of the art.
For an advanced node fab to be profitable, utilisation needs to average over 90 per cent.
Options to consider
At a minimum, any European semiconductor strategy should acknowledge that intermediate and trailing
node chips will continue to be manufactured for years. They may even be more important for the
European economy, given the mix of large European industries that rely on chips other than
advanced nodes and the relatively small number of European companies that make products that use
advanced node chips.
Any EU strategy should therefore support all nodes rather than focusing only on advanced. More
controversially, the EU may want to reconsider the current focus on having a presence at 2 nm by 2030:
this may consume almost all the supporting funds, leaving little for other important technologies – many
of which are likely to be heavily relied on by European companies.
It is worth noting that the US$52 billion US CHIPS Act allocates US$2 billion specifically to build more
mature technology node fabs to produce trailing and intermediate nodes.32
One further problem is that getting to 2 nm is challenging. While all three major chipmakers (TSMC,
Samsung and Intel) have 2 nm on their roadmap before 2030, there is no guarantee that any will achieve
it by a specific date. Europe will almost certainly need to partner with one (or more) of the big three
players, as it is unlikely that any existing European manufacturer will have the scale and resources to
reach 2 nm by 2030.
If Europe works with just one of the three, and that player struggles with 2 nm manufacturing, the
region could fail to meet its goal. Intel has already announced that it is building more fabs in the EU. It
may be prudent for Europe to deliberately partner with TSMC and Samsung as well, to
mitigate the risk of failing to reach 2 nm.
Samsung is already building an advanced node (exact node unannounced) fab in Texas.
If the US wants to have fabs from all three of the big three, then why can’t Europe?
200 mm wafer versus 300 mm wafer
Chips are made on large but thin wafers of ultra-pure silicon crystal, manufactured
from purified and processed silicon dioxide (usually from relatively pure quartzite
rock) and then sliced, machined, etched and polished. There are three standard
wafer sizes: 150 mm or smaller (first made in 1983), 200 mm (1992) and 300 mm
(2002). The latter is considered state of the art; 450 mm wafers have been proposed
for 2025 but may be deferred even further.
At a high level, a 300 mm wafer contains roughly twice as many dies as a 200 mm wafer:
larger wafers can drive lower prices and increase yields, and most advanced node
processes use 300 mm wafers. But there is no 1:1 correspondence: manufacturers can
make advanced node chips on smaller wafers or trailing node chips on larger wafers.
As an example, the 2021 Bosch fab in Dresden creates 300 mm wafers but 65 nm
node.33
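The "roughly twice as many dies" rule of thumb follows largely from wafer area, as the quick sketch below shows; the die size is an arbitrary illustrative value, and real counts also depend on layout and edge losses.

```python
import math

# Illustrative only: die area is arbitrary, and real yields depend on layout and edge losses.
def approx_dies(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)


die_area = 100.0   # hypothetical 10 mm x 10 mm die
d200 = approx_dies(200, die_area)
d300 = approx_dies(300, die_area)
print(d200, d300, f"ratio ~{d300 / d200:.2f}")   # area alone gives a ratio of ~2.25
```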
All other things being equal, fabs for 300 mm wafers cost much more than those for 200 mm
wafers: about five times as much in terms of initial construction and ongoing costs.
As of 2022, about five per cent of global wafer capacity (not revenues) is 150 mm or
smaller, 42 per cent is 200 mm and 53 per cent is 300 mm – but the latter is growing
faster. By 2024, we expect 59 per cent of capacity to be 300 mm. Europe does have
some production capacity for 300 mm wafers, but it is behind Asia and the US.34
Options to consider
There may need to be a discussion around the right mix of 200 mm and 300 mm wafer
plants in the EU. Although 300 mm is state of the art, there is a premium. It might
make more sense for Europe to build two 200 mm wafer fabs than a single 300 mm
wafer fab for any given node or type of chip.
Localisation vs diversification
One important debate globally around semiconductor self-sufficiency concerns
location. The issue is the extent to which design, manufacturing, materials, talent and
testing need to be located inside a given country or group of countries and which of
those capabilities can be located in a nearby country (near-shoring) or in a country
that is not nearby but that is considered geopolitically safe and reliable (‘friend-
shoring’).
The EU/Europe and the US face the same two challenges to becoming more self-
sufficient in semiconductors. Most of the world’s chips are manufactured far away
and are concentrated in a small set of factories/countries. These problems are even
more acute for advanced node chips and foundry manufacturing.
Building more fabs in the EU/Europe or the US is the obvious answer: this creates
more diverse sources, and the supply chain is much shorter. But full localisation for
EU/Europe is likely expensive and, some argue, impractical.
As the president of the European Commission said in her remarks, “Europe will build
partnerships on chips with like-minded partners, for example, the US or, for example,
Japan.”
For Europe to rely almost entirely on South Korea and Taiwan is unwise. But while
having second or third sources of supply in Japan or the US is good for
diversification, it does little for supply chain length: Silicon Valley, Taiwan and
Tokyo are roughly the same distance from Europe.
Although some chips are shipped by air, lower-value chips and all the machinery
necessary for building and operating fabs tend to travel by sea: as an example, TSMC
chartered an entire box ship to outfit a new fab.35 Diversifying supply to Japan or the
US would result in much longer shipping times for those in Europe: New York to
Rotterdam is eight to ten days, while Yokohama to Rotterdam is 28 days, compared to
one to two days for most intra-European transport by truck or rail.
It is important to note that, to some extent, the existing big three chipmakers may be
thinking of capacity expansion outside South Korea and Taiwan in a binary way. It
may be either Europe or the US: adding capacity in both regions may be unaffordable.
As a 2022 US brokerage report put it, “The demand for silicon is out there and the
capacity needed is there as well. The question is where this capacity will be built – in
the US or in another region … Either US chipmakers build new capacity in the US
with government support or they will take it elsewhere to ‘friendlier’ regions and get
the necessary financial support.”36
As seen in figure 3 (from the EU), the region has several glaring areas of
weakness, with rectangles shaded pink/red.37 The EU consumes about 20 per cent of
chips globally, but the region’s shares of chip design, design tools and IP,
semiconductor manufacturing, and assembly, test and packaging are in the single
digits.
The situation is slightly better for materials and wafers. In terms of manufacturing
equipment, the region’s share is in line with consumption. It is worth noting that some
parts of the supply chain are worth more: the relative added value of each component
is indicated below, with 64 per cent of value-adds coming from chip design and
manufacturing.
Building new fabs in Europe but keeping all of the back-end processes in Asia
lengthens the supply chain. Chips made in Europe would need to be shipped to Asia for back-
end processing, then shipped back to Europe for assembly or to final consumer or
enterprise buyers.
Materials
Europe is not self-sufficient in many raw materials required to make chips. A partial
list of raw and refined/processed materials needed to make various types of chips
includes: argon, enriched isotopes (D2, aka deuterium; boron 11), fluorspar,
germanium, helium, hydrogen peroxide, high-purity solvents (IPA/PGMEA), krypton,
liquid hydrogen, sulphuric acid, tantalum, neon and xenon.
Some of these are not raw materials but ultra-pure versions needed for semiconductor
usage. Neon is abundant, but half of the world’s semiconductor-grade neon comes
from two companies with purification plants in Ukraine.38
Europe is not self-sufficient in many of these, or relies on a handful of sources,
some of which are subject to significant geopolitical or other risks, particularly around
sustainability and environmental impact. Current or future military conflicts could
deprive European fabs of necessary materials.
Most of the semiconductors we usually discuss are made wholly out of silicon. This
material has many virtues. It is especially good at running on minuscule amounts of
current at low voltages. This enables billions of transistors to be concentrated in a tiny
wafer and work for hours powered by a compact smartphone battery. Arrays of
silicon-based semiconductors are placed in data centres without generating excessive
heat.
But sometimes chips capable of handling high voltages and currents are needed for
applications such as solar panels, wind turbines, military, aerospace and (especially)
electric vehicles. Silicon can’t handle high voltages and currents well. New materials
such as gallium nitride (GaN) and silicon carbide (SiC) can run at hundreds or even
thousands of volts and are emerging multibillion-dollar annual markets.
ASML has over 60 per cent of the market for deep ultraviolet (DUV) lithography
equipment and 100 per cent for extreme ultraviolet (EUV) lithography, both
cornerstones of advanced node manufacturing. In addition to
lithography, many other tools are required for deposition, etching and cleaning,
metrology, process control, ion implanting, plus test, assembly and material handling.
In most of these, Europe is generally not self-sufficient.
Design
Europe has a minimal share of the semiconductor design market. One of the most
important and valuable parts of the semiconductor industry is the fabless
semiconductor companies that create the designs for chips that are, in turn, made by
the foundries, such as TSMC and Samsung (and Intel going forward).
In 2020, fabless chip companies made up 33 per cent of global chip sales, up from 13
per cent in 2002.41 As of 2021, Europe’s share of fabless chip company headquarters
was less than one per cent of the global total. This compares to the US at 68 per cent
and Taiwan at 21 per cent (figure 4).42 As mentioned earlier, prioritising a European
foundry might help strengthen this sector.
Substrates
Europe currently has minimal capacity to make substrates. Increasing domestic chip-
manufacturing capacity is necessary. For the chips to function, they need to be placed
on substrates, which connect the chips to the circuit boards.
As of 2022, Europe has minimal capacity to make advanced IC substrates that support
the most advanced package types, such as flip chip ball grid array (FCBGA) or flip
chip chip scale package (FCCSP), found mainly in higher-end products. Europe has
limited capacity to manufacture even lower-end wired bonded substrates. Building
high-volume advanced substrate manufacturing facilities would likely cost US$1
billion each, and more than one would be needed.
Options to consider
It is unlikely that the EU or Europe can become self-sufficient across all areas.
But there should be discussion around which are the more or less critical areas, which
can be built in Europe versus in reliable partner nations such as Japan or the US, and
whether Europe should create strategic reserves of vital raw materials or
equipment/tools.
Aside from manufacturing equipment (and, to a lesser extent, material and silicon
wafers), the EU is currently well below self-sufficiency (about 20 per cent of the
global market) in most areas of the semiconductor supply chain. In figure 3, boxes in
green are where the EU is punching at or above its weight, while those in red show
areas where the EU is underperforming relative to its 20 per cent share of global chip
consumption.
At a high level, there are good historical reasons why the actual fabrication, assembly
and testing of semiconductors tend to be geographically concentrated. Whether that
was a small region around the San Francisco Bay in the 1970s and 1980s (hence,
Silicon Valley) or East Asia today, putting plants, inputs, and outputs as close to each
other makes the process of making chips faster, more efficient and less expensive for
buyers and more profitable for manufacturers.
However, when it comes to design tools and IP, and (especially) chip design, there are
fewer compelling reasons for clustering: they can be located anywhere on the planet
… why not Europe? Europe would likely find it easier to succeed by focusing on
increasing its share in these areas rather than focusing too much on wafer
manufacturing and ATP. However, for Europe to realise its full potential in chip
design, it needs to get better at scaling. From our companion article How Europe Can
Ride the Tech Rollercoaster: “Europe’s tech investors will need to grow beyond early-
stage investments, while its tech companies will need the ambition and willingness to
super scale, rather than pursue an early exit and start over. Regulators can help by
supporting the development of ecosystems, talent pools with the skills to scale and
Europe-wide funds, rather than country-specific funds.”49
In terms of talent, Europe will likely need to both import talent and develop more
internally. Rules around visas, immigration and exchange programs may need to be
adjusted. It will also likely require intense cooperation between governments at many
levels and schools, universities and vocational institutions. Much of the talent needs to
be equipped with STEM or technical expertise, but the talent mix within semis is
evolving, and there is increasing need for software skills rather than the materials and
engineering skills prized historically.
ESG, the semiconductor industry and
localisation
Increasing semiconductor manufacturing raises questions concerning
environmental, social and corporate governance (ESG), which will also need to
be considered as part of the EU’s plans.
Global chip manufacturing will cause about 0.1 per cent of global greenhouse gas
(GHG) emissions in 2022. That number is projected to grow to 1.5 per cent by 2030 if
the industry doesn’t act.50 Further, the industry also uses a lot of water, produces
waste and relies on multiple materials whose extraction has environmental impacts.
Individual companies and industry associations are working hard to do better on all
counts.
But localisation and diversification of semi manufacturing and supply chains mean
some complex trade-offs are required. In short, the likely impact of the EU Chips Act
on ESG is neither purely positive nor negative, but rather a mix of the two.
In good news, making more chips in European countries with low GHG emission
intensity (France, Luxembourg and Sweden are all under 65 grams of CO2 equivalent
per kilowatt hour)51 and fewer chips in a country where more than 60 per cent of
electricity generation comes from coal is an ESG win.52 Equally, because chip
workers in Europe generally have higher wages and better working conditions,
making Europe stronger in chip making and all parts of the supply chain would be a
positive social achievement. Finally, shipping chips and equipment over shorter
distances will further reduce the carbon footprint of chips consumed in Europe.
But there’s bad news too. Building and operating new chip plants consumes many
resources: energy, water and carbon-emitting concrete. If additional capacity is
added in Europe (and in the US and elsewhere) to reduce industry concentration and
supply chain risk, it is a virtual certainty that industry utilisation will fall from
the current (and already unsustainable) 95 per cent level.53 That’s a good
thing in some ways, but – over time – if industry utilisation falls too far, the industry
will become less efficient and more wasteful.
One complicating factor is the EU’s reliance on Russian gas: semi manufacturing is
energy intensive and requires ultra-reliable energy sources.
Options to consider
In line with an overall improvement in ESG factors for the chip industry, Europe will
want to optimise the Chips Act and its goal of becoming more self-sufficient in chip
manufacturing. That said, given the need for electricity in manufacturing, the benefit
of moving more production from ‘less green’ countries today to a ‘greener’ Europe
suggests that there will be a significant net ESG benefit to the EU Chips Act.
Concluding thoughts
In our four scenarios for the future of tech in Europe, the most desirable is to have a
trillion-dollar tech company headquartered in Europe. The ‘Tech desert’ is the least desirable,
while the ‘Greatness divided’ option is a good outcome. The EU Chips Act and
associated strategies won’t guarantee a trillion-dollar company or avoid a tech desert –
but they help. The cliché that ‘chips are the new oil’ is overused, but it is a cliché for a
reason.
Reducing Europe’s chip dependency is critical to ensure that European industries that
rely on chips (which will increasingly be all of them) have ringfenced supply during
future shortages. Making sure that Europe is the site of at least some leading-edge
fabs and other critical parts of the supply chain (such as ATP and chip design) is also
of paramount importance.
The EU Chips Act is an essential beginning to all these goals. Much remains to be
done, choices must be made, partnerships with other countries will be necessary, and
talent needs to be developed or imported. But being stronger and more resilient in
chips means a stronger and more resilient tech sector in Europe, which means a
stronger and more resilient Europe economically, competitively, militarily and
politically.
In 2023, the aerospace and defense (A&D) industry witnessed a revival in product demand. In the
aerospace sector, domestic commercial aviation revenue passenger kilometers surpassed prepandemic
levels in most countries.1 This surge in air travel led to an increased demand for new aircraft and
aftermarket products and services. In the US defense sector, new geopolitical challenges, along with the
prioritization of modernizing the military, drove robust demand in 2023, particularly for weapons and
next-generation capabilities.2
The demand for A&D products and services is expected to continue into 2024. On the commercial side,
travel is likely to continue its upward trajectory. In the defense segment, demand for products is
expected to continue to increase as geopolitical instability grows. Furthermore, companies in emerging
markets, such as advanced air mobility, are expected to advance testing and certification as they
prepare for commercialization.