Quantum Computing in Five Easy Pages

David Tennenhouse, Ittai Abraham, Jayneel Gandhi, Parikshit Gopalan, Rob Johnson, David Ott & Benny Pinkas
VMware, Inc.
July 2017 (For Internal Use Only)

What is quantum computing – and what isn’t it?


Quantum computing is a model of information and computation that builds on the quantum mechanical
view of atomic-scale systems and especially on three of its properties¹:
 It is not possible to know the values of all of the properties of a system at the same time.
 Those properties that are not known with precision must be described by probabilities.²
 Measuring devices are essentially classical devices, and measure classical properties.
In classical computing, we compute on bits that are either zero or one -- and a register of "n" bits can
represent one of 2^n values. For example, a three-bit register can have one of the values 000, 001, 010,
011, 100, 101, 110 or 111.
In quantum computing, information is stored and manipulated in qubits that, so long as they are not
measured, exist in a continuum representing the probabilities of the qubit being zero or one (subject to
some constraints). The state of the qubit is the superposition of the two possible states. Taking that a
step further, a 3-qubit register can simultaneously model the probabilities of each of the 8 states that a
3-bit classical register could take on, i.e., the superposition of the 2^n possible states.
To a limited degree, the quantum register appears to be 2^n times more expressive than the classical
register. In the earliest days of quantum computing this observation generated huge excitement. Some
pundits erroneously suggested that a quantum computer could solve complex puzzles, such as breaking
cryptography, by simultaneously testing all of the possible answers and magically extracting the one
that "works". It would have meant that a quantum computer could perform 2^n computations using the
same number of operations (though these would be quantum operations) a classical computer would
require to perform a single computation.
The catch, of course, comes from the third of the above properties. When we measure the state of a
qubit register it will collapse to a classical value that is probabilistically determined (based on its
quantum state prior to collapse). For example, in the case of our 3-qubit register, we will only be able to
extract one of the eight classical values, which is far less exciting than some had hoped for. For an n-bit
crypto puzzle, we can't extract 2^n outputs, i.e., we can't simultaneously try all of the 2^n possible keys.
On the positive side, the probability of any one value being extracted is determined by the state of the
qubit when it collapses. That is still quite exciting from a computational (and crypto) perspective, since
we can perform a series of quantum operations on the register prior to measuring it. If we are really
clever in how we go about things, then we can take on some interesting challenges.
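To make the superposition/collapse distinction concrete, here is a minimal sketch (our own illustration, in Python with numpy; the specific amplitudes are invented for the example and do not come from any real device). It represents a 3-qubit register as a vector of 8 amplitudes and simulates a measurement collapsing it to a single classical value:

```python
import numpy as np

# A 3-qubit register is described by 2**3 = 8 complex amplitudes,
# one per classical outcome 000..111, normalized so probabilities sum to 1.
amplitudes = np.array([0.5, 0.5, 0, 0, 0, 0, 0.5, 0.5], dtype=complex)
assert np.isclose(np.sum(np.abs(amplitudes) ** 2), 1.0)

# Before measurement, the state carries a probability for all 8 outcomes...
probabilities = np.abs(amplitudes) ** 2

# ...but a measurement collapses it to ONE classical 3-bit value,
# chosen probabilistically. That one value is all we can ever extract.
outcome = int(np.random.choice(8, p=probabilities))
print(f"measured: {outcome:03b}")
```

Running this repeatedly yields different 3-bit strings with the stated probabilities; no single run reveals the full amplitude vector.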

Can Quantum Computers be built – at interesting scales?


We don’t yet know the answer to this question. It is possible that we will have substantive affirmative
evidence in the coming 2-18 months as both IBM and Google have planned attempts to build systems
of ~48-50 qubits. Although the really interesting scale for quantum computers (QCs) is ~2000 qubits,
researchers believe that many key properties will be demonstrated (or not) with these 48-50 qubit
prototypes.³ If they are successful, it will then take another 5-10 years to build full-scale systems.
The challenge in building practical QCs is noise, which leads to errors. Amongst other things, real-world
noise causes qubits to lose their coherence, i.e., their state collapses. So, one challenge is to build
qubits that retain their coherence for long enough to perform some number of operations on them with
an acceptably low rate of errors. Another challenge is to build qubits/registers in such a way that the
noise grows sub-linearly with the size of the register. Under certain assumptions around the noise
model it may be possible to utilize error correction. In that case, the challenge won’t be to eliminate
errors entirely, but to get the underlying rate low enough that the gain from error correction is less than
its cost. On the pessimistic side, some argue that physics precludes the realization of QCs, i.e., that at
the atomic scale the real world is fundamentally too noisy to build QCs with large numbers of qubits.
So, the coming months should be very exciting. Although the pace of recent progress at ~17 qubits
suggests Google and/or IBM will be successful, it is entirely possible that these experimental
researchers are in a headlong run into an atomic scale brick wall. We live in interesting times!
Finally, the hardware being developed to implement QCs might prove useful even if it does not
successfully overcome noise sufficiently to implement quantum computation. These systems could
instead be used to implement the quantum equivalent of the analog computers used in the 1950’s and
1960’s, i.e., systems that can be configured to perform a wide range of very large scale simulations,
albeit in the analog domain.
Why has the experimental pace accelerated?
Since QCs leverage atomic-scale effects, progress in creating nanoscale materials, devices, etc. has
greatly increased the ability of technologists to implement multi-qubit designs. The tapering of Moore's
Law, which had long dictated the pace at which classical computing improved, is likely another
accelerant, in the sense that this tapering has freed up some of the capacity of computation-thirsty
companies, such as Google, to explore and consume this new technology (vs. classical computing).
What technologies are being used to build Quantum Computers – and by whom?
For a long time to come, QCs are likely to be custom accelerators attached to classical computers, in
much the way that floating point processors were initially introduced. Perhaps more importantly, a
number of the key hardware developers are attaching their systems to the cloud, making them
accessible to a wide range of network-based users.
Since all forms of matter (and interactions with light/energy) exhibit quantum behavior at an atomic
scale, a multiplicity of approaches are being pursued:
 Google and IBM are both investigating the use of superconducting materials, creating
customized chips that are presumably cooled to millikelvin temperatures.
 Microsoft is leading the charge on a more radical approach known as topological qubits that is
further from realization but is believed to be fundamentally more resistant to noise/errors.
 Intel’s research partners are exploring the use of silicon in conjunction with quantum dots4.
 Others are exploring trapped ions, single electrons, photons, fermions, cold atoms, etc..
 There are numerous startups, the most prominent of which is D-wave, who have built a very
specialized type of processor, even by QC standards, whose goal is to implement quantum
annealing, i.e., the use of quantum properties to search for the global minimum of a function.
 It is reasonable to assume that governments are having systems built for their own use.
What is quantum supremacy?
Today, Google and IBM are hot on the quest to “bring evidence” of quantum supremacy. Supremacy is
the term used to express the notion that a QC should ultimately be able to compute everything that a
classical computer can and do so in no more operations in all cases and in far fewer operations in many
cases of interest. So, it is at least as good in all cases and much better in some.
Of course, this only speaks to the number of operations and not their cost or speed. Even if the noise
issue discussed above can be overcome, it might never be the case that a quantum computer will
economically outperform a classical computer on the problems that are addressed today. On the other
hand, if quantum computers of ~2000 qubits can be built then they will be able to solve problems that
are beyond the reach of classical computers. In the meantime, researchers plan to use ~50 qubit scale
prototypes to “bring evidence” that their work can lead to quantum supremacy.
How much better could Quantum Computers be – on “typical” problems?
Although QC’s are not the Nirvana some once hoped, the development of a handful of key algorithms
makes it clear that, if they can be built, they will have real and substantive impact on problems that are
large enough that they are beyond the reach of classical computers.
One algorithm that appears to have broad applicability is Grover’s algorithm, which is loosely related to
searching a database (or table) of N entries for a single entry that matches a key. A classical computer,
which must check every entry, will require on the order of N steps. Of course, if we had the mythical
computer some had hoped for we would be able to check every entry in parallel, i.e., in a single step.
While not as magical, a QC using Grover's algorithm can complete the search in √N steps which, for
large N, is an enormous improvement over classical computers. Furthermore, a wide range of
interesting problems can be transformed into problems addressable through this algorithm.
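To give some intuition for where the √N comes from, here is a toy classical simulation of Grover-style amplitude amplification (our own sketch in Python/numpy, not from any QC toolkit; the size and marked index are invented for illustration). Each iteration flips the sign of the matching entry's amplitude and then reflects all amplitudes about their mean; after roughly (π/4)√N iterations the match dominates the measurement probabilities:

```python
import numpy as np

N = 256          # "database" size (2**8)
marked = 123     # index of the single matching entry

# Start in the uniform superposition: every entry equally likely.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.pi / 4 * np.sqrt(N))  # ~ (pi/4)*sqrt(N) = 12 for N = 256
for _ in range(iterations):
    state[marked] *= -1                   # oracle: flip sign of the match
    state = 2 * state.mean() - state      # diffusion: invert about the mean

print(iterations, "iterations; P(marked) =",
      round(float(state[marked] ** 2), 5))  # close to 1
```

Only ~12 iterations are needed for 256 entries, versus up to 256 classical checks -- the quadratic gain the text describes.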
To put the benefit of √N speedup in perspective, brute force search of a database with a billion entries
would take order 2^30 steps with a classical computer but only about 2^15 steps (plus error correction,
etc.) for a quantum computer. Since today's largest classical computers can undertake such searches,
this particular example might not justify building a quantum computer -- but what if the problem is only
a "little" bigger, e.g., a database with 2^48 entries or even 2^64 entries? This is where quantum supremacy
comes into play, i.e., problems of that scale will be well beyond the reach of classical computers,
making quantum computers attractive even if they are burdened with a high cost and latency per
operation.
On the proverbial third hand, the bulk of the practical problems we use computers for today are not of
the scales where supremacy comes into play. Even if we wanted to search tables of such enormous
scale, it is likely that they would be sparse in some ways that classical systems could leverage and/or
that we would find heuristics to wrangle them down to manageable size. For example, while training
large deep neural nets (DNNs) might appear to exceed the capacity of classical computation,
researchers are finding ways to prune those networks so as to reduce the degree of computation that is
actually required. So, while some researchers believe that the evaluation of DNNs may be a candidate
for quantum computation, it is not at all clear it will be a game-changer.
One class of applications that is of quantum supremacy scale is the atomic (or sub-atomic)
simulation/modelling of physical systems, e.g., in chemical engineering and materials science. In fact,
Feynman is often credited with having postulated that quantum computers would be required in order to
model quantum mechanical systems.
It is also worth noting that, even if quantum computers can't be built as envisaged, problems such as
the simulation of physical systems and/or the training of DNNs might benefit from the quantum-scale
analog computing approach previously described.

Are there problems and/or algorithms where Quantum Computers could do better than √N speedup?
Yes, there are -- and, in a remarkable quirk of mathematics, one such algorithm addresses factoring, a
problem that has been highly resistant to attack by classical algorithms.
Shor’s algorithm can be used to factor an integer N, which can be represented with log(N) bits, using on
the order of log(N)2 quantum operations. This is amazingly better than the best known classical
factoring algorithms which require an (almost) exponentially larger number of operations.
Factoring is so hard for a classical computer that we have to come somewhat close, at least in
complexity terms, to trying all of the possible answers. So, let’s go back to thinking about the mythical
quantum computer that could do just that. Intuitively, the problem with its realization is that, prior to
measurement, all of the answers would be superimposed on each other within the many-qubit register,
i.e., they would all be screaming out for attention at the time of collapse and we would be unlikely to
measure the "right" answer.
But what if we could apply quantum operations in some way that causes the answers that are not “right”
to cancel each other out, thereby increasing the probability of the "right" answer being revealed by our
measurement? It turns out that the quantum Fourier transform (QFT) algorithm can do just that for
certain types of problems, i.e., those that have an underlying periodicity to them.
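For intuition, here is an illustrative sketch (ours, not from the paper; it uses numpy's classical FFT as a stand-in for the QFT, and the register size and period are invented). A state whose amplitude sits only on every r-th basis state transforms into one whose probability piles up near multiples of N/r, so a measurement reveals information about the period rather than any single answer:

```python
import numpy as np

n, r = 8, 5
N = 2 ** n                      # register with 2^8 = 256 basis states

# Amplitude only on basis states 0, r, 2r, ... -- a period-r superposition.
state = np.zeros(N, dtype=complex)
state[::r] = 1
state /= np.linalg.norm(state)

# The classical Fourier transform plays the role of the QFT here:
# amplitudes interfere so probability concentrates near multiples of N/r.
transformed = np.fft.fft(state) / np.sqrt(N)
probs = np.abs(transformed) ** 2

top = np.argsort(probs)[-r:]    # the r most likely measurement outcomes
print(np.sort(top))             # near 0, N/r, 2N/r, ... (N/r = 51.2 here)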
Shor found enough structure in the factoring problem to effectively reduce it to solving such a problem.
The QFT doesn’t solve the factoring problem in its entirety but it reduces it to a point where a more
tractable, though still large, number of “educated guesses” can be applied to find the answer. So, in
theory, a combination of quantum and classical computing can be used to dramatically reduce the
complexity of factoring.
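The classical half of that combination can be shown concretely. In this toy sketch (our own, in plain Python; the brute-force period search stands in for the quantum QFT step, and N = 15 is chosen only because it is small), knowing the period r of f(x) = a^x mod N lets ordinary gcd arithmetic expose the factors of N:

```python
from math import gcd

N, a = 15, 7                     # factor N using base a (gcd(a, N) must be 1)

# "Quantum" step, simulated by brute force: find the period r of a^x mod N.
r = 1
while pow(a, r, N) != 1:
    r += 1                       # here r = 4

# Classical post-processing: if r is even and a^(r/2) != -1 (mod N),
# then gcd(a^(r/2) +/- 1, N) yields nontrivial factors of N.
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    x = pow(a, r // 2, N)        # 7^2 mod 15 = 4
    print(gcd(x - 1, N), gcd(x + 1, N))   # -> 3 5
```

On a real quantum computer only the period-finding step runs on quantum hardware; everything else is conventional arithmetic.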
If Shor’s algorithm can slay the factoring dragon and Grover’s algorithm can yield quadratic gains at
large, then there is cause to hope (or fear, depending on one’s perspective) that other quantum
algorithms of equal importance remain to be discovered. On the other hand, those algorithms were
discovered in 1994 and 1996, respectively, and remain the high water marks of the space. While 20
years is the blink of an eye in terms of mathematics, the sparseness of high impact algorithmic results is
disconcerting to many quantum computing advocates.

What are the implications for Classical Cryptography?


Although it appears sweeping, Grover’s algorithm has surprisingly limited implications for many
symmetric / shared key cryptographic schemes, such as AES & SHA. These schemes are different from
the business and scientific problems we typically address with classical computing in that they are
intentionally designed to be hard problems, i.e., we deliberately encrypt messages with fairly
large/lengthy keys. Although Grover's √N speedup is huge for most problems, it can be readily offset
by simply doubling the length of the keys (which squares the number of operations required). This is an
inconvenience but not out of reach. So, shared key systems with longer keys may remain safe until
QCs with many more qubits can be built – or until a new algorithm is discovered.
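A back-of-the-envelope check of that claim (our own arithmetic, not from any official guidance): Grover reduces a brute-force search over 2^k keys to roughly 2^(k/2) operations, so doubling k restores the original work factor:

```python
# Grover turns a 2**k key search into ~2**(k/2) quantum operations.
for k in (128, 256):
    print(f"{k}-bit key: classical 2**{k}, Grover ~2**{k // 2}")
# 128-bit key: classical 2**128, Grover ~2**64   (uncomfortably small)
# 256-bit key: classical 2**256, Grover ~2**128  (comfortable again)
```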
The same is not true for today's asymmetric key exchange schemes, i.e., those such as RSA that are
used to enable public-key cryptography. They are much more subject to attack by quantum computers,
since they depend on the difficulty of factoring or on other problems that are vulnerable to Shor's algorithm.

Are there quantum-resistant crypto schemes?


Some approaches to encryption are believed to be fundamentally resistant to speedups enabled by
quantum computing, and work is underway to create crypto standards for some of them.
An outstanding question is how enduringly hard the mathematical problems underlying them prove to
be, i.e., might a new standard believed to be quantum-resistant suddenly come under attack as a result
of an algorithmic discovery? When factoring was chosen as the inspiration for RSA, it represented a
problem that had been worked on for hundreds of years, and so, there were empirical reasons to
believe that the problem was hard and the pace of improvement would be relatively slow. To date, this
line of reasoning has proven correct, i.e., factoring has proven to be robust to classical computation. It
is not clear what the basis will be for selecting a quantum-resistant crypto standard, let alone what the
basis will be for claiming that it is robust to algorithmic innovation.

What are the near/moderate term implications for our customers?


Quantum computers of sufficient complexity to crack classical cryptography are not believed to exist, let
alone be in use, today. Nonetheless, the possibility that they will come into use over the next decade –
and especially the possibility that access to them will be made widely available via public clouds —
poses two significant and relatively near term challenges to our customers:
 Some confidential information, such as medical records, is long-lived in nature. Accordingly,
customers should take steps to guard against information that is encrypted with today's best
practices being collected today so that it can be decrypted at some point in the future. This is
particularly true of large collections of such information, such as backups of entire databases.
 The modernization of cryptographically protected systems, in terms of both their software and
their protected data, can be a lengthy process. At an organizational level, especially within
governments, this process can span 1-2 decades. Accordingly, customers should take action
now to plan for the modernization of such systems and for the transition of all data to them.
Given how widely crypto and crypto-signatures are now used, these challenges are akin to a rerun of
Y2K – but with an uncertain deadline! The NSA has provided guidance⁵ related to the evolution of
cryptographic systems during the current pre-quantum era. For shared key systems (AES/SHA), the
guidance amounts to recommendations around the use of longer keys. For public key systems (RSA),
the guidance is also to adopt longer keys, but with the caveat that even longer keys will only be
resistant to early generations of quantum computers and those systems will eventually need to be
replaced with alternatives based on TBD quantum-resistant cryptography.

What near/moderate term actions should VMware be taking with respect to Cryptography?
 Create an inventory of cryptography used in VMware products and a timetable for the adoption
of the NSA’s guidance regarding key lengths.
 Investigate the modernization plans for cryptography sourced from third parties and/or open
source software; partner/participate in those plans as appropriate.
 Plan to incorporate crypto agility into our products, i.e., the ability to update the crypto
algorithms, protocols and key lengths of deployed software.
 Take steps to ensure that shared keys used to protect long term customer data (e.g., archival
backups, data-at-rest, etc.) cannot easily be obtained by collecting messages encrypted with
public keys in the near term and quantum decrypting them in the future.
 Participate in NIST and the efforts of other governments and standards bodies to create
quantum-resistant systems, especially public key and/or key distribution systems.
 Investigate opportunities for VMware to create and operate a quantum-resistant key distribution
system/infrastructure on behalf of its customers. Note that this would likely not be a quantum
key distribution (QKD) system (see appendix).

What other actions should VMware be taking with respect to Quantum Computing?
 Monitor the work of researchers attempting to build quantum computers. Calibrating their
timelines so that we and our customers don't invest either too early or too late will be very
important. The results of the near-term Google and IBM efforts to demonstrate quantum
supremacy are likely to be loaded with caveats and will require careful analysis/interpretation.
 Stay abreast of developments on quantum algorithms and novel applications of quantum
computing. Gain a modicum of familiarity with programming some of the early prototypes
accessible via the cloud. This could be a good venue for intern projects.
 Update this document every 6 months to reflect changes in the above.

Acknowledgements and Disclaimers


Many members of the VMware research group contributed to this brief introduction, which omits key
technical details, draws on numerous WWW-based sources and is subject to change as the state of
quantum computing evolves. Errors introduced as a result of numerous simplifications and omissions
are solely the fault of the first author.

Appendix: What is Quantum Key Distribution?
A longstanding challenge for practical cryptographic systems is the generation and exchange of shared
keys. Once a key has been shared by the communicating parties, symmetric encryption can be used to
secure their communications – but what channel can they use to exchange that key?
Quantum Key Distribution (QKD), which is very different from quantum computing, leverages selected
properties of quantum mechanics to allow a shared key to be generated/exchanged with the assurance
that it cannot be intercepted without detection. For example, polarization can be used to encode
information within an entangled pair of photons. If one photon from each pair is sent to each of the
communicating parties, then their quantum properties will ensure that the parties both arrive at
equivalent results when they measure the polarization of their individual photons, and that no other
party can have attempted to make such a measurement without being detected.
Although the actual protocols followed are considerably more elaborate, multiple approaches to QKD
have been demonstrated by government/university researchers, startups and larger companies.
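To give a flavor of how interception is detected, here is a toy simulation (our sketch in Python/numpy; it models a simplified BB84-style prepare-and-measure exchange rather than the entangled-photon variant described above, and real protocols add authentication, error correction and privacy amplification). An eavesdropper who measures in random bases disturbs the photons, which shows up as errors when the parties compare a sample of their sifted key:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

def run(eavesdrop: bool) -> float:
    alice_bits = rng.integers(0, 2, n)
    alice_bases = rng.integers(0, 2, n)       # 0 = rectilinear, 1 = diagonal
    photons = alice_bits.copy()

    if eavesdrop:                             # Eve measures in random bases;
        eve_bases = rng.integers(0, 2, n)     # a mismatched basis disturbs
        wrong = eve_bases != alice_bases      # the photon (modeled here as
        photons[wrong] = rng.integers(0, 2, wrong.sum())  # re-randomizing it)

    bob_bases = rng.integers(0, 2, n)
    bob_bits = photons.copy()
    wrong = bob_bases != alice_bases          # wrong basis -> random outcome
    bob_bits[wrong] = rng.integers(0, 2, wrong.sum())

    keep = bob_bases == alice_bases           # "sifting": keep matching bases
    return float(np.mean(alice_bits[keep] != bob_bits[keep]))

print("error rate, quiet channel:", run(False))   # ~0.0
print("error rate, with Eve:     ", run(True))    # ~0.25 -> Eve is detected
```

A quiet channel yields matching sifted keys, while interception raises the observed error rate to roughly 25%, alerting the parties before the key is ever used.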
VMware should:
 Stay abreast of international standardization efforts and ensure that their use cases,
technologies and IP regimes are not unfriendly to VMware’s business interests.
 Cultivate awareness of the practicality and economics of QKD, including awareness of key
barriers to deployment and the status of efforts to overcome them.
 Monitor and anticipate the adoption of QKD, especially amongst our government and critical
infrastructure (e.g., financial industry) customers.

Notes
¹ These are simplified/paraphrased from the so-called Copenhagen interpretation.
² Although classical probabilities are only positive, the quantum amplitudes that determine probabilities
can have negative and/or imaginary components. This enables interesting operations, e.g., amplitudes
can cancel each other out when they are summed.
³ Some teams claim to have already exceeded this scale, but with systems that are more "analog" in nature.
⁴ Nanoscale dots of material that are sufficiently small that they exhibit many of the properties of individual atoms.
⁵ https://www.iad.gov/iad/programs/iad-initiatives/cnsa-suite.cfm and
www.iad.gov/iad/library/ia-guidance/ia-solutions-for-classified/algorithm-guidance/cnsa-suite-and-quantum-computing-faq.cfm