
CHAPTER 1

INTRODUCTION

1.1 BACKGROUND

Cryptography plays a crucial role in securing online communication, in addition to protecting the integrity of cars and implanted medical devices. However, the advent of large-scale quantum computers poses a significant threat to many widely used cryptosystems, rendering them susceptible to complete compromise. In response, post-quantum cryptography operates on the premise that potential attackers possess powerful quantum computers, and it aims to maintain security under such circumstances. This emerging field of research has achieved notable successes by identifying mathematical operations for which quantum algorithms offer minimal speed advantages, and then building cryptographic systems around those operations. The primary challenge in post-quantum cryptography lies in satisfying the demands for cryptographic usability and flexibility without compromising confidence in security.

Until the 1970s, the conventional method for sharing private information involved meeting in
person to exchange a secret key, which served the dual purpose of encrypting and decrypting
messages—a practice known as a symmetric key algorithm. The security of the messages
relied on the confidentiality of this shared key.

In 1977, three scientists, Rivest, Shamir, and Adleman, made a significant breakthrough in encryption. This groundbreaking method, now recognized by their initials as RSA, operates as follows.

Every person has two very large prime numbers, all their own, which they keep secret. They multiply these numbers together to get an even bigger number, which they make public for everyone to see. If we want to send someone a private message, we use their big public number to garble our message, and we garble it in such a way that it is impossible to ungarble without knowing the two prime factors that made that number. This is an asymmetric key system, since different keys are used to encrypt and decrypt the message. So it's easy for our intended recipient to decode, but practically impossible for everyone else, unless they can factor that large public number.

Now, someone could try to factor it using a supercomputer and the best-known factoring algorithm, the General Number Field Sieve, but modern cryptography uses prime numbers that are around 313 digits long. Factoring a product of two primes this big, even with a supercomputer, would take around 16 million years, but not on a quantum computer. In normal computers, a bit can only be in one state at a time, either a zero or a one. So if we had two bits, they could be in one of four possible states: 00, 01, 10, or 11. If we take each of these states to represent a number (0, 1, 2, or 3) and we want to do a calculation, for example raising seven to the power of one of these numbers, we can only do it for one state at a time, in this case seven squared, and so we get the answer 49.

Quantum computers consist of qubits, which also have two states, zero and one. But unlike a classical bit, a qubit doesn't have to be in just one state at a time. It can be in an arbitrary combination of those states, a superposition of zero and one. So if we have two qubits, they can exist simultaneously in a superposition of 0, 1, 2, and 3. When we repeat the same calculation, it will be performed for all of those numbers at the same time, and we're left with a superposition of the different answers: 1, 7, 49, and 343. If we add another qubit, we double the number of possible states. So with three qubits we can represent eight states, and thus perform eight calculations at once. Increase that number to 20 qubits, and we can already represent over a million different states, meaning we can simultaneously compute over a million different answers.
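The scaling claimed in this paragraph is easy to verify directly (plain Python, exact integers):

```python
# Each added qubit doubles the number of representable basis states.
qubits = 20
states = 2 ** qubits
print(states)               # 1048576 -- over a million, as stated

# With 300 qubits the state count exceeds the roughly 10^80 particles
# in the observable universe.
print(2 ** 300 > 10 ** 80)  # True
```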

With 300 qubits, we can represent more states than there are particles in the observable universe. This sounds incredibly powerful, and it is, but there is a catch: all of the answers to the computation are embedded in a superposition of states, and we can't simply read out this superposition. When we make a measurement, we only get a single value from the superposition at random, and all the other information is lost. In order to harness the power of a quantum computer, we need a smart way to convert a superposition of states into one that contains only the information we want. This is an incredibly difficult task, which is why for most applications quantum computers are useless.

So far, we've only identified a few problems where we can actually do this, but these are precisely the problems that form the foundation of nearly all the public key cryptography we use today.

In 1994, Peter Shor and Don Coppersmith figured out how to take a quantum Fourier transform. It works just like a normal Fourier transform: apply it to some periodic signal, and it returns the frequencies that are in that signal.

Now, this may not seem particularly interesting, but consider this. If we have a superposition of states that is periodic, that is, the terms in the superposition are separated by some regular amount, we can apply the quantum Fourier transform and be left with states that contain the frequency of the signal, which we can measure. The quantum Fourier transform allows us to extract frequency information from a periodic superposition, and that is going to come in handy.

How does a quantum computer factor the product of two primes much faster than a
conventional computer?

Let's say we have a number N which is the product of two primes p and q; let's set N equal to 77. Pick a number g that doesn't share any factors with N. If we multiply g by itself over and over and over, we will always eventually reach one more than a multiple of N. In other words, we can always find some exponent r such that g to the power of r is one more than a multiple of N.

Let's see how this works by picking a number smaller than 77, for example eight. This number doesn't share factors with 77, and if we were doing this with big primes, it would also be extremely unlikely that we just happened to pick a number that shares factors with N.

Multiply eight by itself once, twice, three times, four times, and so on, raising eight to ever higher powers, and then divide each of these numbers by 77. We're not interested in how many times 77 goes into the number, just the remainder, because at some point 77 should divide one of these numbers with a remainder of exactly one. So eight divided by 77 is zero with a remainder of 8; 64 divided by 77 is zero remainder 64; 512 divided by 77 is six remainder 50. And as we keep going, we get remainders of 15, 43, 36, 57, 71, 29, and finally one.
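This loop can be reproduced in a few lines of Python; the built-in three-argument pow does the modular exponentiation:

```python
# Raise g = 8 to successive powers and keep only the remainder mod N = 77,
# stopping when the remainder is exactly 1.
N, g = 77, 8
remainders = []
exponent = 0
while True:
    exponent += 1
    remainders.append(pow(g, exponent, N))  # g**exponent mod N
    if remainders[-1] == 1:
        break
print(remainders)  # [8, 64, 50, 15, 43, 36, 57, 71, 29, 1]
print(exponent)    # 10 -- the exponent r from the text
```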

We've found the exponent r that satisfies this equation. But how does this help find the factors of N? We rearrange the equation to bring the one over to the left-hand side, and then, as long as r is even, we can split the left side into two terms, so that one integer times another integer is equal to a multiple of N. This looks similar to p times q equals N. Since we know that p and q are on the right-hand side of this equation, they must also be on the left-hand side, just multiplied by some additional factors.
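Written out, the rearrangement used above is (with m some integer):

```latex
g^{r} \equiv 1 \pmod{N}
\;\Longrightarrow\; g^{r} - 1 = mN
\;\Longrightarrow\; \left(g^{r/2} - 1\right)\left(g^{r/2} + 1\right) = mN
```

valid whenever r is even, so each of p and q must divide one of the two factors on the left.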

Since r was 10, the two terms on the left-hand side are eight to the power of five plus one,
32,769 and eight to the power of five minus one, 32,767. These two numbers probably share
factors with N.

To find them we use Euclid's algorithm. If we want to find the greatest common divisor of
two numbers, take 32,769 and 77, divide the bigger number by the smaller one and record the
remainder. In this case, 32,769 divided by 77 gives a remainder of 44. Then shift the numbers
one position left and repeat. Now we divide 77 by 44 and we get a remainder of 33. Repeat
the process again. 44 divided by 33 gives a remainder of 11 and again 33 divided by 11
equals three remainder zero. When the remainder is zero, the divisor is the greatest common
factor between the two numbers you started with. In this case, it's 11, which is indeed a factor
of 77 and 32,769. We could do the same procedure with the other number or just divide 77 by
11 to get seven, its other prime factor[1].
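The division chain above is exactly Euclid's algorithm; a direct transcription:

```python
def gcd(a, b):
    """Greatest common divisor via Euclid's algorithm."""
    while b != 0:
        a, b = b, a % b   # divide, keep the remainder, shift left
    return a

# Chain from the text: 32769 % 77 = 44, 77 % 44 = 33, 44 % 33 = 11, 33 % 11 = 0
print(gcd(32769, 77))  # 11 -- one prime factor of 77
print(77 // 11)        # 7  -- the other prime factor
```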

To recapitulate, if we want to find the prime factors p and q of a number N, first, make a bad
guess, g, second, find out how many times r we have to multiply g by itself to reach one more
than a multiple of N. Third, use that exponent to calculate two new numbers that probably do
share factors with N. And finally use Euclid's algorithm to find the shared factors between
those numbers and N, which should give you p and q.
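The four steps can be strung together into a short classical sketch. It is slow in general, because step two is brute force (that is the part a quantum computer accelerates), but it works fine for N = 77:

```python
from math import gcd

def factor(N, g):
    """Classical sketch of the reduction: order finding -> factors of N."""
    # Step 2: find the order r of g modulo N by brute force -- this is
    # the step a quantum computer speeds up.
    r = 1
    while pow(g, r, N) != 1:
        r += 1
    if r % 2 != 0:
        return None  # need an even exponent; try another guess g
    # Step 3: two numbers that probably share factors with N.
    a, b = pow(g, r // 2) - 1, pow(g, r // 2) + 1
    # Step 4: Euclid's algorithm digs out the shared factors.
    return gcd(a, N), gcd(b, N)

print(factor(77, 8))  # (7, 11)
```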

Now, we don't need a quantum computer to run any of these steps, but on a classical computer this method wouldn't be any faster than other methods. The key process that a quantum computer speeds up is step two: finding the exponent r we raise g to in order to reach one more than a multiple of N. To see why, let's go back to our example, where eight to the power of 10 is one more than a multiple of 77. If we keep going past eight to the power of 10, to 8 to the 11, 8 to the 12, and so on, we get remainders of 8, 64, 50, 15, 43, 36, 57, 71, 29, and again one. The exponent that yields a remainder of one is 20, which is 10 more than the first exponent that yielded a remainder of one.

So we know that 8^30 and 8^40, indeed 8 raised to any power divisible by 10, will also be one more than a multiple of 77. And if we pick any remainder, for example 15, the next time we find that same remainder, the exponent will have increased by 10. So we can find the exponent r that gets us to one more than a multiple of N by looking at the spacing of any remainder, not just one.
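This spacing claim can be checked directly for the remainder 15:

```python
# Exponents e with 8**e mod 77 == 15; successive hits are exactly 10 apart.
N, g = 77, 8
hits = [e for e in range(1, 60) if pow(g, e, N) == 15]
gaps = [b - a for a, b in zip(hits, hits[1:])]
print(hits)  # [4, 14, 24, 34, 44, 54]
print(gaps)  # [10, 10, 10, 10, 10]
```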

Since this is a repeating pattern, we can also go back to the beginning: any number raised to the power of zero is one, so one is actually the first remainder, and it must appear again when the cycle starts over. Now we are ready to use a quantum computer to factor any large product of two primes.

First we split up the qubits into two sets. The first set we prepare in a superposition of zero and one and two and three and four and five and six and seven and eight and nine, all the way up to 10^1234. This is a huge superposition, but if we had perfect qubits, it would require only around 4,100 of them. The other set contains a similar number of qubits, all left in the zero state for now. Now we make our guess g, which most likely doesn't share factors with N. We raise g to the power of the first set of qubits, then divide by N and store the remainder in the second set of qubits, leaving the first set of qubits as it was. We now have a superposition of all the numbers we started with and the remainder of raising g to the power of those numbers divided by N. Through this operation we have entangled our two sets of qubits, but we can't simply measure this superposition.

If we don't measure the entire superposition, but only the remainder part, we will obtain a random remainder. But this remainder doesn't occur just once; it occurs multiple times, every time it comes up in the cycle. Imagine we were doing this with the example from before, with N equals 77 and g equals eight. If the remainder we measured was, say, 15, then there would be multiple terms in our superposition, because there are multiple exponents we can raise g to that give this same remainder: exponents 4, 14, 24, 34, and so on. They are each separated by 10, and that value is the exponent that satisfies our equation. So more generally, after measuring the remainder, we will be left with a superposition of states that all share the same remainder, and the exponents will all be separated by the same amount r. This is the number we are looking for.

Since the remainder is now the same for all states, we can set it aside, and we now have a superposition that is periodic: each term is separated from its neighbours by an amount r. If we now apply the quantum Fourier transform to this superposition of states, we will be left with states containing one over r.

So all that's left to do now is perform a measurement and find r by inverting it, and that's it for the quantum part. As long as r turns out to be even, we can use it to turn our bad guess g into two numbers that likely share factors with N. And as long as these terms are not themselves a multiple of N, we can use Euclid's algorithm to find the factors of N and break the encryption.

1.2 MOTIVATION
The computational strength of a quantum processor doesn't merely grow linearly with the addition of qubits: its state space doubles with every single qubit introduced, so its power increases exponentially. This makes quantum processors exceptionally efficient at addressing certain cryptographic challenges. Exploiting this capability requires several thousand flawless qubits. However, current qubits are imperfect, necessitating many extra qubits to hold redundant error-correction information.

In 2012, the assessment indicated a necessity for a billion physical qubits to compromise
RSA encryption. Within five years, this estimate diminished to 230 million, and further
advancements in 2019 lowered the figure to a mere 20 million physical qubits. Despite the
progress observed in IBM's quantum computers, the present qubit count remains far below
this requirement. Nonetheless, the trajectory of advancement suggests exponential growth,
prompting speculation about when the convergence of these curves will occur, rendering
existing public key encryption vulnerable.

Acknowledging the impending threat, scientists have actively sought novel methods of
encrypting data resilient to both conventional and quantum computer attacks. In 2016, the
National Institute of Standards and Technology (NIST) initiated a competition to identify
encryption algorithms impervious to quantum computers. The global cryptographic
community submitted 82 diverse proposals, subjected to rigorous testing that led to the
identification of four algorithms on July 5th, 2022, subsequently integrated into NIST's
post-quantum cryptographic standard.

Figure 1

1.3 HARVEST NOW, DECRYPT LATER

The concept of "harvest now, decrypt later" takes on renewed significance as quantum computing threatens traditional cryptographic methods. This theory centers on concerns that a nation-state will gain access to currently encrypted data and then decrypt it at a later time using a quantum computer. It embodies the proactive collection of data or information with the anticipation of decrypting or processing it in the future: long-term storage of encrypted data that is currently unreadable, in the hope that future advances in decryption technology will make it readable. Although quantum computers are still in the research and development stage, they threaten encrypted data now. As the landscape of cryptography evolves, countering this threat requires a strategic combination of careful data stewardship and encryption methodologies resilient to quantum attacks, thereby ensuring the integrity and confidentiality of information. Failure to transition before sufficiently powerful quantum computers are realized will jeopardize the security of public key cryptosystems, which are widely deployed within communication protocols, digital signing mechanisms, authentication frameworks, and more.

Within this overarching exploration, the report contemplates the implications of "Harvest
Now, Decrypt Later" on the NIST-selected PQC algorithms. While these algorithms exhibit
resilience against current cryptographic threats, their susceptibility to harvesting attacks casts
a shadow on the long-term security of encrypted data. This nuanced perspective adds a layer
of complexity to the ongoing discourse surrounding the future of cryptographic standards.

The report meticulously analyses the strategies and countermeasures that can be employed to
mitigate the risks posed by "Harvest Now, Decrypt Later." It scrutinizes the cryptographic
assumptions and mechanisms underpinning NIST-selected candidates, shedding light on their
vulnerability to potential future decryption methods. This foresight-driven analysis serves as
a crucial lens through which the robustness of PQC algorithms can be evaluated not only
against the imminent quantum threat but also within the broader context of strategic data
harvesting.

As we navigate the delicate balance between technological advancement and cryptographic resilience, addressing the "Harvest Now, Decrypt Later" challenge emerges as a pivotal point.
The report not only identifies the inherent risks but also advocates for a holistic approach to
fortify cryptographic systems against the relentless march of computational progress. In
doing so, it contributes to the ongoing discourse on cryptographic standards, providing a
forward-looking perspective that considers the dynamic interplay between data security,
computational power, and the ever-evolving cryptographic landscape.

1.4 ORGANISATION OF REPORT


Chapter 1 of this report provides the background of post-quantum cryptographic algorithms in various applications. It is followed by the motivation behind the project and why there's a need to switch from existing encryption algorithms to new quantum-resistant algorithms. The last section describes the concept of "Harvest Now, Decrypt Later", which poses a major threat to current data even before efficient quantum computers are in production.

Chapter 2 of this report covers related work in the field described above. With the arrival of Shor's algorithm and Grover's algorithm, institutions started recognizing the threat posed to existing data by quantum computers. The next section discusses migration management from quantum-unsafe algorithms to quantum-safe algorithms. This chapter also covers other significant developments, such as the PQXDH key agreement protocol by Signal.

Chapter 3 of this report deals with the literature review in the field of post-quantum cryptography and the initiative by NIST to standardise post-quantum cryptographic algorithms, addressing concerns about the potential future threat quantum computers might pose to current cryptographic systems. The following section studies the standardisation parameters considered by NIST in the three rounds conducted. It is followed by the four algorithms qualified by NIST, the ideas behind them, and their security.

Chapter 4 of this report discusses the concerns and limitations of the algorithms proposed or standardised by NIST. Multiple threats to the algorithms are discussed, including plaintext attacks. These threats can render the algorithms unusable in real systems if security is compromised to a large extent.

Chapter 5 summarises and concludes the survey and the work done so far. The first part summarises the algorithms and their characteristics, along with the standardisation process. The section on future steps discusses potential developments and the scope of this field of quantum studies, including deeper security assessment and spreading awareness about the implementation of post-quantum algorithms across global institutions.

CHAPTER 2
RELATED WORK

2.1 THREAT TO CLASSIC ENCRYPTION ALGORITHM


2.1.1 Shor’s Algorithm
The goal of encryption is to garble data in such a way that no one who has the data can read it unless they’re the intended recipient. The encryption of pretty much all private information sent over the internet relies immensely on one numerical phenomenon: with existing resources, it is extremely hard to take a big number that is the product of two primes and find its factors on a normal, non-quantum computer. Finding the prime numbers that multiply together to give you an arbitrary, big, non-prime number appears to be slow, unlike multiplication, which is very fast. On a quantum computer, however, factoring need not be slow, thanks to something called “Shor’s Algorithm.” Shor’s algorithm uses quantum superposition and interference. RSA encrypts data using two large prime numbers, in such a way that decrypting the data requires knowing the factors of their product. Quantum computation therefore has the potential to make it easy to access encrypted data. Shor’s algorithm allows us to find the prime factors of a large number: if we have a number N = pq, where p and q are both prime, Shor’s algorithm allows us to find p and q. A variant of Shor’s algorithm uses 2n+3 qubits, where n is the bit length of N. At its core, Shor’s algorithm evaluates a periodic function.
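The 2n+3 qubit count ties in with the figure of roughly 4,100 perfect qubits quoted in Chapter 1; for a typical 2048-bit RSA modulus:

```python
# Qubit count for a 2n+3 variant of Shor's algorithm,
# where n is the bit length of the modulus N.
n = 2048                  # typical RSA modulus size in bits
qubits_needed = 2 * n + 3
print(qubits_needed)      # 4099 -- roughly the 4,100 perfect qubits cited earlier
```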

2.1.2 Grover’s Algorithm
Grover's algorithm, developed by Lov Grover in 1996, is a quantum algorithm designed to search an unsorted database quadratically faster than classical algorithms. It provides this quadratic speedup by leveraging quantum parallelism and amplitude amplification. In a classical scenario, searching an unsorted database of N items requires O(N) operations. Grover's algorithm, however, achieves this in approximately √N quantum operations. It employs a quantum oracle to mark the solution states and an iterative quantum diffusion operator to amplify the amplitude of the marked states. The algorithm begins with a superposition of all possible states. The quantum oracle then flips the sign of the target states, effectively marking them. Subsequently, the diffusion operator redistributes amplitude across all states, with increased emphasis on the marked ones. Repeating these oracle and diffusion steps about √N times allows Grover's algorithm to converge towards the solution states. Grover's algorithm has applications in database searching, optimization, and cryptographic protocols. It represents a fundamental quantum algorithm showcasing the potential advantages of quantum computing over classical counterparts in certain computational tasks.
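The oracle-plus-diffusion loop described above can be simulated classically for small N. Below is a toy statevector sketch in plain Python; it is not a real quantum implementation, and N = 16 with one marked item are arbitrary illustrative choices:

```python
import math

# Toy statevector simulation of Grover's algorithm on N = 16 items.
# The oracle flips the sign of the marked state's amplitude; the
# diffusion operator reflects every amplitude about the mean.
N, target = 16, 11
amps = [1 / math.sqrt(N)] * N                   # uniform superposition
iterations = round(math.pi / 4 * math.sqrt(N))  # about sqrt(N) iterations

for _ in range(iterations):
    amps[target] = -amps[target]         # oracle: mark the solution
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]  # diffusion: inversion about the mean

print(iterations)                   # 3 iterations for N = 16
print(round(amps[target] ** 2, 3))  # ~0.961 probability of measuring the target
```

After only three iterations the marked item is found with about 96% probability, versus an average of N/2 = 8 classical lookups.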

2.2 SWITCHING TO POST QUANTUM CRYPTOGRAPHY


One crucial aspect of the transition involves the development and adoption of new cryptographic algorithms resilient to quantum attacks. The migration process has to ensure that all relevant systems that need a migration to PQC actually get migrated. It is crucial to start switching to quantum-resistant cryptography now, considering far-horizon projects which will be built today and remain in use for a significant number of years; it will not be feasible to switch them to better algorithms once they are deployed. Critical national infrastructure projects are examples.

While migrating an application, connected applications need to stay able to communicate with the migrated application. Interoperability needs to be ensured for the organisation as a whole and must not interrupt the operation of the organisation’s business processes. The IETF, which is responsible for the TLS protocol, has several ongoing initiatives to integrate post-quantum primitives into different protocols. In TLS and Internet Key Exchange, it is proposed to combine post-quantum schemes with RSA- and ECC-based encryption standards. The algorithms proposed and studied by NIST have a promising future and scope for growth.

The following table, Table 2.1, describes the impact of Shor’s algorithm and Grover’s algorithm on existing classical encryption techniques; the impact is given as a relative idea rather than in absolute terms. Table 2.1 summarises the security concern for various cryptographic algorithms such as RSA and ECC.

Name    Purpose                         Pre-quantum       Post-quantum      Impact from large-scale
                                        security level    security level    quantum computer
AES     Encryption                      128               64 (Grover)       Larger key sizes needed
RSA     Signatures, key establishment   128               Broken (Shor)     No longer secure
DSA     Signatures, key exchange        128               Broken (Shor)     No longer secure
ECDH    Key exchange                    128               Broken (Shor)     No longer secure

Table 2.1: Impact of Grover’s and Shor’s algorithms on classical cryptography algorithms

2.3 THE PQXDH KEY AGREEMENT PROTOCOL

PQXDH is a Post-Quantum Extended Diffie-Hellman key exchange algorithm developed by Signal, in which the existing ECC (Elliptic Curve Cryptography) scheme is augmented with CRYSTALS-Kyber, a NIST post-quantum finalist that claims to offer security levels roughly equivalent to AES, making Signal one of the most secure end-to-end encrypted messaging applications. It is designed for asynchronous communication, where the receiver may be offline when the sender initiates: the receiver publishes key information to a server in advance, and the sender uses that published information to send encrypted data, which in turn also establishes a shared secret key for further communications.

In some editions of this system, the server’s role may be shared by multiple entities. Each party, i.e. sender and receiver, has a long-term identity in the form of an elliptic curve public key. The signed prekeys are changed periodically and signed anew for each instance in which they are to be used. For every run of the protocol, a fresh key pair, called the ephemeral key pair, is also generated and its public key used. The protocol can be summarised in three steps:

1. The receiver establishes its prekeys and elliptic curve identity key and publishes them to the server.
2. After the keys are published, the sender fetches the prekey bundle from the server and uses it to send an initial message to the receiver.
3. The initial message is received and processed.
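The hybrid idea at the heart of PQXDH, deriving the session key from both the elliptic-curve DH outputs and a post-quantum KEM shared secret so that an attacker must break both, can be sketched as follows. This is an illustrative sketch only, not Signal's actual implementation: the variable names and the simplified HKDF-style KDF are stand-ins, and random bytes take the place of real X25519 and Kyber outputs:

```python
import hashlib
import hmac
import os

# Stand-ins for the real protocol outputs (hypothetical values):
dh1 = os.urandom(32)     # e.g. DH(sender identity key, receiver signed prekey)
dh2 = os.urandom(32)     # e.g. DH(sender ephemeral key, receiver identity key)
dh3 = os.urandom(32)     # e.g. DH(sender ephemeral key, receiver signed prekey)
kem_ss = os.urandom(32)  # e.g. Kyber shared secret encapsulated to the receiver

def kdf(material: bytes, info: bytes) -> bytes:
    """Simplified HKDF-like extract-then-expand using HMAC-SHA256."""
    prk = hmac.new(b"\x00" * 32, material, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()

# The session key depends on ALL inputs: breaking ECC alone (e.g. with a
# future quantum computer) does not reveal it while the KEM secret holds.
session_key = kdf(dh1 + dh2 + dh3 + kem_ss, b"PQXDH-sketch")
print(len(session_key))  # 32-byte shared secret
```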

This algorithm has been proven to provide authentication and secrecy in the symbolic model, under precisely enumerated conditions. Authentication is also considered before the exchange of messages through the use of public keys: the parties’ public key fingerprints may be checked manually, or even by scanning a QR code.

CHAPTER 3

LITERATURE REVIEW

3.1 NIST STANDARDISATION


NIST is the U.S. government agency that has recognised the impact of the exponential growth of quantum computers and the qubits associated with them. The institution is working to stay a step ahead of this growth and to protect data from any kind of quantum attack possible in the future. It is working on the standardisation of one or more public key cryptographic algorithms through a public competition process. The new systems are required to satisfy the existing DSS and FIPS 186-4. The competition started in November 2017, with 82 algorithms submitted in total. Recently NIST announced that this six-year-long competition has resulted in the selection of four post-quantum algorithms that can potentially withstand the assault of a future quantum computer.

3.1.1 First Round

The organisation identified and worked on three broad aspects of the evaluation criteria for comparison and selection of candidate algorithms through the first round.

Security

As with classical algorithms, security was considered the most important factor in any evaluation. The standardised schemes have to be usable in a variety of protocols such as Transport Layer Security (TLS), Secure Shell (SSH), Internet Key Exchange (IKE), Internet Protocol Security (IPsec), and Domain Name System Security Extensions (DNSSEC). A preliminary classification was provided as stated in FRN-Dec16, with a focus on meeting the initial categories.

Cost and Performance

Cost, including computational efficiency and memory requirements, was the second most important factor recognised by the institution. This mainly comprises the size of public keys and ciphertexts and the probability of decryption failures. Cost and performance also consider the speed of a proposed algorithm. Memory requirements refer both to code size and random access memory (RAM); these are important aspects of software implementation.

Algorithm and Implementation Characteristic

Numerous candidate algorithms with novel and intriguing designs, as well as special characteristics not seen in the current NIST-standardised public-key algorithms, were submitted to the NIST PQC Standardization Process. More adaptable candidate algorithms were chosen over less adaptable ones. This covers parallelism and instruction set extensions as means of achieving faster performance, as well as algorithms that may operate effectively across a broad range of platforms. Furthermore, designs that are straightforward and elegant are preferred, since they demonstrate the design team's deeper understanding and confidence and promote additional research.

3.1.2 Second Round

Security
Diversity in computational hardness assumptions is also one of NIST's top long-term security objectives for its standards. By standardising practically efficient techniques from many cryptosystem families, NIST seeks to lessen the likelihood that a single cryptanalytic discovery will leave the world without a workable standard for digital signatures or key establishment. According to NIST, this approach best balances the requirement that all standards undergo extensive testing prior to publication against the demand for diversity.

Cost and Performance


2
During the second round of the NIST PQC Standardization Process, more information about
the computational efficiency of the algorithms became available. Faster, constant-time
implementations on Intel x64 processors were provided for many of the algorithms, as were
ARM Cortex-M4 and hardware implementations. These new implementations provided better
information not only about the performance of the different algorithms but also about the
resources required by implementations (RAM or gate counts). NIST hoped to see more and
better data for performance in the third round. This performance data will hopefully include
implementations that protect against side-channel attacks, such as timing attacks, power
monitoring attacks, fault attacks, etc. For general-purpose use, the cost of transferring the
public key was considered in addition to the signature or ciphertext during each transaction.

Algorithm and Implementation Characteristics


Most existing implementations do not provide protection against other types of side-channel
attacks, like power analysis. For further rounds, NIST expected to collect more information
about implementations of the algorithms that can resist such attacks. NIST also examined the
potential impact on performance: some algorithms would incur major penalties under such
countermeasures, which filtered out even more candidates.

3.1.3 Third Round

Security

Throughout the competition, security has been the most important criterion used by NIST in
evaluating any algorithm. NIST public-key standards have been used in many internet
protocols, like TLS, SSH, and IKE, as well as for certificates, software code signing, and
secure boot-loaders. In this round, NIST gave three possible definitions for security: two
for encryption and one for signatures. It asked for "semantically secure" schemes under
adaptive chosen-ciphertext attack. NIST's security strengths are defined in a way that
leaves open the relative cost of various computational resources, including quantum gates.

Even if one has agreed upon a model, or a range of models, for evaluating the relative cost
of various computational resources, there may still be uncertainty about how much of a given
resource an attack actually requires. Progress was also made in clarifying some outstanding
security questions during the third round. In lattice-based cryptography, methods were
developed to replace the asymptotic security estimates represented by the core-SVP
methodology with concrete security estimates, expressed as a gate count, that can be more
directly compared with security estimates for the non-lattice candidates.

Cost and Performance

Cost was identified as the second most important evaluation criterion. By the third round,
more information about computational efficiency had become available, and faster,
constant-time implementations were provided for many of the algorithms. For general-purpose
use, the evaluation of overall performance considered the cost of transferring the public
key in addition to the signature or ciphertext during each transaction. Key generation was
also taken into account.

Algorithm and Implementation Characteristics

Earlier in the standards cycle, NIST requested side-channel studies from the community. The
community responded during the third round (and earlier) with a plethora of papers and other
technical works that examined side-channel attacks against the candidates, as well as
strategies for protecting implementations from such attacks. Identifying any algorithmic
features that would help (or hurt) the future deployment of side-channel-resistant
implementations of a proposed algorithm had been one of the primary goals.

3.2 SIKE
SIKE, i.e. Supersingular Isogeny Key Encapsulation, is a public-key cryptosystem based on
the mathematical theory of elliptic curve isogenies. It is one of the 17 second-round
candidates for public-key encryption submitted to the post-quantum cryptography
standardisation process initiated by the U.S. National Institute of Standards and
Technology (NIST).

SIKE is designed as a public-key cryptosystem primarily used for key exchange protocols.
One of the primary motivations for developing SIKE is its resistance to attacks by quantum
computers. Since quantum computers will break currently deployed elliptic curve
cryptosystems, SIKE instead uses pseudo-random walks on supersingular isogeny graphs of
curves, which are resistant to known quantum attacks.

The underlying mathematical problems in SIKE are believed to remain hard even in the
presence of quantum computers, making it a candidate for post-quantum cryptography. The
security of SIKE relies on the difficulty of the Supersingular Isogeny Problem (SIP), where
the challenge is to compute an isogeny between two supersingular elliptic curves.

SIKE falls into the category of isogeny-based cryptography, a class of cryptographic schemes
that leverage the hardness of computing isogenies between elliptic curves for security.
Isogeny-based cryptography is considered a promising approach for achieving post-quantum
security.

SIKE has the smallest public key size of all the candidates: less than 750 bytes even for
its Level 5 security parameter set. It can leverage existing optimised code for elliptic
curves, which can easily be combined with traditional elliptic curve cryptography to create
a hybrid classical/post-quantum scheme.

3.3 MCELIECE
The McEliece cryptosystem is a public-key cryptosystem based on error-correcting codes. The
public key specifies a random binary Goppa code. A ciphertext is a codeword plus random
errors. The private key allows efficient decoding: extracting the codeword from the
ciphertext, identifying and removing the errors. It is an IND-CCA2 (indistinguishability
under adaptive chosen-ciphertext attack) key encapsulation mechanism. A key encapsulation
mechanism can be viewed as a key-exchange protocol in which only a single message is
transmitted.

The McEliece system was designed to be one-way (OW-CPA), meaning that an attacker cannot
efficiently find the codeword from a ciphertext and public key when the codeword is chosen
randomly. The KEM's public key determines a random binary Goppa code, and a ciphertext is
generated by adding an error to a codeword. Decapsulation is done by decoding.

Security is based on the hardness of decoding a general linear code, and on the assumption
that a random binary Goppa code is indistinguishable from a random linear code. The McEliece
cryptosystem has very short ciphertexts, on the order of 200 bytes, and good performance for
encapsulation and decapsulation. Nevertheless, it has never gained much acceptance in the
cryptographic community, mainly due to its large public key sizes.

3.4 BIKE

BIKE (Bit Flipping Key Encapsulation) is a Key Encapsulation Mechanism (KEM) founded on
Quasi-Cyclic Moderate Density Parity-Check (QC-MDPC) codes, put forth as part of the
National Institute of Standards and Technology's (NIST) Post-Quantum Cryptography (PQC)
Standardization initiative. The BIKE framework encompasses a single version distinguished by
three parameter sets (r, w, t) designed for Security Levels 1, 3, and 5. Additional
parameters are associated with the specific BGF (Black-Gray-Flip) decoder aligned with the
proposal.
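The "bit flipping" in BIKE's name refers to its decoding strategy: count, for each bit, how many parity checks it violates, and flip the most suspicious bit until the syndrome vanishes. Below is a minimal greedy sketch of that idea on a hypothetical toy parity-check matrix; real BIKE uses huge sparse circulant blocks and the more refined BGF decoder.

```python
import numpy as np

# Small illustrative parity-check matrix (not a BIKE parameter set).
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
], dtype=int)

def bit_flip_decode(word, max_iters=10):
    """Greedy bit-flipping: flip the most 'suspicious' bit each iteration."""
    word = word.copy()
    for _ in range(max_iters):
        syndrome = H @ word % 2
        if not syndrome.any():
            break                  # every parity check is satisfied
        upc = syndrome @ H         # upc[i] = unsatisfied checks touching bit i
        word[np.argmax(upc)] ^= 1  # flip the bit violating the most checks
    return word

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # satisfies H @ c = 0 (mod 2)
noisy = codeword.copy()
noisy[3] ^= 1                                 # introduce one bit error
assert np.array_equal(bit_flip_decode(noisy), codeword)
```

This toy decoder only handles the simple error shown; in BIKE the flipping thresholds and iteration schedule are tuned so that the decoding failure rate is low enough for the claimed security level.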

Primarily tailored for deployment in synchronous communication protocols such as TLS, BIKE
operates with ephemeral keys, generating a new public/private key pair for each key exchange
session. This model ensures forward secrecy, allowing decapsulation with a given private key
only once. Adequate for this usage, BIKE adheres to IND-CPA security, denoting
"indistinguishability under chosen-plaintext attack," a cornerstone in public-key encryption
schemes ensuring an adversary cannot discern between the encryptions of two distinct
plaintexts.

However, diverging from this usage model to scenarios involving key reuse or adaptation for
asynchronous protocols like email necessitates securing long-term static keys. While
feasible, these models sacrifice forward secrecy and necessitate IND-CCA security, where
"indistinguishability under chosen-ciphertext attack" becomes crucial. IND-CCA, akin to its
IND-CPA counterpart, measures the security of an encryption scheme but focuses on an
adversary's ability to distinguish between different ciphertexts rather than plaintexts.
This heightened security notion addresses an adversary's capability to interact with a
decryption oracle, obtaining insights into decrypted ciphertexts.

It's imperative to note that implementations deviating from BIKE's current specification are
not in compliance with the established standards.

3.5 HQC

Hamming Quasi-Cyclic (HQC) is a code-based public-key encryption scheme based on the
hardness of the decisional QCSD (Quasi-Cyclic Syndrome Decoding) problem with parity,
targeting IND-CCA2 security. It uses a construction similar to RLWE-based PKE schemes,
substituting shortness in the Hamming metric for shortness in the Euclidean metric, combined
with a public error-correcting code.
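The quasi-cyclic structure is what keeps HQC's keys compact: a circulant block over GF(2) is fully described by its first row, and multiplying by it amounts to polynomial multiplication modulo x^r − 1. A small illustrative sketch (the function name and tiny block size are ours):

```python
def qc_mul(a, b):
    """Multiply two elements of GF(2)[x]/(x^r - 1), given as bit lists of
    length r. This equals multiplication by the r x r circulant matrix whose
    first row is b, so one length-r vector stands in for a whole block."""
    r = len(a)
    out = [0] * r
    for i, ai in enumerate(a):
        if ai:                          # XOR in b cyclically shifted by i
            for j, bj in enumerate(b):
                out[(i + j) % r] ^= bj
    return out

# (1 + x^2) * x = x + x^3 over GF(2), modulo x^5 - 1
assert qc_mul([1, 0, 1, 0, 0], [0, 1, 0, 0, 0]) == [0, 1, 0, 1, 0]
```

Production implementations use fast carry-less multiplication for this step, but the algebra is the same.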

Contrary to most existing code-based cryptosystems, such as Classic McEliece, the security
of HQC does not rely, even in part, on hiding the structure of an error-correcting code. In
the HQC cryptosystem, the error-correcting code that is being used is public, and the
security relies on variants of the syndrome decoding problem, which is NP-hard.
HQC asserts a compelling case for its decryption failure rate being sufficiently low to
achieve chosen-ciphertext security. Currently, it stands as the most robust argument for CCA
security among the second-round candidate code-based cryptosystems in which the primary
attack avenue is information-set decoding, affecting both private key recovery and message
recovery (BIKE, HQC, and LEDAcrypt). However, it pays a significant penalty in key and
ciphertext size in comparison to the others (although it still compares very favorably in
key size and overall communication bandwidth to the candidate code-based cryptosystems based
on Goppa codes).

Possible areas for further analysis related to HQC include investigating the relation between
the search and decisional variants of the QCSD problem, and investigating the effect, if any,
of the quasi-cyclic code structure on security.

3.6 CRYSTALS-KYBER


Kyber is a family of key encapsulation mechanisms offering chosen-ciphertext (i.e.,
IND-CCA) security based on the presumed post-quantum hardness of the Module Learning with
Errors (MLWE) problem. Kyber contains a standard Learning with Errors (LWE)-style CPA-secure
public-key encryption scheme, where the underlying algebraic object is a module over a
power-of-2 cyclotomic ring. This choice of parameters enables very efficient computations
using the Number Theoretic Transform (NTT). The noise is sampled according to a centered
binomial distribution. CCA security is achieved via a well-known variant of the
Fujisaki-Okamoto transform, where the session key is transported using an encryption-based
approach.
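The LWE-style encrypt/decrypt flow that Kyber builds on can be sketched in toy form. Everything below (parameters, names, the single-bit plaintext) is our own illustrative choice and offers no security; real Kyber works over module lattices with NTT-friendly polynomial rings, centered-binomial noise, ciphertext compression, and the FO transform on top.

```python
import random

rng = random.Random(0)
n, q = 4, 19            # toy dimension and modulus, far too small to be secure

def small():
    """Secret/noise vector with coefficients in {-1, 0, 1}."""
    return [rng.choice([-1, 0, 1]) for _ in range(n)]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y)) % q

def keygen():
    A = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]
    s, e = small(), small()
    b = [(dot(row, s) + ei) % q for row, ei in zip(A, e)]   # b = A s + e
    return (A, b), s

def encrypt(pk, m):
    """Encrypt one bit m by hiding it in the high half of the modulus."""
    A, b = pk
    r = small()
    u = [dot([A[i][j] for i in range(n)], r) for j in range(n)]  # u = A^T r
    v = (dot(b, r) + m * (q // 2)) % q                           # v = b.r + m*floor(q/2)
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - dot(s, u)) % q     # = e.r + m*floor(q/2), up to small noise
    return 1 if q // 4 < d < 3 * q // 4 else 0

pk, s = keygen()
assert decrypt(s, encrypt(pk, 0)) == 0
assert decrypt(s, encrypt(pk, 1)) == 1
```

Decryption works because the leftover noise e·r is bounded by n = 4 < q/4, so the decision "is d closer to 0 or to ⌊q/2⌋?" always recovers the bit; Kyber's parameter choices bound the analogous noise with overwhelming probability.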

It is straightforward to adjust Kyber's security strength level. One adjustment simply
varies the rank of the underlying module (in the range {2, 3, 4}) and adjusts the noise
distribution parameter (in the range {5, 4, 3}, respectively). Kyber's performance is among
the most competitive of the key-exchange proposals.

The security of Kyber relies on a variant of a well-studied problem. The submission offers a
tight security proof in the random oracle model (ROM) and a non-tight security proof in the
quantum random oracle model (QROM), both based on the MLWE assumption. We note that
a potential issue is that the security proof does not directly apply to Kyber itself, but rather to
a modified version of the scheme which does not compress the public key. Without this
modification, Kyber may in fact be relying on a different (or additional) rounding-like
assumption. If that is the case, this may lead to a cryptanalytic concern, as the known
reductions between MLWE and Module Learning with Rounding (MLWR) may not apply for
the parameters selected by Kyber.

CHAPTER 4

CONCERNS AND LIMITATIONS

4.1 CLASSICAL COMPUTER THREAT


While the selected algorithms are designed to resist quantum attacks, their security against
attacks by classical computers has not been exhaustively established. Researchers are still
analysing the security of these algorithms against various classical attacks and other forms
of intrusion. This research is needed to ensure that the selected algorithms are safe for
actual deployment.

4.2 SECURITY VULNERABILITIES


Chosen-Plaintext Attacks: CRYSTALS-Kyber and Saber, despite being IND-CCA
(indistinguishability under adaptive chosen-ciphertext attack), are susceptible to CPA-style
attacks. Hackers gaining access to key information by analysing chosen plaintexts poses a
risk.

4.3 PLAINTEXT-CHECKING ATTACK (PCA):


In some cases, the reuse of secret keys might render the systems vulnerable to PCA,
threatening the security of the algorithms.

4.4 META-PUBLIC KEY ENCRYPTION (PKE) AND ATTACKS:

Studies reveal weaknesses in the meta-PKE model, exposing vulnerabilities in algorithms like
NewHope and LAC. Modifications of these attacks extend to Saber and CRYSTALS-Kyber.

4.5 SECURITY ASSURANCE AND VALIDATION:


The need for extensive and continuous testing, including robustness assessments against
various attack vectors, is crucial to validate the claimed security guarantees of these
algorithms. Security assurance involves robust testing methodologies that scrutinise the
algorithms' resilience against known and potential attacks. It encompasses rigorous
analysis, including cryptanalysis, simulation, and theoretical evaluations, aimed at
uncovering vulnerabilities that adversaries might exploit. This process is paramount in
affirming the algorithms' ability to withstand both classical and quantum attacks.

Validation, on the other hand, focuses on real-world implementations and practical
deployment scenarios of PQC algorithms. It involves a comprehensive assessment of the
algorithms' performance, efficiency, and compatibility across diverse platforms and systems.
Real-world validation confirms the theoretical strengths of PQC algorithms in practical
environments, ensuring their feasibility and effectiveness in addressing contemporary
security challenges.

4.6 FUTURE RESEARCH DIRECTION:


Addressing Identified Vulnerabilities: Further research and development efforts should focus
on addressing the identified vulnerabilities and shortcomings in these algorithms to enhance
their quantum-safe properties.

Standardisation and Certainty: Establishing standardised protocols for testing and
certification will ensure a higher level of certainty regarding the security of PQC
algorithms.

CHAPTER 5

CONCLUSION

5.1 SUMMARY

The report draws an outline of the important role that cryptography plays in securing online
activity: interactions between computers, banking and financial transactions, and
educational records. The algorithms in current use have sustained us for over four decades
but are now threatened by the advent of quantum computers and their increasing power; in
view of this problem, a switch to quantum-resistant algorithms is needed. The existing
algorithms, which rely on discrete logarithm calculations or the factorisation of products
of large primes, will no longer be secure once the qubit counts of quantum computers
increase further.

NIST (National Institute of Standards and Technology) started a competition to standardise
post-quantum cryptography in response to the quantum threat, which resulted in the selection
of four algorithms that can be integrated into real systems in later phases. The report
critically examines the strategic concept of "Harvest Now, Decrypt Later," involving the
proactive collection of encrypted data with the anticipation of decrypting it in the future
using quantum computers. The structure of the report unfolds across five chapters, covering
the introduction of post-quantum cryptographic algorithms, related work on historical
impacts and migration strategies, a literature review on NIST standardisation efforts, an
examination of concerns and limitations of the selected algorithms, and a concluding summary
outlining future steps.

5.2 FUTURE STEPS

Looking at present and future conditions, strong quantum computers are not far away: IBM has
unveiled the first quantum processor with more than 1,000 qubits, the quantum counterpart of
digital bits in classical computers. The security of all existing data under classical
encryption is at stake. National institutions across the world should increase their focus
on the standardisation of post-quantum and quantum-resilient algorithms. The need is to
switch existing RSA and ECC cryptography to lattice-based, hash-based, or code-based
encryption techniques. Many firms have started firm-wide research in the field; for example,
Microsoft's research team is focusing on Post-Quantum Crypto VPN, Post-Quantum TLS, and
Post-Quantum SSH. Signal's PQXDH is another protocol, which relies on discrete logarithm
problems for its authentication process. Research is increasingly focused on the security of
such algorithms, analysing their resistance to quantum as well as classical computers.
Another aspect to be considered is their implementation in various important sectors,
including finance, healthcare, government, and more.

1. Algorithmic Refinement and Standardization:

The selected PQC algorithms represent a baseline for quantum-resistant cryptography. Further
refinement and standardisation efforts are imperative to enhance their robustness,
efficiency, and versatility. Collaborative initiatives among researchers, cryptographers,
and industry experts will play a pivotal role in establishing comprehensive standards and
best practices.

2. Security Assessments and Cryptanalysis:

Continuous evaluation and rigorous security assessments are vital to scrutinize the resilience
of PQC algorithms against emerging threats. Cryptanalysis efforts should persist to identify
vulnerabilities and ensure that PQC algorithms remain impervious to sophisticated attacks,
both classical and quantum.

3. Real-world Implementations and Integration:

Practical deployment and seamless integration of PQC algorithms into existing systems pose
challenges. Future research should focus on optimizing these algorithms for diverse
platforms, addressing interoperability concerns, and mitigating potential side-channel
vulnerabilities.

4. Quantum Computing Advancements:

With ongoing advancements in quantum computing technology, future studies should closely
monitor the development of quantum-resistant solutions. Aligning PQC advancements with
the progress in quantum computing will be instrumental in staying ahead of potential threats.

5. Post-Quantum Cryptography Education and Awareness:

Educating stakeholders, including developers, organisations, and end-users, about the
significance of PQC and the implications of quantum computing on cybersecurity is crucial.
Raising awareness and fostering a deeper understanding of PQC will facilitate its widespread
adoption and implementation.

6. Exploration of Alternative Approaches:

While the current focus revolves around lattice-based, code-based, multivariate, and
hash-based PQC, exploring alternative cryptographic paradigms and innovative approaches may
unveil novel methods for quantum-safe encryption.

7. Continued Collaboration and Research Funding:

Sustained collaboration among academia, industry, and government bodies, coupled with
adequate funding for research, remains pivotal. Encouraging interdisciplinary studies and
fostering innovation will drive the development of more resilient and efficient PQC
algorithms.

8. Ethical Considerations and Policy Formulation:

As PQC becomes a fundamental aspect of cybersecurity, ethical implications, regulatory
frameworks, and policy formulations concerning the use, governance, and global
standardisation of these algorithms warrant comprehensive deliberation.

In conclusion, the future of Post-Quantum Cryptography necessitates ongoing research
endeavours, collaborative engagements, and a proactive approach to address emerging
challenges and fortify digital security in the era of quantum computing. Efforts directed
towards refining, standardising, and advancing PQC will significantly contribute to
establishing a secure digital ecosystem resilient against quantum threats.
