CHAPTER 1
INTRODUCTION
1.1 BACKGROUND
Until the 1970s, the conventional method for sharing private information involved meeting in
person to exchange a secret key, which served the dual purpose of encrypting and decrypting
messages—a practice known as a symmetric key algorithm. The security of the messages
relied on the confidentiality of this shared key.
In 1977, three scientists, Rivest, Shamir, and Adleman, made a significant breakthrough in encryption. This groundbreaking method, now recognized by their initials as RSA, operates as follows.
Every person has two very large prime numbers of their own, which they keep secret. They multiply these numbers together to get an even bigger number, which they make public for everyone to see. If we want to send someone a private message, we use their big public number to garble our message, and we garble it in such a way that it is impossible to ungarble without knowing the two prime factors that made that number. This is an asymmetric key system, since different keys are used to encrypt and decrypt the message: it is easy for our intended recipient to decode, but practically impossible for everyone else, unless they can factor that large public number.
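To make this concrete, the following minimal Python sketch implements textbook RSA with deliberately tiny primes. The primes, exponents, and message are illustrative values only; real deployments use primes hundreds of digits long together with padding schemes.

```python
# Toy key generation: two secret primes and their public product.
p, q = 61, 53
n = p * q                   # public modulus, n = 3233
phi = (p - 1) * (q - 1)     # computable only with knowledge of p and q
e = 17                      # public exponent, coprime to phi
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

# "Garbling" with the public key and "ungarbling" with the private one.
m = 42                      # message, encoded as a number below n
c = pow(m, e, n)            # anyone can do this with (n, e)
assert pow(c, d, n) == m    # only the holder of d (i.e. of p and q) can undo it
```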
Someone could try to factor it on a supercomputer using the best-known factoring algorithm, the General Number Field Sieve, but modern cryptography uses prime numbers that are around 313 digits long. Factoring a product of two primes this big, even with a supercomputer, would take around 16 million years. Not, however, on a quantum computer. In normal computers, a bit can only be in one state at a time, either a zero or a one. So if we had two bits, they could be in one of four possible states: 00, 01, 10 or 11. If we take each of these states to represent a number, 0, 1, 2, or 3, and we want to do a calculation, for example raising seven to the power of one of these numbers, we can only do it for one state at a time, in this case seven squared, and so we get the single answer 49.
Quantum computers consist of qubits, which also have two states, zero and one. But unlike a classical bit, a qubit doesn't have to be in just one state at a time. It can be in an arbitrary combination of those states, a superposition of zero and one. So if we have two qubits, they can exist simultaneously in a superposition of 0, 1, 2, and 3. When we repeat the same calculation, it is performed for all of those numbers at the same time, and we're left with a superposition of the different answers: 1, 7, 49 and 343. Adding another qubit doubles the number of possible states. So with three qubits we can represent eight states and thus perform eight calculations at once. Increase that number to 20 qubits, and we can already represent over a million different states, meaning we can simultaneously compute over a million different answers.
With 300 qubits, we can represent more states than there are particles in the observable universe. This sounds incredibly powerful, and it is, but all of the answers to the computation are embedded in a superposition of states, and we can't read out this superposition directly. When we make a measurement, we obtain only a single value from the superposition, at random, and all the other information is lost. To harness the power of a quantum computer, we need a smart way to convert a superposition of states into one that contains only the information we want. This is an incredibly difficult task, which is why, for most applications, quantum computers are useless.
So far, only a few problems have been identified where we can actually do this, but these are precisely the problems that form the foundation of nearly all the public key cryptography we use today.
In 1994, Peter Shor and Don Coppersmith figured out how to take a quantum Fourier transform. It works just like a normal Fourier transform: apply it to some periodic signal, and it returns the frequencies that are in that signal.
This may not seem particularly interesting, but consider this: if we have a superposition of states that is periodic, that is, the terms in the superposition are separated by some regular amount, we can apply the quantum Fourier transform and will be left with states that contain the frequency of the signal, which we can measure. The quantum Fourier transform thus allows us to extract frequency information from a periodic superposition, and that is going to come in handy.
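As a classical stand-in for this idea, the following Python sketch uses numpy's FFT in place of a quantum Fourier transform: it builds a signal whose spikes repeat with period 10 and recovers multiples of the frequency 1/10 from its spectrum. The signal length and period are arbitrary illustrative choices.

```python
import numpy as np

# Build a signal of length 1000 that is 1 at positions 4, 14, 24, ...
# (period 10) and 0 elsewhere, mimicking the periodic superposition.
N = 1000
signal = np.zeros(N)
signal[4::10] = 1.0

# The FFT (standing in for the quantum Fourier transform) peaks exactly
# at multiples of the frequency 1/10.
spectrum = np.abs(np.fft.fft(signal))
peaks = np.argsort(spectrum)[-10:]   # the ten strongest frequencies
print(np.sort(peaks) / N)            # [0.  0.1 0.2 ... 0.9]
```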
How does a quantum computer factor the product of two primes much faster than a
conventional computer?
Let's say we have a number N which is the product of two primes p and q; let's set N equal to 77. Pick a number g that doesn't share any factors with N. If we multiply g by itself over and over, we will eventually reach one more than a multiple of N. In other words, we can always find some exponent r such that g to the power of r is one more than a multiple of N.
Let's see how this works. Pick any number smaller than 77, for example eight. This number doesn't share factors with 77, and if we were doing this with big primes, it would be extremely unlikely that we just happened to pick a number that shares factors with N.
Multiply eight by itself once, twice, three times, four times, and so on, raising eight to ever higher powers, and then divide each of these numbers by 77. We're not interested in how many times 77 goes into the number, just the remainder, because at some point 77 should divide one of these numbers with a remainder of exactly one. So eight divided by 77 is zero with a remainder of 8; 64 divided by 77 is zero remainder 64; 512 divided by 77 is six remainder 50. And as we keep going, we get remainders of 15, 43, 36, 57, 71, 29, and finally one.
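The same remainder sequence can be reproduced with a few lines of Python. This brute-force loop is only a classical illustration of the step, not the quantum procedure:

```python
# Raise g to successive powers mod N until the remainder is one,
# reproducing the remainder sequence above.
N, g = 77, 8
remainder, r = g % N, 1
while remainder != 1:
    remainder = (remainder * g) % N   # next power of g, reduced mod N
    r += 1
print(r)   # prints 10
```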
We've found the exponent r that satisfies this equation. But how does this help find the factors of N? We rearrange the equation to bring the one over to the left-hand side, giving g^r − 1 as a multiple of N, and then, as long as r is even, we can split the left-hand side into two terms: (g^(r/2) + 1)(g^(r/2) − 1). Now we have one integer times another integer equal to a multiple of N, which looks similar to p times q equals N. Since we know that p and q are on the right-hand side of this equation, they must also be present on the left-hand side, just multiplied by some additional factors.
Since r was 10, the two terms on the left-hand side are eight to the power of five plus one, 32,769, and eight to the power of five minus one, 32,767. These two numbers probably share factors with N.
To find them we use Euclid's algorithm for the greatest common divisor of two numbers. Take 32,769 and 77, divide the bigger number by the smaller one, and record the remainder: 32,769 divided by 77 gives a remainder of 44. Then shift the numbers one position left and repeat: 77 divided by 44 gives a remainder of 33. Repeat the process again: 44 divided by 33 gives a remainder of 11, and again, 33 divided by 11 equals three remainder zero. When the remainder is zero, the divisor is the greatest common divisor of the two numbers you started with. In this case it's 11, which is indeed a factor of both 77 and 32,769. We could run the same procedure with the other number, or just divide 77 by 11 to get seven, its other prime factor [1].
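A small Python version of this procedure, using the numbers from the example:

```python
# Euclid's algorithm exactly as described: divide, keep the remainder,
# shift the numbers one position, repeat until the remainder is zero.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b   # 32769,77 -> 77,44 -> 44,33 -> 33,11 -> 11,0
    return a

p = gcd(32769, 77)   # 11, one prime factor of 77
print(p, 77 // p)    # 11 7
```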
To recapitulate: if we want to find the prime factors p and q of a number N, first make a guess g (almost certainly a "bad" one that shares no factors with N); second, find out how many times r we have to multiply g by itself to reach one more than a multiple of N; third, use that exponent to calculate two new numbers that probably do share factors with N; and finally, use Euclid's algorithm to find the shared factors between those numbers and N, which should give p and q.
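Putting the four steps together, the following Python sketch factors 77 exactly as described. The period-finding step is done by brute force here, which is precisely the part a quantum computer would replace.

```python
from math import gcd

def find_period(g, N):
    # Step two, done by brute force; this is the only step a quantum
    # computer accelerates.
    r, x = 1, g % N
    while x != 1:
        x = (x * g) % N
        r += 1
    return r

def factor(N, g):
    if gcd(g, N) != 1:                  # lucky guess: g shares a factor already
        return gcd(g, N), N // gcd(g, N)
    r = find_period(g, N)               # step two
    if r % 2 != 0:
        return None                     # odd period: retry with another g
    half = pow(g, r // 2, N)            # step three: g^(r/2) mod N
    p, q = gcd(half + 1, N), gcd(half - 1, N)   # step four: Euclid
    if p in (1, N) or q in (1, N):
        return None                     # degenerate split: retry with another g
    return p, q

print(factor(77, 8))   # (11, 7)
```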
Now, we don't need a quantum computer to run any of these steps, but on a classical computer this method wouldn't be any faster than other methods. The key step that a quantum computer speeds up is step two: finding the exponent r to which we raise g to get one more than a multiple of N. To see why, let's go back to our example, where eight to the power of 10 is one more than a multiple of 77. If we keep going past eight to the power of 10, to eight to the 11, eight to the 12, and so on, we get remainders of 8, 64, 50, 15, 43, 36, 57, 71, 29, and again one. The next exponent that yields a remainder of one is 20, which is 10 more than the first exponent that yielded a remainder of one.
So we know that eight to the 30, eight to the 40, indeed eight raised to any power divisible by 10, will also be one more than a multiple of 77. And if we pick any remainder, for example 15, the next time we find that same remainder the exponent will have increased by 10. So we can find the exponent r that gets us to one more than a multiple of N by looking at the spacing of any repeated remainder, not just one.
Since this is a repeating pattern, we can also go back to the beginning: any number raised to the power of zero is one, so one is actually the first remainder, and it must appear again when the cycle restarts. Now we are ready to use a quantum computer to factor any large product of two primes.
First we split the qubits into two sets. The first set we prepare in a superposition of zero and one and two and three and four and five and six and seven and eight and nine, all the way up to 10^1234. This is a huge superposition, but if we had perfect qubits, it would require only around 4,100 of them. The other set contains a similar number of qubits, all left in the zero state for now. Now we make our guess g, which most likely doesn't share factors with N. We raise g to the power of the first set of qubits, divide by N, and store the remainder in the second set of qubits, leaving the first set as it was. Now we have a superposition of all the numbers we started with and, for each, the remainder of raising g to the power of that number and dividing by N. Through this operation we have entangled our two sets of qubits, but we can't simply measure this superposition.
Instead of measuring the entire superposition, we measure only the remainder part, and we will obtain a random remainder. But this remainder doesn't occur just once in the superposition; it occurs multiple times, every time it comes up in the cycle. Imagine we were doing this with the example from before, with N equal to 77 and g equal to eight. If the remainder we measured was, say, 15, then there would be multiple terms in our superposition, because there are multiple exponents you can raise g to that give this same remainder: exponents 4, 14, 24, 34, and so on. They are each separated by 10, and that spacing is the exponent that satisfies our equation. So, more generally, after measuring the remainder we are left with a superposition of states that all share the same remainder, and whose exponents are all separated by the same amount r. This is the number we are looking for.
Since the remainder is now the same for all states, we can set it aside, and we are left with a superposition that is periodic: each term is separated from its neighbours by an amount r. If we now apply the quantum Fourier transform to this superposition of states, we will be left with states containing one over r.
All that's left to do now is perform a measurement and find r by inverting the result, and that's it for the quantum part. As long as r turns out to be even, we can use it to turn our guess g into two numbers that likely share factors with N, and as long as those terms are not themselves multiples of N, we can use Euclid's algorithm to find the factors of N and break the encryption.
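On a real device the measured value is only close to a multiple of 1/r, and r is recovered with a continued-fraction expansion. The following sketch illustrates the idea with a hypothetical measured value near 3/10; if the measured multiple happened to share a factor with r, the run would simply be repeated.

```python
from fractions import Fraction

# Hypothetical measurement near 3/10, i.e. near a multiple of 1/r for r = 10.
measured = 0.300000119
# The period can never exceed N, so limit the denominator to 77.
guess = Fraction(measured).limit_denominator(77)
print(guess, "-> r =", guess.denominator)   # 3/10 -> r = 10
```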
1.2 MOTIVATION
The computational strength of a quantum processor doesn't merely grow linearly with the addition of qubits; the size of the state space it can represent doubles with every single qubit introduced. This exponential scaling is what makes quantum processors exceptionally efficient at addressing certain cryptographic challenges. Realising this capability, however, requires several thousand flawless qubits, and current qubits are imperfect, necessitating many extra physical qubits to provide redundancy for error correction.
In 2012, the assessment indicated a necessity for a billion physical qubits to compromise
RSA encryption. Within five years, this estimate diminished to 230 million, and further
advancements in 2019 lowered the figure to a mere 20 million physical qubits. Despite the
progress observed in IBM's quantum computers, the present qubit count remains far below
this requirement. Nonetheless, the trajectory of advancement suggests exponential growth,
prompting speculation about when the convergence of these curves will occur, rendering
existing public key encryption vulnerable.
Acknowledging the impending threat, scientists have actively sought novel methods of
encrypting data resilient to both conventional and quantum computer attacks. In 2016, the
National Institute of Standards and Technology (NIST) initiated a competition to identify
encryption algorithms impervious to quantum computers. The global cryptographic
community submitted 82 diverse proposals, subjected to rigorous testing that led to the
identification of four algorithms on July 5th, 2022, subsequently integrated into NIST's
post-quantum cryptographic standard.
1.3 HARVEST NOW, DECRYPT LATER
The concept of "harvest now, decrypt later" takes on renewed significance as quantum computing threatens traditional cryptographic methods. The concern is that a nation-state could gain access to currently encrypted data and then decrypt it at a later time using a quantum computer. The approach embodies the proactive collection of data or information with the anticipation of decrypting or processing it in the future: long-term storage of encrypted data that is currently unreadable, in the hope that future advances in decryption technology will make it readable. Although quantum computers are still in the research and development stage, they therefore threaten encrypted data now. As the landscape of cryptography evolves, defending against this approach requires applying encryption methodologies resilient to quantum threats before the data is harvested, thereby ensuring the integrity and confidentiality of information. Failure to transition before sufficiently powerful quantum computers are realized will jeopardize the security of public key cryptosystems, which are widely deployed within communication protocols, digital signing mechanisms, authentication frameworks, and more.
Within this overarching exploration, the report contemplates the implications of "Harvest
Now, Decrypt Later" on the NIST-selected PQC algorithms. While these algorithms exhibit
resilience against current cryptographic threats, their susceptibility to harvesting attacks casts
a shadow on the long-term security of encrypted data. This nuanced perspective adds a layer
of complexity to the ongoing discourse surrounding the future of cryptographic standards.
The report meticulously analyses the strategies and countermeasures that can be employed to
mitigate the risks posed by "Harvest Now, Decrypt Later." It scrutinizes the cryptographic
assumptions and mechanisms underpinning NIST-selected candidates, shedding light on their
vulnerability to potential future decryption methods. This foresight-driven analysis serves as
a crucial lens through which the robustness of PQC algorithms can be evaluated not only
against the imminent quantum threat but also within the broader context of strategic data
harvesting.
Chapter 2 of this report covers related work in the field described above. With the advent of Shor's algorithm and Grover's algorithm, institutions began recognising the threat posed to existing data by quantum computers. The next section discusses migration management from quantum-unsafe algorithms to quantum-safe algorithms. The chapter also covers the development of other significant elements such as the PQXDH key agreement protocol by Signal.
Chapter 3 of this report deals with the literature review in the field of post-quantum cryptography and the initiative by NIST to standardise post-quantum cryptographic algorithms, addressing concerns about the potential future threat quantum computers might pose to current cryptographic systems. The following section studies the standardisation parameters considered by NIST across the three rounds conducted. It is followed by the four algorithms qualified by NIST, the ideas behind them, and their security.
Chapter 4 of this report discusses the concerns and limitations of the algorithms proposed or standardised by NIST. Multiple threats to the algorithms are discussed, including plaintext-checking attacks. These threats can render the algorithms unfit for use in real systems if security is compromised beyond an acceptable limit.
Chapter 5 summarises and concludes the survey and the work done so far. The first part summarises the algorithms and their characteristics along with the standardisation process. The future-steps section discusses potential developments and the scope of this field of quantum studies, including deeper security assessment and spreading awareness about the implementation of post-quantum algorithms across global institutions.
CHAPTER 2
RELATED WORK
2.1.2 Grover’s Algorithm
One crucial aspect of the transition involves the development and adoption of new cryptographic algorithms resilient to quantum attacks; it is crucial to start switching to quantum-resistant schemes. Grover's algorithm, developed by Lov Grover in 1996, is a quantum algorithm designed to search an unsorted database quadratically faster than classical algorithms. It achieves this quadratic speedup by leveraging quantum parallelism and amplitude amplification. In a classical scenario, searching an unsorted database of N items requires O(N) operations; Grover's algorithm achieves it in approximately √N quantum operations. It employs a quantum oracle to mark the solution states and an iterative quantum diffusion operator to amplify the amplitude of the marked states. The algorithm begins with a superposition of all possible states. The quantum oracle then flips the sign of the target states, effectively marking them. Subsequently, the diffusion operator redistributes amplitude across all states, with increased emphasis on the marked ones. Repeating these oracle and diffusion steps on the order of √N times allows Grover's algorithm to converge towards the solution states. Grover's algorithm has applications in database searching, optimisation, and cryptographic protocols, and it represents a fundamental quantum algorithm showcasing the potential advantage of quantum computing over classical counterparts in certain computational tasks.
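The oracle-and-diffusion loop can be simulated directly on a state vector. The following numpy sketch, with an arbitrary toy database size and marked index, shows the probability of the marked state growing to near certainty after about (π/4)·√N iterations.

```python
import numpy as np

n, marked = 6, 37                    # 2^6 = 64 entries; target index 37
N = 2 ** n
state = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all states

for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):   # ~ (pi/4)*sqrt(N) rounds
    state[marked] *= -1               # oracle: flip the sign of the target state
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

probs = state ** 2
print(probs.argmax(), round(probs[marked], 3))   # 37 0.997
```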
While migrating an application, connected applications need to remain able to communicate with the migrated application. Interoperability must be ensured for the organisation as a whole and must not interrupt the operation of the organisation's business processes. The IETF, which maintains the TLS protocol, has several ongoing initiatives to integrate post-quantum primitives into different protocols. In TLS and Internet Key Exchange, it is proposed to combine post-quantum schemes with the existing RSA- and ECC-based standards in hybrid modes. The algorithms proposed and studied by NIST have a promising future and scope for growth.
The following table, Table 2.1, describes the impact of Shor's and Grover's algorithms on existing classical encryption techniques; the impact and the adversary's advantage are given not in absolute terms but as a relative idea. The table represents the security concern for various cryptographic algorithms such as RSA and ECC.
Name | Purpose    | Pre-quantum security level (bits) | Post-quantum security level (bits) | Impact from large-scale quantum computer
AES  | Encryption | 128                               | 64 (Grover)                        | Larger key sizes needed

Table 2.1: Impact of Grover's and Shor's algorithms on classical cryptographic algorithms
In some versions of this system, the server's role may be shared by multiple entities. Each party, i.e. the sender and the receiver, has a long-term identity in the form of an elliptic-curve public key. The signed prekeys are changed periodically and signed for each instance in which they are to be used. For every protocol run the sender also generates a fresh key pair, called the ephemeral key pair, whose public key accompanies the initial message. The protocol can be summarised in three steps:
1. The receiver establishes its prekeys and elliptic-curve identity key and publishes them to the server.
2. The sender fetches the prekey bundle from the server and uses it to send an initial message to the receiver.
3. The initial message is received and processed.
This protocol has been proven secure for authentication and secrecy in the symbolic model, with the precise conditions enumerated. Authentication is also considered before the exchange of messages through the use of public keys: the parties' public key fingerprints may be checked manually or by scanning a QR code.
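The flow can be illustrated with a deliberately simplified sketch built on X25519 from the Python `cryptography` package. This is a toy reduction of the X3DH/PQXDH design (it omits the prekey signature, the one-time prekeys, and PQXDH's post-quantum KEM component), so the key names and the HKDF label are illustrative assumptions, not Signal's actual implementation.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive(*dh_results):
    # Combine the DH outputs into a single session secret.
    return HKDF(hashes.SHA256(), 32, None, b"toy-x3dh").derive(b"".join(dh_results))

# Step 1: the receiver creates identity and prekey pairs and "publishes"
# the public halves (the prekey bundle) to the server.
recv_identity = X25519PrivateKey.generate()
recv_prekey = X25519PrivateKey.generate()
bundle_identity, bundle_prekey = recv_identity.public_key(), recv_prekey.public_key()

# Step 2: the sender fetches the bundle, generates a fresh ephemeral key,
# and mixes several DH results into the initial-message secret.
send_identity = X25519PrivateKey.generate()
ephemeral = X25519PrivateKey.generate()
sender_secret = derive(
    send_identity.exchange(bundle_prekey),   # identity  x prekey
    ephemeral.exchange(bundle_identity),     # ephemeral x identity
    ephemeral.exchange(bundle_prekey),       # ephemeral x prekey
)

# Step 3: the receiver processes the initial message (which carries the
# sender's identity and ephemeral public keys) and derives the same secret.
receiver_secret = derive(
    recv_prekey.exchange(send_identity.public_key()),
    recv_identity.exchange(ephemeral.public_key()),
    recv_prekey.exchange(ephemeral.public_key()),
)
assert receiver_secret == sender_secret
```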
CHAPTER 3
LITERATURE REVIEW
The organisation identified three broad aspects of the evaluation criteria for comparing and selecting candidate algorithms through the first round.
Security
As with classical algorithms, security was considered the most important factor in any evaluation. The standardised algorithms have to be usable in a variety of protocols such as Transport Layer Security (TLS), Secure Shell (SSH), Internet Key Exchange (IKE), Internet Protocol Security (IPsec), and Domain Name System Security Extensions (DNSSEC). A preliminary classification was provided as stated in FRN-Dec16, with a focus on meeting the initial security categories.
Cost, including computational efficiency and memory requirements, was the second most important factor recognised by the institution. This chiefly comprises the size of public keys and ciphertexts and the probability of decryption failures. Cost and performance also cover the speed of a proposed algorithm. Memory requirements refer both to code size and to random access memory (RAM), important aspects of software implementation.
Numerous candidate algorithms with novel and intriguing designs, as well as special characteristics not seen in the current NIST-standardised public-key algorithms, were submitted to the NIST PQC Standardization Process. More adaptable candidate algorithms were chosen over less adaptable ones. This covers parallelism and instruction set extensions as means of achieving faster performance, as well as algorithms that operate effectively across a broad range of platforms. Furthermore, designs that are straightforward and elegant are preferred, since they demonstrate the design team's deeper understanding and confidence and promote additional research.
Security
Diversity in computational hardness assumptions is also one of NIST's top long-term security objectives for its standards. By standardising practically efficient techniques from several cryptosystem families, NIST seeks to lessen the likelihood that a single cryptanalytic discovery will leave the world without a workable standard for digital signatures or key establishment. According to NIST, this approach best balances the requirement that all standards undergo extensive testing prior to publication against the demand for diversity.
Security
Throughout the competition, security has been the most important criterion NIST applies to any algorithm. NIST's public key standards are used in many internet protocols such as TLS, SSH, and IKE, as well as for certificates, software code signing, and secure boot loaders. In this round NIST gave three possible definitions of security, two for encryption and one for signatures, asking for "semantically secure" schemes with respect to adaptive chosen-ciphertext attack. NIST's security strength categories are defined in a way that leaves open the relative cost of various computational resources, including quantum gates.
Even once a model, or a range of models, for evaluating the relative cost of various computational resources has been agreed upon, there may still be uncertainty about how much of a given resource an attack actually requires. Progress was made in clarifying some outstanding security questions during the third round. In lattice-based cryptography, methods were developed to replace the asymptotic security estimates represented by the core-SVP methodology with concrete security estimates, expressed as a gate count, that can be compared more directly with the security estimates for the non-lattice candidates.
Cost and Performance
Cost was identified as the second most important criterion for every comparison. By the third round, much more information about computational efficiency was available, and faster, constant-time implementations had been provided for many of the algorithms. For general-purpose use, the evaluation of overall performance considered the cost of transferring the public key in addition to the signature or ciphertext during each transaction. Key generation was also taken into account.
Earlier in the standards cycle, NIST requested side-channel studies from the community. The community responded in the third round (and earlier) with a plethora of papers and other technical works examining side-channel attacks against the candidates as well as strategies for protecting implementations from such attacks. Identifying any algorithmic features that would help (or hinder) the future deployment of side-channel-resistant implementations of a proposed algorithm was one of the primary goals.
3.2 SIKE
SIKE, i.e. Supersingular Isogeny Key Encapsulation, is a public-key cryptosystem based on the mathematical theory of elliptic-curve isogenies. It was one of the 17 second-round public-key encryption candidates submitted to the post-quantum cryptography standardisation process initiated by the U.S. National Institute of Standards and Technology (NIST).
SIKE is designed as a public-key cryptosystem primarily used for key-exchange protocols. One of the primary motivations for developing SIKE was resistance to attacks by quantum computers. Since quantum computers will break currently deployed elliptic-curve cryptosystems, SIKE instead uses pseudo-random walks on supersingular isogeny graphs of curves, which were believed to resist quantum attacks.
The underlying mathematical problems in SIKE were believed to remain hard even in the presence of quantum computers, making it a candidate for post-quantum cryptography. The security of SIKE relies on the difficulty of the Supersingular Isogeny Problem (SIP), where the challenge is to compute an isogeny between two supersingular elliptic curves. It should be noted, however, that an efficient classical key-recovery attack on SIDH/SIKE was published in 2022, after which SIKE was withdrawn from consideration.
SIKE falls into the category of isogeny-based cryptography, a class of cryptographic schemes that leverage the hardness of isogeny problems for security, long considered a promising approach for achieving post-quantum security.
SIKE had the smallest public key size of all the candidates, less than 750 bytes even for its level-5 security parameters. It can leverage existing optimised code for elliptic curves, and it can easily be combined with traditional elliptic-curve cryptography to create a hybrid classical/post-quantum scheme.
3.3 MCELIECE
The McEliece cryptosystem is a public-key cryptosystem based on error-correcting codes. The public key specifies a random binary Goppa code. A ciphertext is a codeword plus random errors, and the private key allows efficient decoding: extracting the codeword from the ciphertext by identifying and removing the errors. Classic McEliece is an IND-CCA2 (indistinguishability under adaptive chosen-ciphertext attack) key encapsulation mechanism. A key encapsulation mechanism can be viewed as a key-exchange protocol in which only a single message is transmitted.
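That single-message shape gives every KEM the same three-function interface, sketched below with a toy Diffie-Hellman KEM built from X25519. The construction merely illustrates the keygen/encapsulate/decapsulate pattern that code-based KEMs such as McEliece expose; it says nothing about their internals.

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey,
)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def keygen():
    # One long-term key pair for the receiving side.
    sk = X25519PrivateKey.generate()
    return sk, sk.public_key()

def encapsulate(pk):
    # Fresh randomness for every message; the "ciphertext" is the
    # ephemeral public key, and the shared secret is the DH result.
    eph = X25519PrivateKey.generate()
    ciphertext = eph.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    return ciphertext, eph.exchange(pk)

def decapsulate(sk, ciphertext):
    # Recover the same shared secret from the single transmitted message.
    return sk.exchange(X25519PublicKey.from_public_bytes(ciphertext))

sk, pk = keygen()
ct, sender_key = encapsulate(pk)      # the one message that is transmitted
assert decapsulate(sk, ct) == sender_key
```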
The McEliece system was designed to be one-way (OW-CPA), meaning that an attacker cannot efficiently find the codeword from a ciphertext and public key when the codeword is chosen randomly. The KEM's public key determines a random binary Goppa code, and encapsulation generates a ciphertext by adding an error to a codeword; decapsulation is done by decoding. Security is based on the hardness of decoding a general linear code and the assumption that a random binary Goppa code is indistinguishable from a random linear code. The McEliece cryptosystem has very short ciphertexts, on the order of 200 bytes, and good performance for encapsulation and decapsulation. Nevertheless, it has never gained much acceptance in the cryptographic community, mainly due to its large public key sizes.
3.4 BIKE
BIKE (Bit Flipping Key Encapsulation) is a Key Encapsulation Mechanism (KEM) founded on Quasi-Cyclic Moderate-Density Parity-Check (QC-MDPC) codes, put forth as part of the National Institute of Standards and Technology's (NIST) Post-Quantum Cryptography (PQC) Standardization initiative. The BIKE framework encompasses a single version distinguished by three parameter sets (r, w, t) designed for security levels 1, 3, and 5. Additional parameters are associated with the specific BGF decoder aligned with the proposal.
BIKE targets a usage model of ephemeral keys, where a fresh key pair is used for each key exchange. Diverging from this usage model to scenarios involving key reuse, or adapting it to asynchronous protocols like email, necessitates securing long-term static keys. While feasible, these models sacrifice forward secrecy and necessitate IND-CCA security, where "indistinguishability under chosen-ciphertext attack" becomes crucial. IND-CCA, akin to its IND-CPA counterpart, measures an adversary's ability to distinguish which plaintext a ciphertext encrypts, but this heightened security notion additionally grants the adversary access to a decryption oracle, obtaining insights into decrypted ciphertexts.
It's imperative to note that implementations deviating from BIKE's current specification are not in compliance with the established standards.
3.5 HQC
Hamming Quasi-Cyclic (HQC) is a code-based public-key encryption scheme based on the hardness of the decisional quasi-cyclic syndrome decoding (QCSD) problem with parity, targeting IND-CCA2 security. It uses a construction similar to RLWE-based PKE schemes, substituting shortness in the Hamming metric for shortness in the Euclidean metric, combined with a public error-correcting code. Contrary to most existing code-based cryptosystems such as Classic McEliece, the security of HQC does not rely even partly on hiding the structure of an error-correcting code: the error-correcting code used in HQC is public, and security relies on variants of the syndrome decoding problem, which is an NP-hard problem.
HQC asserts a compelling case for its decryption failure rate being sufficiently low to achieve chosen-ciphertext security. It currently stands as the most robust argument for CCA security among the second-round code-based candidates in which the primary attack avenue is information set decoding, affecting both private-key recovery and message recovery (BIKE, HQC, and LEDAcrypt). However, it pays a significant penalty in key and ciphertext size in comparison to the others (although it still compares very favourably in key size and overall communication bandwidth to the candidate code-based cryptosystems based on Goppa codes).
Possible areas for further analysis related to HQC include investigating the relation between
the search and decisional variants of the QCSD problem, and investigating the effect, if any,
of the quasi-cyclic code structure on security.
3.6 KYBER
CRYSTALS-Kyber is a lattice-based key encapsulation mechanism whose security strength levels are straightforward to adjust. One adjustment simply varies the rank of the underlying module (in the range {2, 3, 4}) and adjusts the noise distribution parameter (in the range {5, 4, 3}, respectively). Kyber's performance is among the most competitive of the key-exchange proposals.
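Written out as a configuration table, this adjustment looks as follows. The level names are illustrative, the (rank, noise) pairs are the values quoted above, and later revisions of Kyber retuned the noise parameters.

```python
# The security-level knobs described above: module rank k and noise
# parameter eta per claimed NIST level (values as given in the text).
KYBER_KNOBS = {
    "NIST level 1": {"module_rank": 2, "noise_eta": 5},
    "NIST level 3": {"module_rank": 3, "noise_eta": 4},
    "NIST level 5": {"module_rank": 4, "noise_eta": 3},
}

for level, knobs in KYBER_KNOBS.items():
    print(f"{level}: k = {knobs['module_rank']}, eta = {knobs['noise_eta']}")
```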
The security of Kyber relies on a variant of a well-studied problem. The submission offers a
tight security proof in the random oracle model (ROM) and a non-tight security proof in the
quantum random oracle model (QROM), both based on the MLWE assumption. We note that
a potential issue is that the security proof does not directly apply to Kyber itself, but rather to
a modified version of the scheme which does not compress the public key. Without this
modification, Kyber may in fact be relying on a different (or additional) rounding-like
assumption. If that is the case, this may lead to a cryptanalytic concern, as the known
reductions between MLWE and Module Learning with Rounding (MLWR) may not apply for
the parameters selected by Kyber.
CHAPTER 4
CONCERNS AND LIMITATIONS
Plaintext-Checking Attack: In some cases, the reuse of secret keys may render a system vulnerable to a plaintext-checking attack (PCA), threatening the security of the algorithms.
Studies reveal weaknesses in the meta-PKE model, exposing vulnerabilities in algorithms like NewHope and LAC, and modified versions of these attacks extend to Saber and CRYSTALS-Kyber. Testing the algorithms against various attack vectors is crucial to validate their claimed security guarantees. Security assurance involves robust testing methodologies that scrutinise the algorithms' resilience against known and potential attacks. It encompasses rigorous analysis, including cryptanalysis, simulation, and theoretical evaluation, aimed at uncovering vulnerabilities that adversaries might exploit. This process is paramount in affirming the algorithms' ability to withstand both classical and quantum attacks.
CHAPTER 5
CONCLUSION
5.1 SUMMARY
The report outlines the important role that cryptography plays in securing online transactions and interactions between computers, from banking and financial transactions to educational records. The underlying algorithms have held up for over four decades but are now threatened by the advent of quantum computers and their increasing power; in view of this problem, a switch to quantum-resistant algorithms is needed. The existing algorithms, which rely on the discrete logarithm problem or on the factorisation of products of large primes, will no longer be secure once the qubit counts of quantum computers grow sufficiently.
NIST (the National Institute of Standards and Technology) started a competition to standardise post-quantum cryptography in response to the quantum threat, which resulted in the selection of four algorithms that can be integrated into real systems in later phases. The report critically examines the strategic concept of "Harvest Now, Decrypt Later", involving the proactive collection of encrypted data with the anticipation of decrypting it in the future using quantum computers. The structure of the report unfolds across five chapters, covering the introduction to post-quantum cryptography, related work on historical impacts and migration strategies, a literature review on NIST standardisation efforts, an examination of concerns and limitations of the selected algorithms, and a concluding summary outlining future steps.
Looking at present and future conditions, strong quantum computers are not far away. IBM has unveiled the first quantum processor with more than 1,000 qubits, the quantum analogue of the digital bits in conventional computers. The security of all existing data protected by classical encryption is at stake. National institutions across the world should increase their focus on the standardisation of post-quantum and quantum-resilient algorithms. The need is to switch existing RSA and ECC cryptography to lattice-based, hash-based, or code-based encryption techniques. Many firms have started firm-wide research in the field: Microsoft's research team, for example, is focusing on Post-Quantum Crypto VPN, post-quantum TLS, and post-quantum SSH. Signal's PQXDH is another protocol, one that still relies on the discrete logarithm problem for its authentication step. Research is increasingly focused on the security of such algorithms, analysing their resistance to quantum as well as classical computers. Another aspect to be considered is implementation in various important sectors, including finance, healthcare, government, and more.
Continuous evaluation and rigorous security assessments are vital to scrutinize the resilience
of PQC algorithms against emerging threats. Cryptanalysis efforts should persist to identify
vulnerabilities and ensure that PQC algorithms remain impervious to sophisticated attacks,
both classical and quantum.
Practical deployment and seamless integration of PQC algorithms into existing systems pose
challenges. Future research should focus on optimizing these algorithms for diverse
platforms, addressing interoperability concerns, and mitigating potential side-channel
vulnerabilities.
With ongoing advancements in quantum computing technology, future studies should closely
monitor the development of quantum-resistant solutions. Aligning PQC advancements with
the progress in quantum computing will be instrumental in staying ahead of potential threats.
Post-Quantum Cryptography Education and Awareness:
Sustained collaboration among academia, industry, and government bodies, coupled with
adequate funding for research, remains pivotal. Encouraging interdisciplinary studies and
fostering innovation will drive the development of more resilient and efficient PQC
algorithms.