
How Secure Are FPGAs in Cryptographic Applications?*

Thomas Wollinger and Christof Paar

Chair for Communication Security (COSY)
Horst Görtz Institute for IT Security
Ruhr-Universität Bochum, Germany
{wollinger, cpaar}@crypto.rub.de

Abstract. The use of FPGAs for cryptographic applications is highly
attractive for a variety of reasons, but at the same time there are many
open issues related to the general security of FPGAs. This contribution
attempts to provide a state-of-the-art description of this topic. First,
the advantages of reconfigurable hardware for cryptographic applications
are listed. Second, potential security problems of FPGAs are described
in detail, followed by a proposal of some countermeasures. Third, a
list of open research problems is provided. Even though there have been
many contributions dealing with the algorithmic aspects of cryptographic
schemes implemented on FPGAs, this contribution appears to be the first
comprehensive treatment of system and security aspects.

Keywords: cryptography, FPGA, security, attacks, reconfigurable hardware

1 Introduction

The choice of the implementation platform of a digital system is driven by
many criteria and is heavily dependent on the application area. In addition
to the aspects of algorithm and system speed and cost, which are present in
most other application domains too, there are crypto-specific ones: physical
security (e.g., against key recovery and algorithm manipulation), flexibility
(regarding algorithm parameters, keys, and the algorithm itself), power
consumption (absolute usage and prevention of power analysis attacks), and
other side-channel leakages.
Reconfigurable hardware devices, such as Field Programmable Gate Arrays
(FPGAs), seem to combine the advantages of SW and HW implementations.
At the same time, there are still many open questions regarding FPGAs as a
module for security functions. There has been a fair amount of work done by
the research community dealing with the algorithmic and computer architecture
aspects of crypto schemes implemented on FPGAs since the mid-1990s (see,
e.g., relevant articles in [KP00,KNP01,KKP02]), often focusing on
high-performance implementations. At the same time, however, very little work
has been done dealing with the system and physical aspects of FPGAs as they
pertain to cryptographic applications. It should be noted that the main
threat to a cryptographic scheme in the real world is not the cryptanalysis
of the actual algorithm, but rather the exploitation of weaknesses of the
implementation. Given this fact, we hope that the contribution at hand is of
interest to readers in the academic, industrial, and government sectors.

* In the original paper, we stated incorrectly that the Xilinx and Altera
bitstreams had been reverse engineered. This information is not correct.
However, the design software was reverse engineered. This research was
partially sponsored by the German Federal Office for Information Security
(BSI).
In this paper we start in Section 2 with a list of the advantages of FPGAs
in cryptographic applications from a systems perspective. Then, we highlight
important questions pertaining to the security of FPGAs when used for crypto
algorithms in Section 3. A major part of this contribution is a state-of-the-art
perspective of security issues with respect to FPGAs, by illuminating this problem from different viewpoints and by trying to transfer problems and solutions
from other hardware platforms to FPGAs (Section 4). In Section 5, we provide
a list of open problems. Finally, we end this contribution with some conclusions.
We would like to stress that this contribution is not based on any practical experiments, but on a careful analysis of available publications in the literature
and on our experience with implementing crypto algorithms.

2 System Advantages of FPGAs for Cryptographic Applications

In this section we list the potential advantages of reconfigurable hardware
(RCHW) in cryptographic applications. More details and a description of each
item can be
found in [EYCP01,WP03]. Note that the listed potential advantages of FPGAs
for cryptographic applications can only be exploited if the security shortcomings
of FPGAs discussed in the following have been addressed.

- Algorithm Agility
- Algorithm Upload
- Architecture Efficiency
- Resource Efficiency
- Algorithm Modification
- Throughput
- Cost Efficiency

3 Security Shortcomings of FPGAs

This section summarizes security problems arising from attacks against given
FPGA implementations. First, we would like to state the possible goals of
such attacks.

3.1 Objectives of an Attacker

The most common threat against an implementation of a cryptographic algorithm
is to learn a confidential cryptographic key. Given that the algorithms
applied are publicly known in most commercial applications, knowledge of the
key enables the attacker to decrypt future and past communication. Another
threat is the one-to-one copy, or cloning, of a cryptographic algorithm
together with its key. In some cases it can be enough to run the cloned
application in decryption mode to decipher past and future communication. In
other cases, execution of a certain cryptographic operation with a presumably
secret key is the sole criterion which authenticates a communication party.
An attacker who can perform the same function can masquerade as the attacked
communication party. Yet another threat arises in applications where the
cryptographic algorithms are proprietary, e.g., pay-TV and government
communication. In such scenarios it is already interesting for an attacker to
reverse engineer the encryption algorithm itself.
The discussion above mostly assumes that an attacker has physical access to
the encryption device. We believe that in many scenarios such access can be
assumed, either through outsiders or through dishonest insiders. In the
following, we discuss the vulnerabilities of modern FPGAs to such attacks.
3.2 Black Box Attack

The classical method to reverse engineer a chip is the so-called black box
attack. The attacker applies all possible input combinations, while saving
the corresponding outputs. The intruder is then able to extract the inner
logic of the FPGA with the help of the Karnaugh map or algorithms that
simplify the resulting tables. This attack is only feasible if a small FPGA
with explicit inputs and outputs is attacked and a lot of processing power is
available.
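To make the scaling argument concrete, the sketch below (our own illustration; the 4-input "secret" circuit is hypothetical) enumerates every input pattern of an unknown combinational function and records the truth table that a Karnaugh-map simplification would start from. The query count doubles with every additional input bit, which is exactly why the attack fails against designs with key-sized inputs.

```python
from itertools import product

def extract_truth_table(black_box, n_inputs):
    """Query the unknown circuit on every input pattern and record the
    output -- the starting point for Karnaugh-map simplification."""
    return {bits: black_box(bits) for bits in product((0, 1), repeat=n_inputs)}

# Hypothetical 4-input "secret" design standing in for a tiny FPGA.
secret = lambda b: (b[0] & b[1]) ^ (b[2] | b[3])

table = extract_truth_table(secret, 4)
print(len(table))  # 16 queries for 4 inputs; a 128-bit input would need 2**128
```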
3.3 Readback Attack

Readback is a feature provided for most FPGA families. It allows one to read
the configuration out of the FPGA for easy debugging. The idea of the attack
is to read the configuration of the FPGA through the JTAG or programming
interface in order to obtain secret information (e.g., keys) [Dip]. The
readback functionality can be prevented with security bits provided by the
manufacturers.
However, it is conceivable that an attacker can overcome these
countermeasures in FPGAs with fault injection. This kind of attack was first
introduced in [BDL97], where it was shown how to break public-key algorithms
by exploiting hardware faults. It seems very likely that these attacks can be
easily applied to FPGAs, since they are not specifically targeted at ASICs.
If this is in fact feasible, an attacker is able to deactivate the security
bits and/or the countermeasures, resulting in the ability to read out the
configuration of the FPGA [Kes,Dip].

3.4 Cloning of SRAM FPGAs

In a standard scenario, the configuration data is stored unprotected in
external nonvolatile memory (e.g., PROM) and is transmitted to the FPGA at
power-up in order to configure it. An attacker could easily eavesdrop on the
transmission and obtain the configuration file. This attack is therefore
feasible for large organizations as well as for attackers with low budgets
and modest sophistication.
3.5 Reverse Engineering of the Bitstreams

The attacks described so far yield the bitstream of the FPGA design. In order
to get the design of proprietary algorithms or the secret keys, one has to
reverse engineer the bitstream. The precondition for this attack is that the
attacker is in possession of the (unencrypted) bitstream.
FPGA manufacturers claim that the security of the bitstream relies on keeping
the layout of the configuration data secret. This information will only be
made available if a non-disclosure agreement is signed, which is, from a
cryptographic point of view, an extremely insecure situation. This
security-by-obscurity approach was broken at least ten years ago when the CAD
software company NEOCad reverse engineered a Xilinx FPGA [Sea]1. Even though
a big effort has to be made to reverse engineer the bitstream, for large
organizations it is quite feasible. With government organizations as
attackers, it is also possible that they will get the information on the
design methodology directly from the vendors or from companies that signed
NDAs.
3.6 Physical Attack

The aim of a physical attack is to investigate the chip design in order to
get information about proprietary algorithms or to determine the secret keys
by probing points inside the chip. Hence, this attack targets parts of the
FPGA which are not accessible through the normal I/O pins. This can
potentially be achieved through visual inspection and by using tools such as
optical microscopes and mechanical probes. However, FPGAs are becoming so
complex that such an attack can only be launched with advanced methods, such
as Focused Ion Beam (FIB) systems. To our knowledge, there are no
countermeasures to protect FPGAs against this form of physical threat. In the
following, we try to analyze the effort needed to physically attack FPGAs
manufactured with different underlying technologies.
SRAM FPGAs: Unfortunately, there are no publications available that
accomplish a physical attack against SRAM FPGAs. This kind of attack is
treated only very superficially in a few articles, e.g. [Ric98]. In the
related area of SRAM memory, however, there has been a lot of effort by
academia and industry to exploit this kind of attack
[Gut96,Gut01,AK97,WKM+96,Sch98,SA93,KK99]. Due to the similarities in
structure between the SRAM memory cell and the internal structure of the SRAM
FPGA, it is most likely that these attacks can be employed in this setting.

1 In the original paper, we stated incorrectly that the Xilinx and Altera
bitstreams had been reverse engineered. This information is not correct.
However, the design software was reverse engineered.
Contrary to common wisdom, SRAM memory cells do not entirely lose their
contents when power is cut. These effects are rooted in the physical
properties of semiconductors (see [Gut01] for more details). The physical
changes are caused mainly by three effects: electromigration, hot carriers,
and ionic contamination. Most publications agree that a device can be
considered altered if 1) the threshold voltage has changed by 100 mV or 2)
there is a 10% change in transconductance, voltage, or current.
One can attack SRAM memory cells using the access points provided by the
manufacturers. An extreme case of data recovery was described in [AK97],
where a cryptographic key was recovered without special equipment. IDDQ
testing is one of the most widely used methods to analyze SRAM cells; it is
based on the analysis of the current usage [Gut01,WKM+96,Sch98]. Other
possibilities for the attack are to use the scan path that IC manufacturers
insert for test purposes, or techniques like bond pad probing [Gut01].
When it becomes necessary to use access points that are not provided by the
manufacturer, the layers of the chip have to be removed. Mechanical probing
with tungsten wire with a radius of 0.1-0.2 µm is the traditional way to
recover the needed information. Focused Ion Beam (FIB) workstations can
expose buried conductors and deposit new probe points [KK99]. The
electron-beam tester (EBT) is another measurement method; it measures the
energy and amount of secondary electrons that are reflected.
Following from the above discussion of attacks against SRAM memory cells, it
seems likely that a physical attack against SRAM FPGAs can be launched
successfully, assuming that the described techniques can be transferred.
However, physical attacks are quite costly, and keeping the structure and
size of state-of-the-art FPGAs in mind, the attack will probably only be
possible for large organizations, for example intelligence services.
Antifuse FPGAs: In order to detect the existence or non-existence of a
connection, one has to remove layer after layer and/or use cross-sectioning.
Unfortunately, no details have been published regarding this type of attack.
In [Dip], the author states that a lot of trial-and-error is necessary to
find the configuration of one cell, and that the rest of the chip will likely
be destroyed while analyzing that cell. The main problem with this analysis
is that the isolation layer is much smaller than the whole AF cell. One study
estimates that about 800,000 chips with the same configuration are necessary
to explore one configuration file of an Actel A54SX16 chip with 24,000 system
gates [Dip]. A further aggravation of the attack is that only about 25% of
all possible connections in an average design are actually used. In [Ric98] a
practical attack against AF FPGAs was performed, and it was possible to alter
one cell in two months at a cost of $1000.

Flash FPGAs: Flash FPGAs can be analyzed by placing the chip in a vacuum
chamber and powering it up. Other possible attacks against flash FPGAs can be
found in the related area of flash memory. The number of write/erase cycles
is limited to 10,000-100,000, because of the accumulation of electrons in the
floating gate causing a gradual rise of the transistor's threshold voltage.
This fact increases the programming time and eventually disables the erasing
of the cell [Gut01]. Another, less common failure is the programming
disturbance, in which unselected erased cells gain charge when adjacent
selected cells are written [ASH+93,Gut01]. Furthermore, electron emission
causes a net charge loss [PGP+91]. In addition, hot carrier effects build a
tunnel between the bands [HCSL89]. Another phenomenon is overerasing, where
an erase cycle is applied to an already-erased cell, leaving the floating
gate positively charged [Gut01].
All the described effects change, to a greater or lesser extent, the cell
threshold voltage, gate voltage, or the characteristics of the cell. We
remark that the stated phenomena apply to EEPROM memory as well, and that due
to the structure of the FPGA cell these attacks can be readily adapted to
attack flash/EEPROM FPGAs.
3.7 Side Channel Attacks

Any physical implementation of a cryptographic system might provide a side
channel that leaks unwanted information. Examples of side channels include,
in particular: power consumption, timing behavior, and electromagnetic
radiation. Obviously, FPGA implementations are also vulnerable to these
attacks. In [KJJ99], two practical attacks, Simple Power Analysis (SPA) and
Differential Power Analysis (DPA), were introduced. Since their introduction,
there has been a lot of work improving the original power attacks (see, e.g.,
relevant articles in [KP00,KNP01,KKP02]). At the time of writing, there seems
to be very little work addressing the feasibility of actual side channel
attacks against FPGAs. However, it seems almost certain that the different
side channels can be exploited against FPGAs as well.
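To make the DPA idea from [KJJ99] concrete, here is a minimal simulation of our own construction (not from the paper): each "power trace" is modeled as the Hamming weight of a toy 4-bit S-box output plus Gaussian noise, and a difference-of-means test over all key guesses recovers the key. The choice of the PRESENT S-box, the target bit, and all parameters are illustrative assumptions.

```python
import random

SBOX = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]  # toy 4-bit S-box
HW = [bin(x).count("1") for x in range(16)]          # Hamming-weight power model

def simulate_traces(key, n=5000, noise=0.25, rng=random.Random(1)):
    """One leaked 'power sample' per encryption: the Hamming weight of the
    S-box output plus measurement noise."""
    pts = [rng.randrange(16) for _ in range(n)]
    traces = [HW[SBOX[p ^ key]] + rng.gauss(0, noise) for p in pts]
    return pts, traces

def dpa(pts, traces):
    """Difference-of-means DPA: partition the traces on a predicted output
    bit (here bit 3) and return the key guess with the largest gap."""
    best, best_diff = None, float("-inf")
    for guess in range(16):
        g0, g1 = [], []
        for p, t in zip(pts, traces):
            (g1 if (SBOX[p ^ guess] >> 3) & 1 else g0).append(t)
        diff = sum(g1) / len(g1) - sum(g0) / len(g0)
        if diff > best_diff:
            best, best_diff = guess, diff
    return best

pts, traces = simulate_traces(key=0xB)
print(hex(dpa(pts, traces)))  # recovers 0xb
```

For the correct guess the two partitions differ in mean leakage by one full bit of Hamming weight, while wrong guesses give a smaller gap, which is why the maximum singles out the key.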

4 How to Prevent Possible Attacks?

This section briefly summarizes possible countermeasures that can be applied
to minimize the effects of the attacks mentioned in the previous section.
Most of them have to be realized through design changes by the FPGA
manufacturers, but some could be applied during the programming phase of the
FPGA.
Preventing the Black Box Attack: The black box attack is not a real
threat nowadays, due to the complexity of the designs and the size of
state-of-the-art FPGAs. Furthermore, the nature of cryptographic algorithms
prevents the attack as well. Today's stream ciphers output a bit stream with
a period length of 2^128 bits (e.g., W7). Block ciphers, like AES, are
designed with a minimum key length of 128 bits. The minimum length in the
case of public-key algorithms is 160 bits for elliptic curve cryptosystems
and 1024 bits for discrete logarithm and RSA-based systems. It is widely
believed that it is infeasible to perform a brute force attack and search a
space with 2^80 possibilities. Hence, implementations of these algorithms
cannot be attacked with the black box approach.
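The 2^80 figure can be put in perspective with a back-of-the-envelope calculation; the search rate below is a deliberately generous, hypothetical assumption:

```python
# Exhaustive key search over a 2**80 space at a (hypothetical, generous)
# rate of one billion keys per second on a single machine:
keyspace = 2**80
rate = 10**9                                 # keys tested per second (assumption)
years = keyspace / rate / (3600 * 24 * 365)
print(f"{years:.1e} years")                  # about 3.8e+07 years
```

Even a million such machines in parallel would still need decades for the full space, which supports the infeasibility claim above.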
Preventing the Cloning of SRAM FPGAs: There are many suggestions for
preventing the cloning of SRAM FPGAs, mainly motivated by the desire to
prevent reverse engineering of general, i.e., non-cryptographic, FPGA
designs. One solution would be to check a serial number before executing the
design and to delete the circuit if it is not correct. Another solution would
be to use dongles to protect the design (a survey on dongles can be found in
[Kea01]). Neither solution provides the necessary security; see [WP03] for
more details. A more realistic solution would be to have the nonvolatile
memory and the FPGA in one chip, or to combine both parts by covering them
with epoxy. However, it has to be guaranteed that an attacker is not able to
separate the two parts.
Encryption of the configuration file is the most effective and practical
countermeasure against the cloning of SRAM FPGAs. There are several patents
that propose different encryption scenarios [Jef02,Aus95,Eri99,SW99,Alg] and
a good number of publications, e.g., [YN00,KB00]. The 60RS family from Actel
was the first attempt to store a key in the FPGA in order to be able to
encrypt the configuration file. The problem was that every FPGA had the same
key on board.
An approach in a completely different direction would be to power the whole
SRAM FPGA with a battery, which would make transmission of the configuration
file after a power loss unnecessary. This solution does not appear practical,
however, because of the power consumption of FPGAs. Hence, a combination of
encryption and battery power provides a possible solution: Xilinx addresses
this with an on-chip 3DES decryption engine in its Virtex II [Xil] (see also
[PWF+00]), where the two keys are stored in battery-powered memory.
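The configuration-file encryption scheme can be sketched as follows. This is a toy model of our own, not the Virtex II mechanism: a SHA-256 counter-mode keystream stands in for the on-chip 3DES engine, and the function names and 16-byte key/nonce sizes are illustrative assumptions. A real design would use an authenticated cipher and, as the Actel 60RS episode shows, a key unique to each device.

```python
import hashlib, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy SHA-256-in-counter-mode keystream (stand-in for a real cipher)."""
    out = bytearray()
    ctr = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:length])

def encrypt_bitstream(bitstream: bytes, device_key: bytes) -> bytes:
    # The nonce travels in the clear; only the per-device key is secret.
    nonce = os.urandom(16)
    ks = keystream(device_key, nonce, len(bitstream))
    return nonce + bytes(a ^ b for a, b in zip(bitstream, ks))

def decrypt_bitstream(blob: bytes, device_key: bytes) -> bytes:
    nonce, body = blob[:16], blob[16:]
    ks = keystream(device_key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, ks))

key = os.urandom(16)              # unique per device, held in battery-backed memory
cfg = b"...configuration bitstream..."
assert decrypt_bitstream(encrypt_bitstream(cfg, key), key) == cfg
```

With the key stored on-chip, an eavesdropper on the configuration bus sees only the encrypted blob, defeating the cloning attack of Section 3.4.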
Preventing the Physical Attack: To prevent physical attacks, one has to
make sure that the retention effects of the cells are as small as possible,
so that an attacker cannot detect the status of the cells. Already after
storing a value in an SRAM memory cell for 100-500 seconds, the access time
and operation voltage will change [vdPK90]. The solution would be to
periodically invert the stored data or to move the data around in memory.
Neutralization of the retention effect can be achieved by applying an
opposite current [TCH93] or by inserting dummy cycles into the circuit
[Gut01]. For FPGA applications, it is very costly or even impractical to
invert the bits or change the location of the whole configuration file. A
possibility would be to do this only for the crucial parts of the design,
such as the secret keys. Countermeasures such as dummy cycles and the
opposite-current approach can be carried over to FPGA applications.
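The key-only variant of the inversion countermeasure can be sketched as follows (our own illustration; the class and method names are hypothetical): the stored word is complemented on every refresh tick, so no memory cell holds the same value long enough for remanence effects to imprint it, and the true key is reconstructed on read.

```python
class InvertingKeyStore:
    """Sketch of the periodic-inversion countermeasure for stored keys."""

    def __init__(self, key: int, bits: int = 128):
        self.mask = (1 << bits) - 1
        self.stored = key & self.mask    # what the memory cells actually hold
        self.inverted = False

    def tick(self):
        """Called periodically by the design: complement every cell."""
        self.stored ^= self.mask
        self.inverted = not self.inverted

    def read(self) -> int:
        """Recover the true key value regardless of the current polarity."""
        return self.stored ^ (self.mask if self.inverted else 0)

ks = InvertingKeyStore(0x0123456789ABCDEF, bits=64)
for _ in range(5):
    ks.tick()
assert ks.read() == 0x0123456789ABCDEF   # key is intact after any number of ticks
```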
Antifuse FPGAs can only be protected against physical attacks by building a
secure environment around them. If an attack is detected, every cell should
be programmed so as not to leak any information, or the antifuse FPGA has to
be destroyed.

In terms of flash/EEPROM memory cells, one has to consider that the first
write/erase cycles cause a larger shift in the cell threshold [SKM95] and
that this effect becomes less noticeable after ten write/erase cycles
[HCSL89]. Thus, one should program the FPGA about 100 times with random data
to avoid this effect (suggested for flash/EEPROM memory cells in [Gut01]).
The phenomenon of overerasing flash/EEPROM cells can be minimized by first
programming all cells before deleting them.
Preventing the Readback Attack: The readback attack can be prevented by
setting the security bits provided by the manufacturers, see Section 3.3. If
one wants to make sure that an attacker is not able to apply fault injection,
the FPGA has to be embedded in a secure environment, where after detection of
an interference the whole configuration is deleted or the FPGA is destroyed.
Preventing the Side Channel Attack: In recent years, a lot of work has
been done to prevent side-channel attacks (see, e.g., relevant articles in
[KP00,KNP01,KKP02]). Software countermeasures refer primarily to algorithmic
changes, which are also applicable to implementations in FPGAs. Hardware
countermeasures, on the other hand, often deal either with some form of power
trace smoothing or with transistor-level changes of the logic. Neither seems
to be easily applicable to FPGAs without support from the manufacturers.
However, some proposals, such as duplicated architectures, might work on
today's FPGAs.

5 Open Problems

At this point we would like to provide a list of open questions and problems
regarding the security of FPGAs. If answered, the solutions would allow
stand-alone FPGAs with much higher security assurance than currently
available. A more detailed description of all points can be found in [WP03].

- Side channel attacks
- Fault injection
- Key management for configuration encryption
- Secure deletion
- Physical attacks

6 Conclusions

This contribution analyzed possible attacks against the use of FPGAs in
security applications. For black box attacks, we stated that they are not
feasible for state-of-the-art FPGAs. However, it seems very likely that an
attacker can obtain the secret information stored in an FPGA by combining
readback attacks with fault injection. Cloning of SRAM FPGAs and reverse
engineering depend on the specifics of the system under attack; they will
probably involve a lot of effort, but they do not seem entirely impossible.
Physical attacks against FPGAs are very complex due to the physical
properties of the semiconductors in the case of flash/SRAM/EEPROM FPGAs and
the small size of AF cells. It appears that such attacks are even harder than
analogous attacks against ASICs. Even though FPGAs have different internal
structures than ASICs with the same functionality, we believe that
side-channel attacks against FPGAs, in particular power-analysis attacks,
will be feasible too.
From the discussion above it may appear that FPGAs are currently out of the
question for security applications. We do not think that this is the right
conclusion, however. It should be noted that many commercial ASICs with
cryptographic functionality are also vulnerable to attacks similar to the
ones discussed here. A commonly taken approach to prevent these attacks is to
put the ASIC in a secure environment.

References

[AK97]    R.J. Anderson and M.G. Kuhn. Low Cost Attacks on Tamper Resistant
          Devices. In 5th International Workshop on Security Protocols,
          pages 125-136. Springer-Verlag, 1997. LNCS 1361.
[Alg]     Algotronix Ltd. Method and Apparatus for Secure Configuration of a
          Field Programmable Gate Array. PCT Patent Application
          PCT/GB00/04988.
[ASH+93]  Seiichi Aritome, Riichiro Shirota, Gertjan Hemink, Tetsuo Endoh,
          and Fujio Masuoka. Reliability Issues of Flash Memory Cells.
          Proceedings of the IEEE, 81(5):776-788, May 1993.
[Aus95]   K. Austin. Data Security Arrangements for Semiconductor
          Programmable Devices. United States Patent, No. 5388157, 1995.
[BDL97]   D. Boneh, R.A. DeMillo, and R.J. Lipton. On the Importance of
          Checking Cryptographic Protocols for Faults. In EUROCRYPT '97,
          pages 37-51. Springer-Verlag, 1997. LNCS 1233.
[Dip]     B. Dipert. Cunning circuits confound crooks.
          http://www.einsite.net/ednmag/contents/images/21df2.pdf.
[Eri99]   C.R. Erickson. Configuration Stream Encryption. United States
          Patent, No. 5970142, 1999.
[EYCP01]  A. Elbirt, W. Yip, B. Chetwynd, and C. Paar. An FPGA-based
          performance evaluation of the AES block cipher candidate algorithm
          finalists. IEEE Transactions on VLSI Design, 9(4):545-557, August
          2001.
[Gut96]   P. Gutmann. Secure Deletion of Data from Magnetic and Solid-State
          Memory. In Sixth USENIX Security Symposium, pages 77-90, July
          22-25, 1996.
[Gut01]   P. Gutmann. Data Remanence in Semiconductor Devices. In 10th
          USENIX Security Symposium, pages 39-54, August 13-17, 2001.
[HCSL89]  Sameer Haddad, Chi Chang, Balaji Swaminathan, and Jih Lien.
          Degradations due to hole trapping in flash memory cells. IEEE
          Electron Device Letters, 10(3):117-119, March 1989.
[Jef02]   G.P. Jeffrey. Field programmable gate arrays. United States
          Patent, No. 6356637, 2002.
[KB00]    S.H. Kelem and J.L. Burnham. System and Method for PLD Bitstream
          Encryption. United States Patent, No. 6118868, 2000.
[Kea01]   T. Kean. Secure Configuration of Field Programmable Gate Arrays.
          In FPL 2001, pages 142-151. Springer-Verlag, 2001. LNCS 2147.
[Kes]     D. Kessner. Copy Protection for SRAM based FPGA Designs.
          http://www.free-ip.com/copyprotection.html.
[KJJ99]   P. Kocher, J. Jaffe, and B. Jun. Differential Power Analysis. In
          CRYPTO '99, pages 388-397. Springer-Verlag, 1999. LNCS 1666.
[KK99]    O. Kömmerling and M.G. Kuhn. Design Principles for
          Tamper-Resistant Smartcard Processors. In Smartcard '99, pages
          9-20, May 1999.
[KKP02]   B.S. Kaliski, Jr., Ç.K. Koç, and C. Paar, editors. Workshop on
          Cryptographic Hardware and Embedded Systems - CHES 2002, Berlin,
          Germany, August 13-15, 2002. Springer-Verlag. LNCS 2523.
[KNP01]   Ç.K. Koç, D. Naccache, and C. Paar, editors. Workshop on
          Cryptographic Hardware and Embedded Systems - CHES 2001, Berlin,
          Germany, May 13-16, 2001. Springer-Verlag. LNCS 2162.
[KP00]    Ç.K. Koç and C. Paar, editors. Workshop on Cryptographic Hardware
          and Embedded Systems - CHES 2000, Berlin, Germany, August 17-18,
          2000. Springer-Verlag. LNCS 1965.
[PGP+91]  C. Papadas, G. Ghibaudo, G. Pananakakis, C. Riva, P. Ghezzi, C.
          Gounelle, and P. Mortini. Retention characteristics of single-poly
          EEPROM cells. In European Symposium on Reliability of Electron
          Devices, Failure Physics and Analysis, page 517, October 1991.
[PWF+00]  R.C. Pang, J. Wong, S.O. Frake, J.W. Sowards, V.M. Kondapalli,
          F.E. Goetting, S.M. Trimberger, and K.K. Rao.
          Nonvolatile/battery-backed key in PLD. United States Patent, No.
          6366117, Nov. 28, 2000.
[Ric98]   G. Richard. Digital Signature Technology Aids IP Protection. In
          EETimes - News, 1998.
          http://www.eetimes.com/news/98/1000news/digital.html.
[SA93]    J. Soden and R.E. Anderson. IC failure analysis: techniques and
          tools for quality and reliability improvement. Proceedings of the
          IEEE, 81(5):703-715, May 1993.
[Sch98]   D.K. Schroder. Semiconductor Material and Device Characterization.
          John Wiley and Sons, 1998.
[Sea]     G. Seamann. FPGA Bitstreams and Open Designs.
          http://www.opencollector.org/.
[SKM95]   K.T. San, C. Kaya, and T.P. Ma. Effects of erase source bias on
          Flash EPROM device reliability. IEEE Transactions on Electron
          Devices, 42(1):150-159, January 1995.
[SW99]    C. Sung and B.I. Wang. Method and Apparatus for Securing
          Programming Data of Programmable Logic Device. United States
          Patent, No. 5970142, June 22, 1999.
[TCH93]   Jiang Tao, Nathan Cheung, and Chenming Hu. Metal Electromigration
          Damage Healing Under Bidirectional Current Stress. IEEE
          Transactions on Electron Devices, 14(12):554-556, December 1993.
[vdPK90]  J. van der Pol and J. Koomen. Relation between the hot carrier
          lifetime of transistors and CMOS SRAM products. In IRPS 1990, page
          178, 1990.
[WKM+96]  T.W. Williams, R. Kapur, M.R. Mercer, R.H. Dennard, and W. Maly.
          IDDQ Testing for High Performance CMOS - The Next Ten Years. In
          ED&TC '96, pages 578-583, 1996.
[WP03]    T. Wollinger and C. Paar. How Secure Are FPGAs in Cryptographic
          Applications? (Long Version). Report 2003/119, IACR, 2003.
          http://eprint.iacr.org/.
[Xil]     Xilinx Inc. Using Bitstream Encryption. Handbook of the Virtex II
          Platform. http://www.xilinx.com.
[YN00]    Kun-Wah Yip and Tung-Sang Ng. Partial-Encryption Technique for
          Intellectual Property Protection of FPGA-based Products. IEEE
          Transactions on Consumer Electronics, 46(1):183-190, 2000.
