How Secure Are FPGAs in Cryptographic Applications?
Thomas Wollinger and Christof Paar
Chair for Communication Security (COSY)
Horst Görtz Institute for IT Security
Ruhr-Universität Bochum, Germany
{wollinger, cpaar}@crypto.rub.de
Introduction
In the original paper, we stated incorrectly that the Xilinx and Altera bitstreams
had been reverse engineered. This information is not correct; however, the design
software was reverse engineered. This research was partially sponsored by the
German Federal Office for Information Security (BSI).
A significant amount of work has been published on FPGA implementations of
cryptographic algorithms (see, e.g., relevant articles in [PG-,KP00,KNP01,KKP02]),
often focusing on high-performance implementations. At the same time, however, very little work has
been done dealing with the system and physical aspects of FPGAs as they pertain
to cryptographic applications. It should be noted that the main threat to a
cryptographic scheme in the real world is not the cryptanalysis of the actual
algorithm, but rather the exploration of weaknesses of the implementation. Given
this fact, we hope that the contribution at hand is of interest to readers in
academia, industry and government sectors.
In this paper, we start in Section 2 with a list of the advantages of FPGAs
in cryptographic applications from a systems perspective. Then, we highlight
important questions pertaining to the security of FPGAs when used for crypto
algorithms in Section 3. A major part of this contribution is a state-of-the-art
perspective of security issues with respect to FPGAs, by illuminating this problem from different viewpoints and by trying to transfer problems and solutions
from other hardware platforms to FPGAs (Section 4). In Section 5, we provide
a list of open problems. Finally, we end this contribution with some conclusions.
We would like to stress that this contribution is not based on any practical experiments, but on a careful analysis of available publications in the literature
and on our experience with implementing crypto algorithms.
From a systems perspective, the advantages of FPGAs in cryptographic applications
include: algorithm agility, algorithm upload, architecture efficiency, resource
efficiency, algorithm modification, throughput, and cost efficiency.
3.1 Objectives of an Attacker
3.2 Black Box Attack
The classical method to reverse engineer a chip is the so-called Black Box attack.
The attacker inputs all possible combinations, while saving the corresponding
outputs. The intruder is then able to extract the inner logic of the FPGA, with
the help of the Karnaugh map or algorithms that simplify the resulting tables.
This attack is only feasible if a small FPGA with explicit inputs and outputs is
attacked and a lot of processor power is available.
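For a small device with fully observable inputs and outputs, the exhaustive query step can be sketched as follows; the `device` callable below is a hypothetical stand-in for the real chip interface, illustrated here with a toy majority function:

```python
from itertools import product

def extract_truth_table(device, n_inputs):
    """Exhaustively query an n-input black box and record every output.

    `device` is a hypothetical stand-in for the real chip's query
    interface; any callable mapping an input tuple to an output bit works.
    """
    table = {}
    for bits in product((0, 1), repeat=n_inputs):
        table[bits] = device(bits)
    return table

# Toy 3-input "device": a majority function standing in for unknown logic.
majority = lambda bits: int(sum(bits) >= 2)
table = extract_truth_table(majority, 3)

# The minterms (input patterns producing 1) would then feed a simplifier
# such as a Karnaugh map or the Quine-McCluskey algorithm to recover a
# minimal expression of the inner logic.
minterms = [bits for bits, out in table.items() if out == 1]
print(len(table), len(minterms))  # 8 queries, 4 minterms
```

Even this toy version makes the limitation plain: the number of required queries doubles with every additional input pin, which is why the attack only scales to small devices with explicit inputs and outputs.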
3.3 Readback Attack
Readback is a feature that is provided for most FPGA families. This feature
allows reading a configuration out of the FPGA for easy debugging. The idea
of the attack is to read the configuration of the FPGA through the JTAG or
programming interface in order to obtain secret information (e.g., keys) [Dip].
The readback functionality can be prevented with security bits provided by the
manufacturers.
However, it is conceivable that an attacker can overcome these countermeasures in FPGAs with fault injection. This kind of attack was first introduced
in [BDL97], where it was shown how to break public-key algorithms by exploiting
hardware faults. It seems very likely that these attacks can be easily applied to
FPGAs, since they are not specifically targeted at ASICs. If this is in fact feasible, an attacker is able to deactivate the security bits and/or the countermeasures,
resulting in the ability to read out the configuration of the FPGA [Kes,Dip].
3.4 Cloning of SRAM FPGAs
In a standard scenario, the configuration data is stored (unprotected) externally in nonvolatile memory (e.g., PROM) and is transmitted to the FPGA at
power-up in order to configure the FPGA. An attacker could easily eavesdrop
on the transmission and get the configuration file. This attack is therefore feasible for large organizations as well as for those with low budgets and modest
sophistication.
3.5 Reverse Engineering of the Bitstream
The attacks described so far output the bitstream of the FPGA design. In order
to get the design of proprietary algorithms or the secret keys, one has to reverse-engineer the bitstream. The precondition for launching the attack is that the
attacker is in possession of the (unencrypted) bitstream.
FPGA manufacturers claim that the security of the bitstream relies on keeping
the layout of the configuration data secret. This information will only be
made available if a non-disclosure agreement is signed, which is, from a cryptographic point of view, an extremely insecure situation. This security-by-obscurity
approach was broken at least ten years ago when the CAD software company
NEOCad reverse-engineered a Xilinx FPGA [Sea]. Even though a big effort has
to be made to reverse engineer the bitstream, for large organizations it is quite
feasible. In the case of government organizations as attackers, it is also possible
that they will get the information about the design methodology directly from
the vendors or from companies that signed NDAs.
3.6 Physical Attack
The aim of a physical attack is to investigate the chip design in order to get
information about proprietary algorithms or to determine the secret keys by
probing points inside the chip. Hence, this attack targets parts of the FPGA,
which are not available through the normal I/O pins. This can potentially be
achieved through visual inspection and by using tools such as optical microscopes and mechanical probes. However, FPGAs are becoming so complex that
such an attack can only be launched with advanced methods, such as Focused
Ion Beam (FIB) systems. To our knowledge, there are no countermeasures to
protect FPGAs against this form of physical threat. In the following, we will
try to analyze the effort needed to physically attack FPGAs manufactured with
different underlying technologies.
SRAM FPGAs: Unfortunately, there are no publications available that
describe an accomplished physical attack against SRAM FPGAs. This kind of
attack is only treated very superficially in a few articles, e.g., [Ric98]. In the
related area of
SRAM memory, however, there has been a lot of effort by academia and industry
to exploit this kind of attack [Gut96,Gut01,AK97,WKM+96,Sch98,SA93,KK99].
Due to the similarities in structure of the SRAM memory cell and the internal
structure of the SRAM FPGA, it is most likely that the attacks can be employed
in this setting.
Contrary to common wisdom, SRAM memory cells do not entirely lose their
contents when power is cut. The reasons for this effect are rooted in the
physical properties of semiconductors (see [Gut01] for more details). The physical changes are caused mainly by three effects: electromigration, hot carriers,
and ionic contamination. Most publications agree that a device can be regarded
as altered if 1) the threshold voltage has changed by 100 mV or 2) there is a
10% change in transconductance, voltage, or current.
One can attack SRAM memory cells using the access points provided by the
manufacturers. An extreme case of data recovery was described in [AK97], where
a cryptographic key was recovered without special equipment. IDDQ testing
is one of the most widely used methods to analyze SRAM cells; it is based on the
analysis of the current usage [Gut01,WKM+96,Sch98]. Other possibilities are
to use the scan path that IC manufacturers insert for test purposes, or techniques
like bond pad probing [Gut01].
When it becomes necessary to use access points that are not provided by the
manufacturer, the layers of the chip have to be removed. Mechanical probing
with tungsten wire with a radius of 0.1–0.2 µm is the traditional way to discover the needed information. Focused Ion Beam (FIB) workstations can expose
buried conductors and deposit new probe points [KK99]. The electron-beam
tester (EBT) is another measurement method; it measures the energy and
amount of secondary electrons that are reflected.
Resulting from the above discussion of attacks against SRAM memory cells,
it seems likely that a physical attack against SRAM FPGAs can be launched
successfully, assuming that the described techniques can be transferred. However,
physical attacks are quite costly and, keeping the structure and size of
state-of-the-art FPGAs in mind, the attack will probably only be possible for
large organizations, for example intelligence services.
Antifuse FPGAs: In order to be able to detect the existence or nonexistence of a connection, one has to remove layer after layer and/or use
cross-sectioning. Unfortunately, no details have been published regarding this
type of attack. In [Dip], the author states that a lot of trial-and-error is necessary to find the configuration of one cell and that it is likely that the rest
of the chip will be destroyed while analyzing one cell. The main problem with
this analysis is that the isolation layer is much smaller than the whole antifuse
(AF) cell. One study estimates that about 800,000 chips with the same configuration are
necessary to explore one configuration file of an Actel A54SX16 chip with 24,000
system gates [Dip]. A further complication for the attacker is that only about 25%
of all possible connections in an average design are actually used. In [Ric98], a
practical attack against AF FPGAs was performed, and it was possible to alter
one cell in two months at a cost of $1000.
In terms of flash/EEPROM memory cells, one has to consider that the first
write/erase cycles cause a larger shift in the cell threshold [SKM95] and that this
effect becomes less noticeable after ten write/erase cycles [HCSL89]. Thus,
one should program the FPGA about 100 times with random data to avoid this
effect (suggested for flash/EEPROM memory cells in [Gut01]). The phenomenon
of overerasing flash/EEPROM cells can be minimized by first programming all
cells before deleting them.
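The suggested pre-conditioning with random data can be sketched as follows; `program_bitstream` is a hypothetical stand-in for the actual device programming interface, which varies per vendor:

```python
import os

def precondition_flash(program_bitstream, config_size_bytes, cycles=100):
    """Cycle the device with random configurations before the real design.

    `program_bitstream` is a hypothetical stand-in for the device's
    programming interface; any callable accepting raw bytes works here.
    Repeated random writes spread the large threshold shift of the first
    write/erase cycles evenly across all cells, so the shift carries no
    information about the real bitstream.
    """
    for _ in range(cycles):
        program_bitstream(os.urandom(config_size_bytes))

# Demonstration with a list standing in for the programmer interface.
writes = []
precondition_flash(writes.append, config_size_bytes=16)
print(len(writes))  # 100 random configurations written
```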
Preventing the Readback Attack: The readback attack can be prevented
with the security bits set, as provided by the manufacturers, see Section 3.3. If
one wants to make sure that an attacker is not able to apply fault injection, the
FPGA has to be embedded into a secure environment, where after detection of
an interference the whole configuration is deleted or the FPGA is destroyed.
Preventing the Side-Channel Attack: In recent years, there has been a
lot of work done to prevent side-channel attacks (see, e.g., relevant articles in
[PG-,KP00,KNP01,KKP02]). There are software countermeasures that refer
primarily to algorithmic changes, which are also applicable to implementations in
FPGAs. Furthermore, there are hardware countermeasures that often deal either
with some form of power-trace smoothing or with transistor-level changes of the
logic. Neither seem to be easily applicable to FPGAs without support from the
manufacturers. However, some proposals, such as duplicated architectures, might
work on today's FPGAs.
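As an illustration of the algorithmic kind of countermeasure (a generic example, not one taken from this paper), first-order Boolean masking splits the key into two random shares so that no single intermediate value is correlated with the key alone; a minimal sketch:

```python
import secrets

def masked_key_xor(x, key, width=8):
    """Compute x ^ key without ever operating on the unmasked key.

    The key is split into two shares (key == share1 ^ share2) using a
    fresh random mask, and each share is applied separately, so every
    intermediate value is statistically independent of the key itself.
    First-order Boolean masking, shown for a single XOR key addition.
    """
    r = secrets.randbits(width)      # fresh random mask per invocation
    share1, share2 = r, key ^ r      # XOR-based secret sharing of the key
    return (x ^ share1) ^ share2     # equals x ^ key

print(hex(masked_key_xor(0xA5, 0x3C)))  # 0x99, same as 0xA5 ^ 0x3C
```

The price of such masking is extra randomness and computation per operation, and non-linear operations (e.g. S-boxes) require considerably more care than the XOR step shown here.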
Open Problems
At this point we would like to provide a list of open questions and problems
regarding the security of FPGAs. If answered, such solutions would allow
stand-alone FPGAs with much higher security assurance than currently available.
A more detailed description of all points can be found in [WP03].
Conclusions
This contribution analyzed possible attacks against the use of FPGAs in security
applications. For black box attacks, we stated that they are not feasible for
state-of-the-art FPGAs. However, it seems very likely that an attacker can get at
the secret information stored in an FPGA by combining readback attacks with
fault injection. Cloning of SRAM FPGAs and reverse engineering depend on
the specifics of the system under attack; they will probably involve a lot
of effort, but they do not seem entirely impossible. Physical attacks against
FPGAs are very complex due to the physical properties of the semiconductors.
References
[AK97] R.J. Anderson and M.G. Kuhn. Low Cost Attacks on Tamper Resistant Devices. In 5th International Workshop on Security Protocols, pages 125–136. Springer-Verlag, 1997. LNCS 1361.
[Alg] Algotronix Ltd. Method and Apparatus for Secure Configuration of a Field Programmable Gate Array. PCT Patent Application PCT/GB00/04988.
[ASH+93] Seiichi Aritome, Riichiro Shirota, Gertjan Hemink, Tetsuo Endoh, and Fujio Masuoka. Reliability Issues of Flash Memory Cells. Proceedings of the IEEE, 81(5):776–788, May 1993.
[Aus95] K. Austin. Data Security Arrangements for Semiconductor Programmable Devices. United States Patent, No. 5388157, 1995.
[BDL97] D. Boneh, R.A. DeMillo, and R.J. Lipton. On the Importance of Checking Cryptographic Protocols for Faults. In EUROCRYPT '97, pages 37–51. Springer-Verlag, 1997. LNCS 1233.
[Dip] B. Dipert. Cunning circuits confound crooks. http://www.einsite.net/ednmag/contents/images/21df2.pdf.
[Eri99] C.R. Erickson. Configuration Stream Encryption. United States Patent, No. 5970142, 1999.
[EYCP01] A. Elbirt, W. Yip, B. Chetwynd, and C. Paar. An FPGA-based performance evaluation of the AES block cipher candidate algorithm finalists. IEEE Transactions on VLSI Design, 9(4):545–557, August 2001.
[Gut96] P. Gutmann. Secure Deletion of Data from Magnetic and Solid-State Memory. In Sixth USENIX Security Symposium, pages 77–90, July 22–25, 1996.
[Gut01] P. Gutmann. Data Remanence in Semiconductor Devices. In 10th USENIX Security Symposium, pages 39–54, August 13–17, 2001.
[HCSL89] Sameer Haddad, Chi Chang, Balaji Swaminathan, and Jih Lien. Degradations due to hole trapping in flash memory cells. IEEE Electron Device Letters, 10(3):117–119, March 1989.
[Jef02] G.P. Jeffrey. Field programmable gate arrays. United States Patent, No. 6356637, 2002.
[KB00] S.H. Kelem and J.L. Burnham. System and Method for PLD Bitstream Encryption. United States Patent, No. 6118868, 2000.
[Kea01] T. Kean. Secure Configuration of Field Programmable Gate Arrays. In FPL 2001, pages 142–151. Springer-Verlag, 2001. LNCS 2147.
[Kes] D. Kessner. Copy Protection for SRAM based FPGA Designs. http://www.free-ip.com/copyprotection.html.
[KJJ99] P. Kocher, J. Jaffe, and B. Jun. Differential Power Analysis. In CRYPTO '99, pages 388–397. Springer-Verlag, 1999. LNCS 1666.
[KK99] O. Kömmerling and M.G. Kuhn. Design Principles for Tamper-Resistant Smartcard Processors. In USENIX Workshop on Smartcard Technology, pages 9–20, May 1999.
[KKP02] B.S. Kaliski Jr., Ç.K. Koç, and C. Paar, editors. Cryptographic Hardware and Embedded Systems – CHES 2002. Springer-Verlag, 2002. LNCS 2523.
[KNP01] Ç.K. Koç, D. Naccache, and C. Paar, editors. Cryptographic Hardware and Embedded Systems – CHES 2001. Springer-Verlag, 2001. LNCS 2162.
[KP00] Ç.K. Koç and C. Paar, editors. Cryptographic Hardware and Embedded Systems – CHES 2000. Springer-Verlag, 2000. LNCS 1965.
[PG-]
[PGP+91] C. Papadas, G. Ghibaudo, G. Pananakakis, C. Riva, P. Ghezzi, C. Gounelle, and P. Mortini. Retention characteristics of single-poly EEPROM cells. In European Symposium on Reliability of Electron Devices, Failure Physics and Analysis, page 517, October 1991.
[PWF+00] R.C. Pang, J. Wong, S.O. Frake, J.W. Sowards, V.M. Kondapalli, F.E. Goetting, S.M. Trimberger, and K.K. Rao. Nonvolatile/battery-backed key in PLD. United States Patent, No. 6366117, Nov. 28, 2000.
[Ric98] G. Richard. Digital Signature Technology Aids IP Protection. In EETimes News, 1998. http://www.eetimes.com/news/98/1000news/digital.html.
[SA93] J. Soden and R.E. Anderson. IC failure analysis: techniques and tools for quality and reliability improvement. Proceedings of the IEEE, 81(5):703–715, May 1993.
[Sch98] D.K. Schroder. Semiconductor Material and Device Characterization. John Wiley and Sons, 1998.
[Sea] G. Seamann. FPGA Bitstreams and Open Designs. http://www.opencollector.org/.
[SKM95] K.T. San, C. Kaya, and T.P. Ma. Effects of erase source bias on Flash EPROM device reliability. IEEE Transactions on Electron Devices, 42(1):150–159, January 1995.
[SW99] C. Sung and B.I. Wang. Method and Apparatus for Securing Programming Data of Programmable Logic Device. United States Patent, No. 5970142, June 22, 1999.
[TCH93] Jiang Tao, Nathan Cheung, and Chenming Hu. Metal Electromigration Damage Healing Under Bidirectional Current Stress. IEEE Electron Device Letters, 14(12):554–556, December 1993.
[vdPK90] J. van der Pol and J. Koomen. Relation between the hot carrier lifetime of transistors and CMOS SRAM products. In IRPS 1990, page 178, 1990.
[WKM+96] T.W. Williams, R. Kapur, M.R. Mercer, R.H. Dennard, and W. Maly. IDDQ Testing for High Performance CMOS – The Next Ten Years. In ED&TC96, pages 578–583, 1996.
[WP03] T. Wollinger and C. Paar. How Secure Are FPGAs in Cryptographic Applications? (Long Version). Report 2003/119, IACR, 2003. http://eprint.iacr.org/.
[Xil] Xilinx Inc. Using Bitstream Encryption. Handbook of the Virtex II Platform. http://www.xilinx.com.
[YN00] Kun-Wah Yip and Tung-Sang Ng. Partial-Encryption Technique for Intellectual Property Protection of FPGA-based Products. IEEE Transactions on Consumer Electronics, 46(1):183–190, 2000.