Designers can protect IP with FPGAs and bitstream encryption

April 1, 2007

By Amr El-Ashmawi

Security continues to be of utmost importance in military, aerospace, and government applications. This is particularly true with emerging technologies and an intense demand to safeguard intellectual property (IP). Until now, the technology for implementing conventional security has been cumbersome to design with and costly. Newer trends, however, are toward greater commercial-off-the-shelf (COTS) security, ease of design, and lower cost, thanks to advances in field-programmable gate array (FPGA) design.

Take, for example, FPGAs that conform to the Federal Information Processing Standards (FIPS)-197 standard. They support configuration bitstream encryption using the 128-bit Advanced Encryption Standard (AES) and a nonvolatile key. AES is the most advanced encryption algorithm available today. A user-defined AES key can be programmed into the 128-bit nonvolatile key storage in an FPGA device.

In these FPGA-based designs, it is prudent for system engineers to pursue secure approaches that protect the IP against copying, reverse engineering, and tampering. Selecting an FPGA with these features also prevents intruders from reading back any configuration file, whether encrypted or unencrypted, adding another layer of security.

FPGAs based on static random-access memory (SRAM) are taking on more important roles in military and aerospace system designs because of their relatively high logic capacity, low cost, and ease of design. These devices often form the core of a system containing valuable IP, so protecting FPGA-based designs becomes critical. Security solutions featuring AES-based FPGAs come at a time when the U.S. Department of Defense is firmly setting policies, procedures, and guidelines to implement antitampering (AT) techniques in weapon-system acquisition programs.

Billions of dollars are being invested in advanced military and aerospace systems, and they are becoming increasingly vulnerable to exploitation. As a result, military advantages are weakened, expected system combat life is shortened, and technological competitiveness is eroded. Moreover, the risk of exploitation continues to mount due to weapon-system exports, diverse battlefields, and uncertainties about future allies. AES literally holds the key to protecting this valuable military and aerospace IP through encryption.

Applications

Major programs such as the U.S. Army Future Combat Systems (FCS), the F-35 Joint Strike Fighter (JSF), and the Joint Tactical Radio System (JTRS) are pushing technological capabilities on all fronts to their limits. The programmable logic and FPGAs used in the electronics of these complex systems are highly reliable, and they give designers the security capabilities necessary to safeguard highly sought-after IP in a variety of military and aerospace applications, including missiles and munitions, electronic warfare, secure communications, and remote sensors and surveillance.

Missile and munitions systems have limited space and require a long shelf life. With nonvolatile encryption-key storage in the FPGA, there is no need to continuously monitor battery life to meet shelf-life requirements. Nonvolatile encryption-key storage is also ideal in electronic-warfare systems, which have limited space and low maintenance requirements. Nonvolatile storage is particularly important in these systems because designers are reluctant to include chemical content such as batteries.

Also, with the need to maintain secure communications on the battlefield, the configuration bitstream encryption of FPGA devices provides an additional level of security above and beyond current methods used in Type 1 applications (cryptography endorsed by the NSA for securing classified and sensitive U.S. Government information when appropriately keyed) and Type 3 applications (those protecting sensitive but unclassified information with algorithms such as DES, Triple DES, and AES; AES may also be used in NSA-certified Type 1 applications). For remote sensors and surveillance systems, these FPGAs protect critical IP in applications exposed to hostile environments throughout the battlefield.

Antitampering

Before launching into a specific design, savvy system-engineering management should perform an analysis to determine whether antitampering is required in a given design. Several key steps and considerations are involved in this decision-making process.

Initially, system designers must determine whether the targeted technology and its associated applications and hardware are critical. Next, they should identify environmental and export threats, technology vulnerabilities, and the types of internal and external attacks and their timelines. They must also weigh the impact on overall cost and national security, as well as the consequences of lost capabilities and competitive advantages.

Once these steps are well defined and characterized, the next step is to understand the value that FPGA design security offers and the requirements for FIPS-197-compliant 128-bit AES nonvolatile keys. Subsequent and final steps involve FPGA encryption implementation, with development and testing followed by system verification and validation. At this point, designers have reached the final antitampering implementation and the design can begin in earnest.

The right security

There are two important factors military and aerospace system engineers must consider when starting a design. One is selecting the correct encryption algorithm; the second is the choice of key storage. AES supports key sizes of 128, 192, and 256 bits, and it replaces the Data Encryption Standard (DES), which has a 56-bit key and 64-bit data blocks. Larger key sizes equate to increased security, and AES encrypts data faster than Triple-DES, a DES enhancement. Studies show that if a machine could be built that discovers a DES key in one second, it would take that same machine about 149 trillion years to discover a 128-bit AES key.
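That comparison is simple to sanity-check. The short Python sketch below (an illustrative calculation only) derives the figure from the 56-bit and 128-bit key spaces:

    # Compare the DES and AES-128 key spaces by brute-force search time.
    des_keys = 2 ** 56            # DES uses a 56-bit key
    aes128_keys = 2 ** 128        # AES-128 uses a 128-bit key

    ratio = aes128_keys // des_keys        # 2**72, about 4.7e21 times more keys
    seconds_per_year = 365.25 * 24 * 3600

    # If an exhaustive DES key search takes one second, the same machine needs
    # roughly 2**72 seconds for AES-128.
    years = ratio / seconds_per_year
    print(f"{years:.3g} years")            # about 1.5e14 years, i.e. ~149 trillion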

Encryption converts electronic data to an unintelligible form called ciphertext; decrypting the ciphertext converts the data back into its original form, or plaintext. AES, developed by two Belgian cryptographers, Joan Daemen and Vincent Rijmen, is also known as the Rijndael algorithm.

The AES algorithm is a symmetric block cipher that encrypts (enciphers) and decrypts (deciphers) electronic data. Symmetric-key means the same key is used for encryption and decryption. Block cipher means the data is processed in fixed-size blocks. Symmetric-key block-cipher encryption algorithms are widely used in many industries to provide data security because of their strong protection, efficiency, ease of implementation, and fast data-processing speed.
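For readers who want to see the symmetric-key idea in miniature, the following Python sketch, which assumes the third-party pycryptodome package is installed, encrypts and then decrypts a single 16-byte block with one shared 128-bit key. It illustrates the concept only, not the FPGA's built-in engine:

    from Crypto.Cipher import AES   # pycryptodome
    import os

    key = os.urandom(16)            # one 128-bit key, used for both directions
    block = b"exactly16bytes!!"     # AES processes data in 16-byte blocks

    ciphertext = AES.new(key, AES.MODE_ECB).encrypt(block)   # plaintext -> ciphertext

    # The same key converts the ciphertext back into the original plaintext.
    assert AES.new(key, AES.MODE_ECB).decrypt(ciphertext) == block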

The choice of key storage is the second most important design consideration. A key is stored in either volatile or nonvolatile storage, depending on the chip vendor. With volatile storage, once power is off the key is lost unless an external battery is connected to the chip as a backup power supply. Nonvolatile key storage, on the other hand, gives the designer greater flexibility.

Poor reliability is the biggest problem batteries pose for volatile storage. Battery life is affected by temperature and moisture levels, and when a battery dies, the key is lost. As a result, the device can no longer be configured, and the equipment must be returned to the vendor for repair and key reloading. Battery backup also raises cost because it requires additional components, more board space, and extra engineering work, and it makes the board more difficult to manufacture.

Batteries usually cannot stand the high-temperature reflow process, and have to be soldered onto the board afterward, which involves an added manufacturing step. Volatile key storage also requires a key to be programmed into the device after it is soldered on the board.

Nonvolatile key storage, because it is one-time programmable, makes the key tamper-proof. This is not possible with volatile storage, because the battery can be removed and the FPGA can then be configured with a regular, unencrypted configuration file.

Design-in top security

The secure configuration design flow consists of three steps and involves both the Quartus II development software and the FPGA. The first step is programming the security key into the FPGA, which requires two 128-bit keys (Key 1 and Key 2) to generate a key programming file. That file, containing the Key 1 and Key 2 information, is then loaded into the FPGA.

The AES encryption engine built into the FPGA then generates the real key used to decrypt the configuration data in step three. The real key, created by encrypting Key 1 with Key 2, is processed by a proprietary function before being stored in the 128-bit nonvolatile key storage.

In step two, the configuration file is encrypted and stored in external memory. Quartus II software requires the two 128-bit keys (Key 1 and Key 2) to encrypt the configuration file. The AES encryption engine generates the real key by encrypting Key 1 with Key 2. The real key is used to encrypt the configuration file, which is then loaded into external memory, such as a configuration or Flash device.

In the third step, the FPGA is configured. At system power-up, the external memory device sends the encrypted configuration file to the FPGA. The 128-bit nonvolatile key in the FPGA is processed by the inverse of the proprietary function to regenerate the real key. The AES decryption engine then uses the real key to decrypt the configuration file, and the FPGA configures itself.
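A compact Python model of the three-step flow may help make the key handling concrete. It is only an illustrative sketch: the cipher mode, the stand-in obfuscation function, and the random stand-in bitstream are assumptions (the actual proprietary function and file formats are not public), and it uses the third-party pycryptodome package rather than anything in the Quartus II tool chain:

    from Crypto.Cipher import AES   # pycryptodome
    import os

    # Step 1: two user-supplied 128-bit keys; the real key is produced by
    # encrypting Key 1 with Key 2.
    key1, key2 = os.urandom(16), os.urandom(16)
    real_key = AES.new(key2, AES.MODE_ECB).encrypt(key1)

    def obfuscate(k):               # stand-in for the proprietary function
        return bytes(b ^ 0xA5 for b in k)

    def deobfuscate(k):             # its inverse (XOR with a constant is self-inverse)
        return bytes(b ^ 0xA5 for b in k)

    stored_key = obfuscate(real_key)         # value held in nonvolatile key storage

    # Step 2: the configuration file is encrypted with the real key and written
    # to external memory (a configuration or Flash device).
    bitstream = os.urandom(16 * 1024)        # random stand-in for a configuration file
    encrypted_bitstream = AES.new(real_key, AES.MODE_ECB).encrypt(bitstream)

    # Step 3: at power-up, the FPGA recovers the real key from nonvolatile storage
    # with the inverse function and decrypts the incoming bitstream.
    recovered_key = deobfuscate(stored_key)
    plain_bitstream = AES.new(recovered_key, AES.MODE_ECB).decrypt(encrypted_bitstream)
    assert plain_bitstream == bitstream      # the device configures itself with this data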

Security break-ins

As part of the design process, it is important for system designers to understand different types of security breaches. These are categorized as copying, reverse engineering, and tampering. Copying includes black-box attack, readback attack, configuration-bitstream probing, and programming-state probing. The intruder can also reverse engineer the configuration file or the FPGA itself. Tampering is performed by reprogramming the FPGA.

Copying means making identical copies of a design without understanding how it works. Copying can be done either by reading the design out of the memory device or by capturing the configuration file as it is sent from the memory device to the FPGA at power-up. A stolen design can then be used to configure other FPGAs. This approach is a primary form of IP theft and can cause significant revenue loss to the designer.

Reverse engineering involves analyzing the configuration file to recreate the original design at the register-transfer level (RTL) or in schematic form. A recreated design can then be modified to gain a competitive edge. This is a more complex form of IP theft than copying and usually requires significant technical expertise. It is also time- and resource-intensive, and sometimes requires more work than creating a design from scratch.

Tampering is modifying the design stored in the device or replacing it with a different design. A tampered device may contain harmful design code that can cause a system to malfunction or steal sensitive data. This type of design-security breach is a particular concern in military and aerospace applications.

Also, it should be noted that most nonvolatile reprogrammable FPGAs have a readback feature that permits configuration data to be read back for debugging purposes. There are usually security bits the designer can set for the device. When the security bits are not set, readback is allowed and it is straightforward to obtain the configuration data. However, when the security bits are set, readback is disabled. One way to conduct a readback attack in that case is to detect where the security bits are located in the FPGA and deactivate them to re-enable readback.

Defense against intrusions

Some FPGAs make it extremely difficult, if not virtually impossible, for attackers to obtain IP from highly secure military and aerospace designs. In particular, their security bits are very hard to detect and deactivate, giving designers strong protection against copying. Security defenses are set up in the following way.

Polyfuses storing the security key are hidden under layers of metal among hundreds of other polyfuses. It is nearly impossible to determine the function of a particular fuse by simple visual inspection. The programming status of polyfuses used for other functions can also differ from device to device.

This randomness makes it more difficult to identify which fuses store the security key. Also, even if the polyfuses storing the security key are identified, the real key used for decryption is not revealed because it is processed by the proprietary function prior to storage. Without knowing the real key, the design cannot be decrypted.

These FPGAs are also secure against readback attacks because they do not support configuration-file readback, so the configuration file cannot be read back after it is decrypted within the FPGA. Further, these designs cannot be copied by programming the security key into another FPGA and configuring it with the encrypted configuration file. Two 128-bit keys are required to program the security key into the FPGA, and because AES is used to generate the real key, it is virtually impossible to derive Key 1 and Key 2 from the security key.

Reverse engineering a design from the configuration file is difficult and time-consuming. The FPGA configuration file contains millions of bits and the configuration-file formats are proprietary and confidential. To reverse engineer a design requires reverse engineering of the FPGA or the design software to reveal the mapping from the configuration file to the device resources.

Reverse engineering FPGAs is more difficult than reverse engineering ASICs. These particular FPGAs are manufactured on an advanced 90-nanometer process technology, and the standard tools and knowledge needed to reverse engineer such cutting-edge devices are not readily available. It can take a significant amount of time and resources to reverse engineer just one FPGA logic block. With configuration bitstream encryption, reverse engineering is made even more difficult, because finding the security key needed to decrypt the configuration file is just as difficult as it is in the copying scenario.

Polyfuses used to store the security key are nonvolatile and one-time programmable to guard against tampering. No battery is needed because the design includes a nonvolatile key. After the FPGA is programmed with the key, it can only be configured with configuration files encrypted with the same key. Attempts to configure an FPGA with an unencrypted configuration file or a configuration file encrypted with the wrong key result in configuration failure.

Aside from the FPGA security solutions discussed above, other design options available to military and aerospace designers include SRAM-based FPGAs limited to Triple-DES encryption, flash-based FPGAs, and antifuse-based FPGAs.

Kinds of attacks

An attack via electron-emissions detection involves removing the device’s package to expose the die, and then placing the device in a vacuum chamber. The attacker then uses a transmission-electron microscope (TEM) to detect and display emissions. There’s also the power attack, which involves measuring an FPGA’s power consumption to determine which function the device is performing.

In a readback attack on Flash-based FPGAs, the amount of effort required depends on how well the security bits are protected in the device. Probing each floating gate of a Flash-based FPGA is time-consuming, and reverse engineering a Flash FPGA configuration file is not easy because the configuration file must first be obtained. Tampering with a Flash-based FPGA, however, is easy because the device is reprogrammable, so a tamper-proof mechanism needs to be used if tampering is a concern.

Programming-state probing is also used to attack antifuse-based FPGAs, with techniques such as focused-ion-beam (FIB) technology and the scanning-electron microscope (SEM). Analyzing the programming state of an antifuse-based FPGA is extremely time-consuming, however, because of the millions of antifuse links and the small percentage that are programmed.

In an era of ever-increasing security concerns, SRAM-based FPGAs combined with bitstream encryption offer designers of military systems critical advantages. In addition to high density, high performance, low development risk, and fast time-to-market benefits over other implementations, they also deliver a secure approach for protecting proprietary designs and IP.

Amr El-Ashmawi is senior manager at the Altera Corp. Military and Aerospace Business Unit in San Jose, Calif.
