IC DESIGN FOR AGE-RELATED TIMING DEGRADATION USING MACHINE LEARNING

Advisor

Srivastava, Ankur

Abstract

Timing constraints are a major factor in modern IC design. In each clock cycle, data signals must leave one register, travel through the combinational logic of the design, and arrive at another register within a narrow window to satisfy both setup and hold time constraints. Static timing analysis (STA) is therefore a major step in the design process. Designs often have very little timing margin, and many design revisions may be needed to ensure that all path delays fall within acceptable ranges. However, traditional STA considers only the timing of freshly manufactured circuits. As circuits age, effects such as negative bias temperature instability (NBTI) and hot carrier injection alter the timing properties of the device. For devices with a long expected service time, it may not be possible to neglect these effects when considering device timing constraints. In this dissertation, we investigate the design-level ramifications of NBTI and use machine learning (ML) to predict end-of-life timing information for the design. We then explore design techniques that use this model to improve resilience against timing degradation.
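
For illustration, the following Python sketch shows the setup and hold checks behind these constraints, with an aging term added to the path delay. The clock period, delays, and setup/hold times are assumed values chosen for the example, not figures from the dissertation.

# Setup/hold check at one capture register, with an aging increment added
# to the combinational path delay. All numbers are illustrative assumptions.

def setup_slack(clock_period, path_delay, setup_time, aging_delta=0.0):
    """Setup slack: data must arrive setup_time before the capturing clock edge."""
    return clock_period - (path_delay + aging_delta) - setup_time

def hold_slack(shortest_path_delay, hold_time):
    """Hold slack: the fastest data path must stay stable hold_time after the clock edge."""
    return shortest_path_delay - hold_time

T = 1.0          # clock period (ns), assumed
d_fresh = 0.90   # fresh longest combinational path delay (ns), assumed
d_min = 0.15     # shortest path delay to the same register (ns), assumed
t_su, t_h = 0.05, 0.03

print("fresh setup slack:", setup_slack(T, d_fresh, t_su))        # ~0.05 ns -> meets timing
print("aged setup slack :", setup_slack(T, d_fresh, t_su, 0.08))  # ~-0.03 ns -> violation
print("hold slack       :", hold_slack(d_min, t_h))               # ~0.12 ns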

We begin our work by evaluating the degree of NBTI degradation over the lifetime of an integrated circuit (IC). From our simulation data, we observe that the delay increase due to NBTI is significant compared to total path delays. The traditional approach to protecting a design from errors caused by this additional delay is guardbanding, which anticipates a fixed amount of degradation when setting timing constraints for the designer. This offsets NBTI by artificially reducing the delay allowed along timing paths, effectively shrinking the clock period available to the designer. However, NBTI does not age a design uniformly, and guardbanding is overly pessimistic because it assumes worst-case degradation for every path. To remove some of this pessimism and increase the delay budget available to designers, we propose a machine learning technique to predict the effects of NBTI in a design. While existing approaches to modeling NBTI require circuit simulations for each gate in the design, our technique provides designers with end-of-life path delay data at a speed suitable for integration into design automation tools. Once path-specific NBTI degradation data is available to design tools, new algorithms can be developed that are less pessimistic than uniform worst-case guardbanding. The remainder of this dissertation explores applications of our machine learning technique. First, we investigate the feasibility of planned obsolescence using NBTI as a failure mechanism. Then, we develop an NBTI-aware placement tool by integrating our machine learning technique into an open-source design tool.
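
As an illustration of the modeling idea rather than the dissertation's exact formulation, the sketch below fits a regression model that maps simple per-path features to an aged path delay, standing in for slow per-gate NBTI simulation. The feature set, the gradient-boosting model, and the synthetic data are all assumptions made for this example.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n_paths = 500

# Assumed per-path features: fresh delay (ns), gate count, and the mean
# probability that a gate on the path holds an NBTI-stressing input value.
X = np.column_stack([
    rng.uniform(0.2, 1.0, n_paths),
    rng.integers(5, 40, n_paths),
    rng.uniform(0.0, 1.0, n_paths),
])
# Synthetic "aged delay" labels standing in for per-gate circuit simulation.
y = X[:, 0] * (1.0 + 0.10 * X[:, 2]) + 0.001 * X[:, 1]

model = GradientBoostingRegressor().fit(X, y)

# Predict the end-of-life delay of a new path without simulating its gates.
new_path = np.array([[0.85, 22, 0.7]])
print("predicted end-of-life delay (ns):", model.predict(new_path)[0])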

Planned obsolescence refers to techniques used by companies to reduce the performance of their products, or cause them to fail, after a certain period of time has passed, usually once new products have been released. The device failures encourage previous customers to purchase replacements, increasing demand and sales of new products. Often, planned obsolescence is implemented through software updates that intentionally slow performance or decrease battery life. This gives the company direct control over when the degradation begins and even which devices are affected. However, this approach is potentially detectable by consumers, who may notice a sudden decrease in performance and investigate the cause. We describe another approach to planned obsolescence that would allow a company to fix an obsolescence schedule at design time and introduce performance penalties in a more natural way. Inserting delay buffers into the netlist at design time counteracts the guardband in the circuit timing, so that natural age-related degradation eventually causes timing violations. This approach to planned obsolescence is more difficult to detect, since it is embedded in the original netlist and follows the same trends as natural aging.
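
A minimal sketch of the buffer-sizing arithmetic behind this idea, assuming a single guardband value and an ML-predicted degradation at the target date (both numbers are illustrative, not taken from the dissertation):

def buffer_delay_to_insert(guardband, predicted_degradation_at_target):
    """Extra path delay so that natural aging alone exhausts the guardband by the target date."""
    return max(0.0, guardband - predicted_degradation_at_target)

guardband = 0.10               # worst-case aging margin baked into the constraints (ns), assumed
degradation_at_target = 0.06   # ML-predicted NBTI delay increase at the obsolescence date (ns), assumed

extra = buffer_delay_to_insert(guardband, degradation_at_target)
print(f"insert about {extra:.2f} ns of buffer delay on the path")  # 0.04 ns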

Many design algorithms at various stages of the IC design process utilize timing data from STA. However, STA timing data represents circuits when they are freshly manufactured, which is when they are furthest from failure. We explore the use of end-of-life timing in design algorithms, specifically in timing-driven global placement. Global placement is a vital step in the design process that allocates standard cells and large macros across the chip canvas. A placement solution must place cells connected by common nets near each other so the device can meet timing, but there is a limit to how dense any region of the canvas can be. We use our machine learning technique to provide end-of-life timing data for the longest critical paths to a standard placement tool. Without modifying the underlying placement algorithm, we are able to produce NBTI-aware designs adapted to the anticipated NBTI degradation.
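
As one hypothetical way such data could steer an unmodified timing-driven placer, the sketch below converts ML-predicted end-of-life slack into per-net weights, so that nets on paths that will become critical late in life are pulled tighter. The weighting formula and slack values are assumptions, not the dissertation's exact scheme.

def net_weight(aged_slack, clock_period, base=1.0, boost=4.0):
    """Nets on paths with little predicted end-of-life slack receive higher placement weight."""
    criticality = max(0.0, 1.0 - aged_slack / clock_period)
    return base + boost * criticality

clock_period = 1.0
aged_slacks = {"net_a": 0.02, "net_b": 0.25, "net_c": 0.60}  # predicted end-of-life slacks (ns), assumed

for net, slack in aged_slacks.items():
    print(net, round(net_weight(slack, clock_period), 2))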
