Applying Perceptrons to Speculation in Computer Architecture

dc.contributor.advisor: Franklin, Manoj
dc.contributor.author: Black, Michael David
dc.contributor.department: Electrical Engineering
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2007-06-22T05:32:26Z
dc.date.available: 2007-06-22T05:32:26Z
dc.date.issued: 2007-04-05
dc.description.abstract: Speculation plays an ever-increasing role in optimizing the execution of programs in computer architecture. Speculative decision-makers are typically required to have high speed and small size, limiting their complexity and capability. Because of these restrictions, predictors often consider only a small subset of the available data in making decisions, and consequently do not realize their potential accuracy. Perceptrons, or simple neural networks, can be highly useful in speculation because they can examine larger quantities of available data and identify which data lead to accurate results. Recent research has demonstrated that perceptrons can operate successfully within the strict size and latency restrictions of speculation in computer architecture.

This dissertation first studies how perceptrons can be made to predict accurately when they directly replace the traditional pattern-table predictor. Several weight-training methods and multiple-bit perceptron topologies are modeled and evaluated on their ability to learn the data patterns that pattern tables can learn. The effects of interference between past data on perceptrons are evaluated, and different interference-reduction strategies are explored.

Perceptrons are then applied to two speculative applications: data value prediction and dataflow critical-path prediction. Several new perceptron value predictors are proposed that can consider longer or more varied data histories than existing table-based value predictors. These include a global-based local predictor that uses global correlations between data values to predict past local values, a global-based global predictor that uses global correlations to predict past global values, and a bitwise predictor that can use global correlations to generate new data values. Several new perceptron criticality predictors are proposed that use global correlations between instruction behaviors to accurately determine whether instructions lie on the critical path. These predictors are evaluated against local table-based approaches on a custom cycle-accurate processor simulator and are shown, on average, to have both superior accuracy and higher instructions-per-cycle performance.

Finally, the perceptron predictors are simulated using the different weight-training approaches and multiple-bit topologies. It is shown that for these applications, perceptron topologies and training approaches must be selected that respond well to highly imbalanced and poorly correlated past data patterns.
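Perceptron predictors of the kind discussed in the abstract generally share one mechanism: a table of small signed weights is indexed by an address, the weights are combined with a recorded history of past outcomes, and the sign of the resulting sum drives the speculative decision, with the weights adjusted once the true outcome is known. As a rough illustration only, the C sketch below shows a conventional global-history perceptron predictor of this kind; the history length, table size, threshold formula, and all names are illustrative assumptions and are not taken from the dissertation or its simulator.

/* Minimal sketch of a global-history perceptron predictor.
 * All sizes, thresholds, and names here are illustrative assumptions,
 * not values from the dissertation. */
#include <stdio.h>
#include <stdlib.h>

#define HIST_LEN  16                            /* past outcomes the perceptron sees  */
#define NUM_ROWS  256                           /* perceptron table rows (by address) */
#define THRESHOLD ((int)(1.93 * HIST_LEN + 14)) /* commonly used training threshold   */

static int weights[NUM_ROWS][HIST_LEN + 1];     /* index 0 holds the bias weight      */
static int history[HIST_LEN];                   /* past outcomes recorded as +1 / -1  */

/* Dot product of the weights with the history, plus bias; the sign predicts. */
static int perceptron_output(int row)
{
    int y = weights[row][0];
    for (int i = 0; i < HIST_LEN; i++)
        y += weights[row][i + 1] * history[i];
    return y;
}

/* Train only on a misprediction or when confidence is below the threshold,
 * nudging each weight toward agreement with the actual outcome.
 * (Real designs also saturate weights to a small bit width; omitted here.) */
static void perceptron_train(int row, int y, int outcome /* +1 or -1 */)
{
    if (((y >= 0) != (outcome == 1)) || abs(y) <= THRESHOLD) {
        weights[row][0] += outcome;
        for (int i = 0; i < HIST_LEN; i++)
            weights[row][i + 1] += outcome * history[i];
    }
    /* Shift the new outcome into the history register. */
    for (int i = HIST_LEN - 1; i > 0; i--)
        history[i] = history[i - 1];
    history[0] = outcome;
}

int main(void)
{
    /* Toy usage: learn an alternating outcome stream for one address row. */
    int row = 42 % NUM_ROWS;
    for (int n = 0; n < 32; n++) {
        int outcome = (n % 2) ? 1 : -1;
        int y = perceptron_output(row);
        printf("step %2d: predict %+d, actual %+d\n", n, y >= 0 ? 1 : -1, outcome);
        perceptron_train(row, y, outcome);
    }
    return 0;
}

The threshold-gated update is what lets training stop once the predictor is confident and correct, so long histories can be considered without the weights growing without bound; the weight-training variants the dissertation evaluates replace or modify this update step.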
dc.format.extent: 2523382 bytes
dc.format.extent: 435938 bytes
dc.format.extent: 124782 bytes
dc.format.mimetype: application/pdf
dc.format.mimetype: application/octet-stream
dc.format.mimetype: application/octet-stream
dc.identifier.uri: http://hdl.handle.net/1903/6725
dc.language.iso: en_US
dc.subject.pqcontrolled: Engineering, Electronics and Electrical
dc.subject.pqcontrolled: Computer Science
dc.subject.pquncontrolled: perceptron
dc.subject.pquncontrolled: speculation
dc.subject.pquncontrolled: value prediction
dc.subject.pquncontrolled: critical-path
dc.subject.pquncontrolled: branch prediction
dc.subject.pquncontrolled: computer architecture
dc.title: Applying Perceptrons to Speculation in Computer Architecture
dc.type: Dissertation

Files

Original bundle (3 files)

Name: umi-umd-4201.pdf
Size: 2.41 MB
Format: Adobe Portable Document Format

Name: mysimulator.c
Size: 425.72 KB
Format: Unknown data format

Name: scripts.sh
Size: 121.86 KB
Format: Unknown data format