Bootstrap aggregating branch predictors

Ibrahim Burak Karsli, University of Rhode Island

Abstract

After more than two decades of extensive research on branch prediction, branch mispredictions remain an important performance and power bottleneck for today's aggressive processors. This research has produced very sophisticated and accurate branch predictor designs, with the TAGE predictor being the current state of the art. In this work, instead of directly improving an individual predictor's accuracy, I focus on an orthogonal statistical method called bootstrap aggregating, or bagging. Bagging improves overall accuracy by using an ensemble of predictors, each trained on a slightly different data set. Each predictor in the ensemble (the members may be identical or different designs) is trained on a training set resampled with replacement (bootstrapping). The final prediction is then produced by weighting or majority voting (aggregating). This work shows that applying bagging improves performance more than simply increasing predictor size.
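The bagging scheme described above can be sketched in a few lines. The following is a minimal, illustrative Python sketch, not the thesis's actual hardware design: it assumes a toy 2-bit saturating-counter predictor indexed by branch PC, bootstrap resampling of a hypothetical branch trace, and majority voting across the ensemble.

```python
import random

def bootstrap_sample(training_set, rng):
    """Resample the training set with replacement (bootstrapping)."""
    return [rng.choice(training_set) for _ in training_set]

class CounterPredictor:
    """Toy 2-bit saturating-counter predictor keyed by branch PC
    (an illustrative stand-in for a real predictor such as TAGE)."""
    def __init__(self):
        self.counters = {}  # pc -> counter state in [0, 3]

    def train(self, samples):
        for pc, taken in samples:
            c = self.counters.get(pc, 2)  # start weakly taken
            self.counters[pc] = min(c + 1, 3) if taken else max(c - 1, 0)

    def predict(self, pc):
        return self.counters.get(pc, 2) >= 2  # True = predict taken

def bagged_predict(predictors, pc):
    """Aggregate the ensemble's predictions by majority vote."""
    votes = sum(p.predict(pc) for p in predictors)
    return votes * 2 > len(predictors)

# Build an ensemble: each member trains on its own bootstrap resample
# of the same (hypothetical) branch trace.
rng = random.Random(0)
trace = [(0x40, True)] * 8 + [(0x40, False)] * 2  # mostly-taken branch
ensemble = [CounterPredictor() for _ in range(5)]
for p in ensemble:
    p.train(bootstrap_sample(trace, rng))

print(bagged_predict(ensemble, 0x40))
```

Because each member sees a different resampling of the trace, the members' counter states diverge slightly; the majority vote smooths out individual members' noise, which is the effect bagging exploits.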

Subject Area

Engineering, Computer|Engineering, General

Recommended Citation

Ibrahim Burak Karsli, "Bootstrap aggregating branch predictors" (2014). Dissertations and Master's Theses (Campus Access). Paper AAI1562249.
http://digitalcommons.uri.edu/dissertations/AAI1562249
