Measuring Computer Performance: SUMMARY
Copyright 2004 David J. Lilja

Fundamental Solution Techniques
- Measurement
- Simulation
- Analytical modeling

Performance Metrics
- Characteristics of good metrics
- Processor and system metrics
- Speedup and relative change (defined just below)
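
As a quick reference for the last bullet, these are the standard definitions of speedup and relative change in terms of execution times; the notation (T_1 for the baseline system, T_2 for the system being compared) is introduced here only for this sketch.

```latex
% Speedup of system 2 with respect to system 1, using execution times
\[ S_{2,1} = \frac{T_1}{T_2} \]

% Relative change: the fractional improvement over the baseline,
% which is simply the speedup minus one
\[ \Delta_{2,1} = \frac{T_1 - T_2}{T_2} = S_{2,1} - 1 \]
```

For example, if a program runs in T_1 = 10 s on the baseline and T_2 = 8 s after a change, the speedup is 1.25 and the relative change is 0.25 (a 25% improvement).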

Measurement Tools and Techniques
- Strategies
- Interval timers (see the timing sketch below)
- Program profiling
- Tracing
- Indirect measurement
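
To illustrate interval timing, here is a minimal Python sketch (not from the original slides) that brackets a measured code section with a high-resolution timer and repeats the run to expose run-to-run variation; workload() and the repetition count are placeholders assumed for the example.

```python
import time

def workload():
    # Placeholder for the code section being measured (assumption for this sketch).
    return sum(i * i for i in range(100_000))

def measure(n_runs=10):
    """Return one elapsed-time sample per run, using an interval timer."""
    samples = []
    for _ in range(n_runs):
        start = time.perf_counter()   # read the timer just before the event
        workload()
        stop = time.perf_counter()    # read the timer just after the event
        samples.append(stop - start)
    return samples

if __name__ == "__main__":
    times = measure()
    print(f"min = {min(times):.6f} s, mean = {sum(times) / len(times):.6f} s")
```

Timer resolution and overhead limit how short an interval can be measured directly, which is why very short events are usually timed over many repetitions of a loop.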

Statistical Interpretations of Measured Data
- What do all of these means mean?
- Sources of measurement errors
- Confidence intervals (see the sketch below)
- Statistically comparing alternatives
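
As a sketch of the confidence-interval idea, the following Python fragment computes a two-sided confidence interval for the mean of a set of measurements; it assumes SciPy is available for the Student's t quantile, and the sample values are made up for illustration.

```python
import math
import statistics
from scipy import stats  # assumed available for the Student's t quantile

def mean_confidence_interval(samples, confidence=0.95):
    """Two-sided confidence interval for the mean of the measurements."""
    n = len(samples)
    mean = statistics.fmean(samples)
    std_err = statistics.stdev(samples) / math.sqrt(n)      # s / sqrt(n)
    t_crit = stats.t.ppf(1 - (1 - confidence) / 2, n - 1)   # t quantile, n-1 dof
    half_width = t_crit * std_err
    return mean - half_width, mean + half_width

# Made-up execution times in seconds, for illustration only.
times = [1.02, 0.98, 1.05, 1.01, 0.97, 1.03, 1.00, 1.04]
low, high = mean_confidence_interval(times)
print(f"95% confidence interval for the mean: [{low:.3f}, {high:.3f}] s")
```

To compare two alternatives statistically, the same machinery is applied to the difference of their measurements; if the resulting confidence interval includes zero, the observed difference is not statistically significant at that confidence level.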

Design of Experiments
- Terminology
- One-factor ANOVA (see the sketch after this list)
- Two-factor ANOVA
- Generalized m-factor experiments
- Fractional factorial designs
  - n 2^m designs
- Multifactorial designs
  - Plackett and Burman
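
As a minimal sketch of a one-factor ANOVA, the following Python fragment uses SciPy's one-way ANOVA to test whether the single factor "system configuration" explains a significant fraction of the variation in the measurements; the three samples are made-up values, and the use of scipy.stats.f_oneway is an implementation choice, not a method prescribed by the slides.

```python
from scipy import stats  # assumed available

# Made-up execution times (seconds) for three alternative configurations.
config_a = [10.1, 10.4, 9.9, 10.2, 10.3]
config_b = [11.0, 10.8, 11.2, 10.9, 11.1]
config_c = [10.2, 10.0, 10.3, 10.1, 10.4]

# One-factor (one-way) ANOVA: does the factor "configuration" matter?
f_stat, p_value = stats.f_oneway(config_a, config_b, config_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# A small p-value (e.g., below 0.05) indicates that at least one mean differs;
# it does not identify which one, so follow-up contrasts are still needed.
```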

Simulation
- Types of simulations
- Random number generation (see the sketch below)
- Verification and validation
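
As an example of the random number generation topic, here is a sketch of a linear congruential generator in Python; the constants shown (a = 16807, c = 0, m = 2^31 - 1) are the classic "minimal standard" parameters and are used here only for illustration.

```python
def lcg(seed, a=16807, c=0, m=2**31 - 1):
    """Linear congruential generator: x_{k+1} = (a * x_k + c) mod m.

    Yields pseudorandom values in (0, 1). The default constants are the
    well-known 'minimal standard' parameters, chosen here for illustration.
    """
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(seed=12345)
print([next(gen) for _ in range(5)])   # first five pseudorandom values
```

Before such a generator is trusted to drive a simulation, its output should be checked (period length, statistical tests of uniformity and independence), which ties directly into the verification and validation topic above.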

References
- Sources of additional information

Performance Bookshelf
- Suggested books on computer systems performance measurement and analysis:
  - Comprehensive performance books
  - Experimental design
  - Modeling and queuing analysis
  - Simulation and random number generation
  - Software suggestions and reference books
- http://labq.com/bookstore.shtml

References
Comprehensive Performance Analysis
- David J. Lilja, Measuring Computer Performance: A Practitioner's Guide, Cambridge University Press, 2000, http://labq.com/perf-book.shtml.
Experimental Design
- Joshua J. Yi, David J. Lilja, and Douglas M. Hawkins, "A Statistically Rigorous Approach for Improving Simulation Methodology," International Symposium on High-Performance Computer Architecture (HPCA), February 2003.
- R. Plackett and J. Burman, "The Design of Optimum Multifactorial Experiments," Biometrika, Vol. 33, Issue 4, June 1946, pp. 305-325.
- D. C. Montgomery, Design and Analysis of Experiments (5th ed.), Wiley & Sons, 2000, http://labq.com/bookstore.shtml#experiments.

References
MinneSPEC
- AJ KleinOsowski and David J. Lilja, "MinneSPEC: A New SPEC Workload for Simulation-Based Computer Architecture Research," Computer Architecture Letters, Vol. 1, June 2002, pp. 10-13. http://www.arctic.umn.edu/~lilja/minnespec/
- L. Eeckhout et al., "Designing Computer Architecture Workloads," IEEE Computer, Feb. 2003, pp. 65-71.
Sampling
- J. Haskins and K. Skadron, "Minimal Subset Evaluation: Rapid Warm-up for Simulated Hardware State," Intl. Conf. on Computer Design, 2001.
- R. E. Wunderlich, T. F. Wenisch, B. Falsafi, and J. C. Hoe, "SMARTS: Accelerating Microarchitecture Simulation via Rigorous Statistical Sampling," Intl. Symp. on Computer Architecture, 2003, pp. 84-95.

References
SimPoint
- T. Sherwood, E. Perelman, G. Hamerly, and B. Calder, "Automatically Characterizing Large Scale Program Behavior," Intl. Conf. on Architectural Support for Programming Languages and Operating Systems, 2002.

"Measurements are not to provide numbers but insights." (Ingrid Bucher)

Questions?