The 2023 SPEC Impact Award Winners
This award is given to individuals or teams who have made significant contributions to a Group or a Committee.
Technical Contribution
Daniel Bowers, Amazon Web Services
A member of the SPEC CPU subcommittee, Daniel Bowers has been fighting to preserve the SPEC CPU tradition of porting the benchmarks to a variety of environments. In particular, some have suggested that perhaps the SPEC CPU v8 benchmark should not bother to support Microsoft Windows on the grounds that SPEC has not published any SPEC CPU Windows results in the last six years. During 2022, Daniel contributed 16 portability reports for Windows, using both Microsoft Visual Studio and MinGW. He has also analyzed faults and suggested specific fixes to resolve issues.
Technical Leadership
Jessica Heerboth, NVIDIA
Jessica Heerboth enthusiastically assumed the role of main driver in porting the old SPECapc for 3ds Max benchmark to the new SPECapc for 3ds Max 2020 benchmark. Starting from a benchmark designed for 2015-era hardware, she painstakingly examined the existing workloads to determine what made sense to remove and what could be ported to the new version. There are also differences in the way scripts are executed now versus how they were run in 2015, and differences in how some scripts were encrypted meant that issues could not be resolved simply by examining the code in a text editor. As a result, Jessica spent significant time working with Autodesk to determine which workloads could be used in the new version and would still make sense to users. She also created test builds that incorporated feedback from various members of the SPECapc committee.
Technical Contribution
Ravi Jagannadhan, AMD
Pallavi Mehrotra, Intel
Erik Niemeyer, Intel
The SPECapc for Maya 2023 benchmark introduces a significant new animation feature as part of its workload, one that allows some or all of the animation data to be cached and can require tremendous amounts of hardware resources. Pallavi Mehrotra, Erik Niemeyer, and Ravi Jagannadhan conducted extensive testing and troubleshooting to determine the hardware requirements and configurations needed to ensure the feature would function correctly, information that is key for both vendors and buyers to understand. Without their work, the Graphics and Workstation Performance Group (GWPG) would not have been able to include this important new feature in the SPECapc for Maya 2023 benchmark.
Pallavi and Erik brought suggestions on which workload to use as a valid example of animation caching, and they continued to work through issues with variability. Ravi drew on his application expertise to understand how Maya users actually use this feature and why. The trio also focused on determining the right name for the benchmark result so users would easily understand what it was and why it was relevant. To pull off this monumental task, they collaborated on numerous occasions despite working for different companies. They also worked with Autodesk and frequently met with Allen Jensen, SPECapc Vice Chair, who was writing the code for the benchmark.
Technical Contribution
Mahesh Madhav, Ampere Computing LLC
Mahesh Madhav surprised the SPEC CPU Subcommittee by bringing 18 new benchmark candidates into the SPEC CPUv8 development effort, comprising more than 8,000 modules and nearly 4 million lines of code. Mahesh has been highly responsive to problem reports, providing timely advice and making adjustments to resolve errors. In addition, he has introduced a visual analysis method to the subcommittee that it had not used before — "self-similarity plots."
Technical Leadership
Michele Tucci, University of L'Aquila
Michele Tucci started attending the Research Group's (RG) DevOps Performance Working Group meetings in December 2021. Since then, he has significantly impacted the RG DevOps Performance Working Group and the Research Group as a whole. Michele took a leading role in, and made significant contributions to, two RG DevOps Performance subgroups: Performance Change Point Detection and Search-based Software Performance Engineering. He created the first version of a curated dataset of real-world workload performance changes to assess the quality of current change point detection methods. Within the RG, Michele also accepted the RG Steering Committee's nomination to serve as RG Release Manager, a role he quickly settled into and in which he has again made valuable contributions.