SPEC Blog

Evolving Trends in Cloud Market Call for New Benchmarks

By Sundar Iyengar, SPEC Cloud Committee Chair, and Ramesh Illikkal, Committee Member

Over the last few years, the cloud market has grown in both the depth and breadth of its offerings. From its early days, when on-premises workloads and applications were run on rented cloud instances, the market has moved to designing cloud-native applications that run on disaggregated hardware.

Software architecture is shifting from monolithic designs to distributed microservices, increasing workload complexity. The underlying cloud system architecture is becoming more heterogeneous and disaggregated, comprising a mix of CPUs, infrastructure processing units, and special-purpose accelerators. The confluence of these trends has created a new urgency for innovation in the underlying infrastructure. Unfortunately, the benchmarks widely used by the industry to drive architecture features and software/hardware co-optimizations have been left behind in this fast-paced transformation.

Read more

 

SPEC Search Programs — How You Can Contribute to the Future of Computing

By Mathew Colgrove, SPEC Communications Committee Chair

SPEC, a nonprofit organization, develops benchmarks that evaluate the performance and energy consumption of the newest generation of computing systems. These benchmarks help hardware vendors gauge how their products perform against the competition and target the areas that need improvement. They also enable buyers to make reliable comparisons between products, so they can purchase the right ones for their needs.

SPEC believes that the most effective computing benchmarks are developed based on how various user communities run actual applications. To that end, SPEC regularly conducts Search Programs that encourage those outside of SPEC to contribute applications, workloads, or models, helping us build more comprehensive and more applicable benchmarks that better serve their communities.

Read more

 

From performance rating to sustainability: SPEC tackles a key global challenge

By Klaus-Dieter Lange, Chair, International Standards Group

Since 2008, SPEC has developed and maintained the SPECpower_ssj benchmark to evaluate the power and performance characteristics of server-class computers. Like other SPEC benchmarks, the SPECpower benchmark enables vendors to make realistic comparisons between their products and those of their competitors, while allowing buyers to determine the best solutions for their use cases. As sustainability has become an increasingly important global issue, the SPECpower benchmark has played a critical role in enabling and encouraging vendors to improve the energy efficiency of their products. Over the last few years, the growing focus on sustainability has also led to an important new direction for SPEC.

Read more

 

SPEC Releases SPECapc for Solidworks 2022 Benchmark

By Trey Morton, SPEC Application Performance Characterization Committee Chair

I'm excited that SPEC has released the SPECapc for Solidworks 2022 benchmark, which offers application performance measurement for workstations running Dassault Systèmes Solidworks 2022. An industry-leading portfolio of 3D design and engineering applications, Solidworks is used by millions of innovators worldwide, and the latest version delivers hundreds of enhancements that accelerate innovation and streamline and speed up the product development process.

Read more

 

New SPECviewperf 2020 v3.1 Benchmark for Measuring Graphics Performance

By Ross Cunniff, SPECgpc Chair

Graphics requirements continue to evolve rapidly, and the SPEC® Graphics Performance Committee has responded with the release of the SPECviewperf 2020 v3.1 benchmark, a key update to the worldwide standard for measuring graphics performance. We’ve added support for Microsoft Windows 11, which is now being installed by OEMs on most new PCs. The new benchmark also offers enhanced GUI support for 4K, making it clearer when 3840x2160 resolution has been selected and indicating in results files that a benchmark was run at 4K resolution.

Read more

 

SPEC Machine Learning Committee to Develop Vendor-Agnostic Benchmark to Measure End-to-End Performance for Machine Learning Training and Inference Tasks

By Arthur Kang, SPEC ML Committee Chair

The SPEC Open Systems Group (OSG) reached an important milestone this year with the establishment of the Machine Learning Committee, which is developing practical methodologies for benchmarking artificial intelligence (AI) and machine learning (ML) performance in the context of real-world platforms and environments.

Read more

 

SPEChpc Releases New v1.1 Benchmark

By Mathew Colgrove, SPEC High Performance Group Release Manager

Since the release of the SPEChpc 2021 v1.0 benchmark last October, we have seen a great response from the HPC community. More than 100 institutions have received no-cost, non-commercial licenses, including large research facilities, universities, and even a few high schools. Beyond measuring the performance of HPC systems, users have applied the SPEChpc benchmarks to education, installation testing, and tracking down node configuration issues, to name a few uses. Paraphrasing Nicole Hemsoth's The Next Platform article, SPEChpc is the benchmark real-world HPC deserves.

Read more

 

New SPECapc® for 3ds Max Benchmark for Systems Running Autodesk 3ds Max 2020

By Trey Morton, SPECapc Committee Chair

Earlier this year, the SPEC® Application Performance Characterization (SPECapc) committee released the SPECapc for 3ds Max 2020 benchmark for systems running the latest version of Autodesk 3ds Max. This version of the SPEC benchmark replaces the SPECapc for 3ds Max 2015 benchmark.

With 3ds Max 2015 retired, the Autodesk community needs an industry-standard benchmark for an updated version of the application. 3ds Max is a key tool in gaming, visualization, architecture, and many other industry segments, and SPEC recognizes the importance of being able to measure its performance without gaps in coverage. The SPECapc for 3ds Max 2020 benchmark is an interim release that enables the community to continue benchmarking their systems while SPEC launches a ground-up redesign of the benchmark software with new content. As with all SPECapc benchmarks, we are always on the lookout for community content that could be used in future versions of the benchmark.

Read more

 

SPEC Releases SPECviewperf 2020 v3.0 Benchmark, Adds Linux Edition

By Ross Cunniff, SPECgpc Chair

The SPEC® Graphics Performance Characterization (SPECgpc) group put tremendous effort into updating the SPECviewperf® benchmarks over the last year, and I'm excited today to discuss some of the new features. The SPECviewperf® 2020 v3.0 benchmark and the SPECviewperf® 2020 v3.0 Linux Edition benchmark are industry-standard tools for measuring the 3D graphics performance of systems running under the OpenGL and DirectX application programming interfaces, based on professional applications. The benchmark workloads, called viewsets, represent graphics content and behavior from actual workstation-class applications, without the need to install the applications themselves.

Read more

 

SPEC 2021 Year in Review

By David Reiner, President

As all of us at SPEC continue into a very busy 2022, I'd like to reflect on what we accomplished during 2021, which, despite the continued headwinds from the pandemic, was an exciting and productive year at SPEC.

Read more

 

Searching for Workloads — Help Shape the Next SPEC CPU Benchmark Suites

By James Bucek, SPEC CPU Chair

SPEC exists to develop benchmarks that vendors, businesses, and individuals can trust to make key purchasing decisions based on fair comparisons between different systems or solutions. One way we do this is by bringing together representatives from a variety of companies, including competitors, to design benchmarks that fairly represent real-world workloads.

For SPEC CPU, an equally important aspect of creating better benchmarks is that we base them on actual applications or workloads in use today, not just representative workloads. This separates SPEC CPU from many common micro-benchmarks and gives users more confidence in the workloads behind the benchmark, as well as greater assurance that the results produced are standard and reproducible, allowing for more accurate comparisons between solutions.

Read more

 

New SPECvirt® Datacenter 2021 Benchmark, for Complex, Multi-Host Environments

By David Schmidt, Virtualization Committee Chair

The SPEC Virtualization Committee has released the SPECvirt® Datacenter 2021 benchmark, a new multi-host benchmark for measuring the performance of a scaled-out datacenter.

While distributed computing has become the dominant infrastructure in datacenters today – providing reliability, availability, serviceability and security – virtualization is the key to optimizing this infrastructure and providing increased flexibility and application availability, while also reducing costs through server and datacenter consolidation. As such, suppliers and buyers require a fair, vendor-agnostic tool for measuring the performance of the solutions they use to power these more complex, multi-host environments.

Read more

 

Server Performance — At what cost?

By Klaus Lange, SPECpower Committee Chair

From their hardware and firmware to their software and applications, modern servers are increasingly complex. Given the incredible variety of computational and data-centric tasks that servers are designed and deployed to perform, provisioning these machines cannot be done properly or completely without accounting for energy efficiency. Beyond the initial hardware and software investments, beyond the costs to house these servers, what level of power-efficient performance can be expected, and how does that compare across different architectures and configurations?

Read more

 

SPECviewperf® 2020 v2.0, Keeping Pace With an Evolving Industry

By Ross Cunniff, SPECgpc Chair

The Graphics and Workstation Performance Group (GWPG) proudly announced the SPECviewperf® 2020 v2.0 benchmark in June, just eight months after the initial release of the 2020 version. Beyond the time we invested in ensuring SPECviewperf keeps pace with our evolving industry, SPECviewperf 2020 v2.0 truly reflects SPEC's ability to bring competing members together to create and update a benchmark that benefits everyone: vendors, enterprise customers, and end users alike.

Read more