Standard Performance Evaluation Corporation


worldcar model from the SPECapc for Creo 9 benchmark
SPEC Delivers a Major Update to the SPECapc for Creo 9 Performance Benchmark

By Jessica Heerboth, SPECapc Committee Chair

I'm pleased to announce the availability of the new SPECapc for Creo 9 benchmark, with updated models to support computing systems running the PTC Creo 9 3D CAD solution, now featuring generative design, real-time simulation, advanced manufacturing, industrial IoT and augmented reality. This major update to the benchmark also includes new and updated CPU and GPU test cases.

The benchmark has undergone a significant transformation, with three major upgrades since it was first released for Creo 3, and we are particularly gratified to have worked directly with PTC on this update. In addition to new test cases that exercise features added to Creo over the last several releases, we have significantly enhanced the benchmark's interface to make it far more user-friendly.

Read more


A girl working on a laptop
The SPECviewperf® Benchmark — A User's Story

By Ross Cunniff, SPECgpc Committee Chair

Emma is a tech enthusiast who develops product designs and plays games on the same workstation, which includes a CPU and GPU that were mid-range when she bought the system in 2018. Over the years, however, she's been far less pleased with the performance, noticing significant lag when trying to manipulate models in SolidWorks, the 2D and 3D product development application that engineers and designers use to create and collaborate on innovative designs.

When she and a group of friends began designing a high-tech go-kart, things started to get painful. Every time they tried to increase the model's complexity by exploring how different kingpins and rack-and-pinion gear ratios would work, the system bogged down. The processes completed eventually, but it just wasn't practical to keep using her existing rig for the go-kart project.

Read more


Dr. Lizy Kurian John, University of Texas at Austin
SPEC Member Professor Lizy Kurian John Receives Joe J. King Professional Engineering Achievement Award

By John Henning, SPEC CPU Committee Secretary

I am extremely pleased to congratulate Professor Lizy Kurian John, IEEE Micro Editor-in-Chief and Truchard Foundation Chair in the Department of Electrical and Computer Engineering, University of Texas at Austin, on receiving the Joe J. King Professional Engineering Achievement Award. Professor John is well known within SPEC for her contributions to SPEC CPU and, in turn, to the design of new processors.

New chip designs take years and require very large engineering investments, and SPEC benchmarks provide essential guidance for this engineering work. Dr. John has collaborated with the SPEC CPU Committee since 2004. SPEC engineers have provided her with low-level hardware profiles, and she and her PhD students have applied Principal Component Analysis (PCA) to produce benchmark "clusters" that SPEC has considered when selecting which benchmarks to include in SPEC products.
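The workflow described above, reducing per-benchmark hardware-counter profiles with PCA and then grouping the benchmarks into clusters, can be sketched roughly as follows. This is a minimal illustration only: the counter names, values, and cluster count are hypothetical assumptions for the example, not SPEC's actual data or tooling.

```python
import numpy as np

def pca_reduce(X, k=2):
    # Standardize each hardware-counter column, then project the
    # profiles onto the top-k principal components via SVD.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:k].T

def kmeans(X, k, iters=50, seed=0):
    # Plain k-means: assign each benchmark to its nearest centroid,
    # then move each centroid to the mean of its members.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Hypothetical profiles: rows = candidate benchmarks, columns =
# hardware counters (e.g. IPC, cache-miss rate, branch-miss rate).
profiles = np.array([
    [1.8, 0.02, 0.01],
    [1.7, 0.03, 0.01],
    [0.6, 0.40, 0.05],
    [0.5, 0.38, 0.06],
])
reduced = pca_reduce(profiles, k=2)
clusters = kmeans(reduced, k=2)
```

Under this scheme, benchmarks that land in the same cluster exercise the hardware in similar ways, so a selection committee can favor one representative per cluster rather than several near-duplicates.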

Read more


ICPE 2023 attendees
It's a Wrap — Successful 14th Annual ICPE 2023 Marks Return of In-Person Event

By Petr Tuma, ICPE 2023 PC Co-Chair, and Marco Vieira, ICPE 2023 General Co-Chair

We are very pleased to report on the success of ICPE 2023 — the 14th annual ACM/SPEC International Conference on Performance Engineering — which took place in Coimbra, Portugal, during April. The conference is an annual event where researchers and practitioners meet to present and discuss the latest results from both industry and academia related to software and systems performance.

This year's event marked an exciting return to an in-person conference, and nearly 150 attendees enjoyed three keynote speeches, 28 research presentations, seven data challenge presentations, a range of workshops and more.

Read more


Screenshot of a model from the SPECapc for 3ds Max benchmark
SPEC Adds Benchmark Search Program for Graphics and Workstation Performance Group

By Jessica Heerboth, SPECapc Committee Chair

I'm excited to announce that SPEC has kicked off a third Benchmark Search Program. As noted in a blog post about our first two Search Programs, SPEC believes that the most effective computing benchmarks are based on how various user communities run actual applications. To that end, Search Programs encourage users outside of SPEC to contribute applications, workloads, or models that help us build more comprehensive and more applicable benchmarks, which in turn better serve those communities. The new Benchmark Search Program is for the SPEC Graphics and Workstation Performance Group (GWPG).

Read more


screenshot from the Sol and Sollette model
SPECapc for Maya 2023 Benchmark – Measuring Performance of the Latest High Performance Workstations

By Jessica Heerboth, SPECapc Committee Chair

In October of last year, SPEC released the SPECapc® for Maya 2023 benchmark, which offers application performance measurement for workstations running Autodesk Maya 2023. Maya is the 3D animation and visual effects software used by top artists in the industry to create realistic characters and stunning visual effects.

Workstation hardware performance has reached unprecedented levels, and the SPECapc for Maya 2023 benchmark includes new and more complex workloads and larger models compared to the previous version. With it, workstation vendors will be better able to assess and compare their ability to meet the performance needs of Maya 2023 users, while users will be better able to determine the best workstations to purchase for their needs. Demand for the benchmark is reflected in its growing popularity: more than 100 organizations have already downloaded it, including 61 downloads in just the first 10 weeks of 2023.

Read more


traditional benchmarks vs. system benchmarks
Evolving Trends in Cloud Market Call for New Benchmarks

By Sundar Iyengar, SPEC Cloud Committee Chair, and Ramesh Illikkal, Committee Member

Over the last few years, the cloud market has grown in its depth and breadth of offerings. From its simple beginnings, when on-premises workloads and applications could be run on instances rented on the cloud, the market has moved to designing cloud-native applications that run on disaggregated hardware.

Software architecture is shifting from monolithic designs to distributed microservices, increasing workload complexity. The underlying cloud system architecture is becoming more heterogeneous and disaggregated, comprising a mix of CPUs, infrastructure processing units, and special-purpose accelerators. The confluence of these trends has created new urgency for innovation in the underlying infrastructure. Unfortunately, the benchmarks the industry widely uses to drive architecture features and software/hardware co-optimization have been left behind in this fast-paced transformation.

Read more


Several people with laptops working together
SPEC Search Programs — How You Can Contribute to the Future of Computing

By Mathew Colgrove, SPEC Communications Committee Chair

SPEC, a nonprofit organization, develops benchmarks that evaluate the performance and energy consumption of the newest generation of computing systems. These benchmarks help hardware vendors gauge how their products perform against the competition and target the areas that need improvement. They also enable buyers to make reliable comparisons between products, so they can purchase the right ones for their needs.

SPEC believes that the most effective computing benchmarks are developed based on how various user communities run actual applications. To that end, SPEC regularly conducts Search Programs that encourage those outside of SPEC to contribute applications, workloads, or models that help us build more comprehensive and more applicable benchmarks that better serve their communities.

Read more


Crossword puzzle illustration
From Performance Rating to Sustainability: SPEC Tackles a Key Global Challenge

By Klaus-Dieter Lange, Chair, International Standards Group

Since 2008, SPEC has developed and maintained the SPECpower_ssj benchmark to evaluate the power and performance characteristics of server-class computers. Like other SPEC benchmarks, the SPECpower benchmark enables vendors to make realistic comparisons between their products and those of their competitors, while allowing buyers to determine the best solutions for their use cases. As sustainability has become an increasingly important global issue, the SPECpower benchmark has played a critical role in enabling and encouraging vendors to improve the energy efficiency of their products. Over the last few years, the growing focus on sustainability has also led to an important new direction for SPEC.

Read more


Screenshot of a model from the Solidworks 2022 benchmark
SPEC Releases SPECapc for Solidworks 2022 Benchmark

By Trey Morton, SPEC Application Performance Characterization Committee Chair

I'm excited that SPEC has released the SPECapc for Solidworks 2022 benchmark, which offers application performance measurement for workstations running Dassault Systèmes Solidworks 2022. An industry-leading portfolio of 3D design and engineering applications, Solidworks is used by millions of innovators worldwide, and the latest version delivers hundreds of enhancements that accelerate innovation and streamline and speed up the product development process.

Read more


A rendered model of a jet airplane
New SPECviewperf 2020 v3.1 Benchmark for Measuring Graphics Performance

By Ross Cunniff, SPECgpc Chair

Graphics requirements continue to evolve rapidly, and the SPEC® Graphics Performance Committee has responded with the release of the SPECviewperf 2020 v3.1 benchmark, a key update to the worldwide standard for measuring graphics performance. We've added support for Microsoft Windows 11, which is now being installed by OEMs on most new PCs. The new benchmark also offers enhanced GUI support for 4K, making it clearer when 3840x2160 resolution has been selected and indicating in results files that a benchmark was run at 4K resolution.

Read more


Graphs/datapoints against a grid background
SPEC Machine Learning Committee to Develop Vendor-Agnostic Benchmark to Measure End-to-End Performance for Machine Learning Training and Inference Tasks

By Arthur Kang, SPEC ML Committee Chair

The SPEC Open Systems Group (OSG) reached an important milestone this year with the establishment of the Machine Learning Committee, which is developing practical methodologies for benchmarking artificial intelligence (AI) and machine learning (ML) performance in the context of real-world platforms and environments.

Read more


Server room/grid and lines
SPEChpc Releases New v1.1 Benchmark

By Mathew Colgrove, SPEC High Performance Group Release Manager

With the release of the SPEChpc 2021 v1.0 benchmark last October, we have seen a great response from the HPC community. More than 100 institutions have received no-cost, non-commercial licenses, including large research facilities, universities, and even a few high schools. Beyond measuring the performance of HPC systems, users have applied the SPEChpc benchmarks to education, installation testing, and diagnosing node configuration issues, among other uses. Paraphrasing Nicole Hemsoth's The Next Platform article, SPEChpc is the benchmark real-world HPC deserves.

Read more


Screenshot of a model in the SPECapc for 3ds Max 2020 benchmark
New SPECapc® for 3ds Max Benchmark for Systems Running Autodesk 3ds Max 2020

By Trey Morton, SPECapc Committee Chair

Earlier this year, the SPEC® Application Performance Characterization (SPECapc) committee released the SPECapc for 3ds Max 2020 benchmark for systems running the latest version of Autodesk 3ds Max. This version of the SPEC benchmark replaces the SPECapc for 3ds Max 2015 benchmark.

With 3ds Max 2015 retired, the Autodesk community needs an industry-standard benchmark for an updated version of the application. 3ds Max is a key part of gaming, visualization, architecture, and many other industry segments, and SPEC recognizes the importance of measuring performance without gaps. The SPECapc for 3ds Max 2020 benchmark is an interim release that enables the community to continue benchmarking their systems while SPEC undertakes a ground-up redesign of the benchmark software with new content. As with all SPECapc benchmarks, we are always on the lookout for community content that could be used in future versions.

Read more


Screenshot of a model in the SPECviewperf 2020 V3 benchmark
SPEC Releases SPECviewperf 2020 v3.0 Benchmark, Adds Linux Edition

By Ross Cunniff, SPECgpc Chair

The SPEC® Graphics Performance Characterization (SPECgpc) group put tremendous effort into updating the SPECviewperf® benchmarks over the last year, and I'm excited to discuss some of the new features. The SPECviewperf® 2020 v3.0 benchmark and its Linux Edition, industry-standard benchmarks for measuring graphics performance based on professional applications, measure the 3D graphics performance of systems running under the OpenGL and DirectX application programming interfaces. The benchmark workloads, called viewsets, represent graphics content and behavior from actual workstation-class applications, without the need to install the applications themselves.

Read more


Desktop calendar; Photo by Eric Rothermel on Unsplash
SPEC 2021 Year in Review

By David Reiner, President

As all of us at SPEC continue into a very busy 2022, I'd like to reflect on what we accomplished during 2021, which, despite the continued headwinds from the pandemic, was an exciting and productive year at SPEC.

Read more


Computer screen with multi-colored code; Photo by Markus Spiske on Unsplash
Searching for Workloads — Help Shape the Next SPEC CPU Benchmark Suites

By James Bucek, SPEC CPU Chair

SPEC exists to develop benchmarks that vendors, businesses, and individuals can trust to make key purchasing decisions based on fair comparisons between different systems or solutions. One way we do this is bringing together representatives from a variety of companies, including competitors, to design benchmarks that fairly represent real-world workloads.

For SPEC CPU, an equally important aspect of creating better benchmarks is that we base them on actual applications or workloads in use today, not just representative workloads. This separates SPEC CPU from many of the common micro-benchmarks and gives a user more confidence in the workloads behind the benchmark, as well as greater assurance that the results produced are standard and reproducible, allowing for more accurate comparisons between solutions.

Read more


Illustration of the five workloads that comprise a tile
New SPECvirt® Datacenter 2021 Benchmark, for Complex, Multi-Host Environments

By David Schmidt, Virtualization Committee Chair

The SPEC Virtualization Committee has released the SPECvirt® Datacenter 2021 benchmark, a new multi-host benchmark for measuring the performance of a scaled-out datacenter.

While distributed computing has become the dominant infrastructure in datacenters today – providing reliability, availability, serviceability and security – virtualization is the key to optimizing this infrastructure and providing increased flexibility and application availability, while also reducing costs through server and datacenter consolidation. As such, suppliers and buyers require a fair, vendor-agnostic tool for measuring the performance of the solutions they use to power these more complex, multi-host environments.

Read more


SPEC SERT Suite performance bar chart
Server Performance — At What Cost?

By Klaus Lange, SPECpower Committee Chair

From their hardware and firmware to their software and applications, modern servers are increasingly complex. Given the incredible variety of computational and data-centric tasks that servers are designed and deployed to perform, these machines cannot be provisioned properly or completely without accounting for energy efficiency. Beyond the initial hardware and software investments, and beyond the costs to house these servers, what level of power-efficient performance can be expected, and how does that compare across different architectures and configurations?

Read more


SPECviewperf 2020 splash screen
SPECviewperf® 2020 v2.0, Keeping Pace With an Evolving Industry

By Ross Cunniff, SPECgpc Chair

The Graphics and Workstation Performance Group (GWPG) proudly announced the SPECviewperf® 2020 v2.0 benchmark in June, just eight months after the initial release of the 2020 version. In addition to the time we invested in ensuring that SPECviewperf keeps pace with our evolving industry, SPECviewperf 2020 v2.0 truly reflects SPEC's ability to bring competing members together to create and update a benchmark that benefits everyone: vendors, enterprise customers, and end users alike.

Read more