Standard Performance Evaluation Corporation
SPEC/GWPG Frequently Asked Questions

What is SPEC/GWPG and what does it do?
SPEC/GWPG (Graphics & Workstation Performance Group) is a non-profit organization that sponsors the development of standardized, application-based benchmarks that have value to the vendor, research and user communities. For more, see: http://www.spec.org/gwpg/publish/overview.html.

What benchmarking projects are active under SPEC/GWPG?
The Graphics Performance Characterization (SPECgpcSM) group establishes graphics performance benchmarks for systems running under OpenGL and other application programming interfaces (APIs). The group's SPECviewperf® benchmark is the most popular standardized software worldwide for evaluating performance based on professional-level CAD/CAM, digital content creation, and visualization applications. For more, see http://www.spec.org/gwpg/gpc.static/overview.html.

The Application Performance Characterization (SPECapcSM) group provides a broad-ranging set of standardized benchmarks for professional-level graphics and workstation applications. For more, see http://www.spec.org/gwpg/apc.static/apc_overview.html.

The Workstation Performance Characterization (SPECwpc) group is creating a benchmark that measures the performance of workstations running algorithms used in popular professional applications, but without requiring the full application and associated licensing to be installed on the system under test. For more, see http://www.spec.org/gwpg/wpc.static/wpc_overview.html.

Why are some of the same applications (Pro/E, 3ds max, NX) included in both SPECapc and SPECviewperf benchmark suites?
The two benchmark suites have different purposes and different types of users. SPECapc benchmarks are designed to measure, as much as possible, total performance for graphics and workstation applications. They typically include tests for graphics, I/O, and CPU performance, and they require that the user have a license for the application on which they are based. SPECapc benchmarks are based on large models and complex interactions, and they tend to take a long time to run.

Viewsets, the benchmarks that run on SPECviewperf, exercise only the graphics functionality of the application. Because SPECviewperf strips away application overhead, it allows direct performance comparisons of graphics hardware. SPECviewperf does not require users to have licenses for the applications on which its viewsets are based, which makes it accessible to a wider range of users. SPECviewperf is also easier to use and faster to run than SPECapc benchmarks.

How can someone run SPECviewperf and/or SPECapc benchmarks and submit results for review and publication on the SPEC web site?
SPEC/GWPG provides a paid process that allows those who are not members of the SPECgpc or SPECapc project groups to submit results for publication on this web site. For more information, see http://www.spec.org/gwpg/publish/nonmember.html.

Whether submitted for publication on the SPEC web site or not, anyone publishing results for SPEC/GWPG benchmarks must comply with the benchmark license and run rules.

I cannot find benchmark results on the SPEC site for a vendor or systems configuration that interests me. How can I get the results I'm seeking?
Submitting benchmark results for publication on the SPEC web site is voluntary. If you are seeking specific results that are not published on the site, you can try the following:

  • Contact SPECgpc <gpcinfo@spec.org> to inquire about SPECviewperf results or SPECapc <gpcapc-info@spec.org> to ask about application benchmark results. If the vendor is a member of the appropriate group, a representative should be able to answer your question, and perhaps even provide some results.
  • Conduct a web search to see if any of the major publications or web sites that use SPEC/GWPG benchmarks have published the test results you are seeking.
  • If you have a customer service contact at the hardware vendor or ISV, relay your request to them.
  • If it is feasible, run your own benchmark tests using a SPECapc benchmark or SPECviewperf.

Who do I contact if I have trouble running SPECviewperf or a SPECapc benchmark?
Contact SPECgpc <gpcinfo@spec.org> for problems with SPECviewperf or SPECapc <gpcapc-info@spec.org> for problems with application-based benchmarks.

How do I get my benchmark considered for adoption by SPECgpc or SPECapc?
Send a description of the benchmark and links to information and/or downloads to the appropriate e-mail alias above.

Why should I trust results from a vendor-sponsored benchmark organization? Isn't this a bit like the fox guarding the chicken coop?
Industry vendors have the highest level of interest in developing credible benchmarks. Without good performance evaluation software, vendors would not be able to make valid system comparisons when developing new products, or gain recognition from the trade media and public for significant technology advances.

Members of SPECgpc and SPECapc do not develop benchmarks in a vacuum; they base them on interaction with user groups, publications, application developers, and others. Before release, each benchmark is tested by different vendors working on different operating systems and environments.

Contrary to some beliefs, "vendor-driven" benchmarks are arguably among the most objective, because no single party's interests can dominate. The competitive nature of vendors provides a natural system of checks and balances that helps ensure objective, repeatable benchmarks.