SPEC/GWPG Frequently Asked Questions
What is SPEC/GWPG and what does it do?
SPEC/GWPG (Graphics & Workstation Performance Group) is a non-profit organization that sponsors the development
of standardized, application-based benchmarks that have value
to the vendor, research and user communities. For more, see: http://www.spec.org/gwpg/publish/overview.html.
What benchmarking projects are active under SPEC/GWPG?
The Graphics Performance Characterization (SPECgpc®)
group establishes graphics performance benchmarks
for systems running under OpenGL, DirectX and potentially other application programming interfaces
(APIs). The group's SPECviewperf® benchmark is the most popular
standardized software worldwide for evaluating performance based
on professional-level CAD/CAM, digital content creation, and visualization applications. For more, see http://www.spec.org/gwpg/gpc.static/overview.html.
The Application Performance Characterization (SPECapc®)
group provides a broad-ranging set of standardized
benchmarks for professional-level graphics and workstation applications. For more, see
http://www.spec.org/gwpg/apc.static/apc_overview.html.
The Workstation Performance Characterization (SPECwpc℠) group
has created a benchmark that measures the performance of workstations running algorithms used in popular professional applications, but without requiring the full application and associated licensing to be installed on the system under test. For more, see http://www.spec.org/gwpg/wpc.static/wpc_overview.html.
Why are some of the same applications (Pro/E, 3ds max, NX)
included in the SPECviewperf, SPECapc and SPECwpc benchmark suites?
The three benchmark suites have different purposes and different
types of users.
SPECapc benchmarks are designed to measure, as
much as possible, total performance for graphics and workstation applications.
They typically include tests for graphics, I/O and CPU performance,
and they require the user to have a license or trial version of the application
on which they are based. SPECapc benchmarks are based on large
models and complex interactions, and tend to take a long time
to run.
SPECviewperf exercises only
the graphics functionality of the application. Because it strips
away application overhead, SPECviewperf allows direct performance
comparisons of graphics hardware. SPECviewperf does not require
users to have licenses for the applications on which its viewsets
are based, which makes it accessible to a wider range of users.
SPECviewperf is also easier to use and faster to run than SPECapc
benchmarks.
The SPECwpc benchmark also does not require users to have licenses for the applications on which its workloads are based. Some of these workloads are viewsets found in SPECviewperf; others are unique to SPECwpc and represent a range of applications and industries. SPECwpc measures performance for a comprehensive breadth of operations performed by modern workstations running professional-level applications.
How can someone run SPECviewperf, SPECapc and/or SPECwpc benchmarks
and submit results for review and publication on the SPEC website?
SPEC/GWPG provides a paid submission process that allows those who are
not members of the SPECgpc, SPECapc or SPECwpc project groups to submit
results for publication on this website. For more information,
see http://www.spec.org/gwpg/publish/nonmember.html.
Whether submitted for publication on the SPEC website or not, anyone publishing results for SPEC/GWPG benchmarks must
comply with the benchmark license and run rules.
I cannot find benchmark results on the SPEC website
for a vendor or systems configuration that interests me. How can
I get the results I'm seeking?
Submitting benchmark results for publication on the SPEC website is voluntary. If you are seeking specific results that
are not published on the site, you can try the following:
- Conduct a web search to see if any of the major publications
or websites that use SPEC/GWPG benchmarks
have published the test results you are seeking.
- If you have a customer service contact for the hardware vendor
or ISV, relay your request to him or her.
- If it is feasible, run your own benchmark tests using a SPECapc, SPECviewperf or SPECwpc benchmark.
Who do I contact if I have trouble running a SPECviewperf, SPECapc or SPECwpc benchmark?
Contact info@spec.org to report your issue.
How do I get my benchmark considered for adoption by SPECgpc, SPECapc or SPECwpc?
Send a description of the benchmark and links to information and/or
downloads to info@spec.org.
Why should I trust results from a vendor-sponsored benchmark
organization? Isn't this a bit like the fox guarding the chicken
coop?
Industry vendors have the highest level of interest in developing
credible benchmarks. Without good performance evaluation software,
vendors would not be able to do valid system comparisons when
developing new products, or gain recognition from the trade media
and public for significant technology advances.
Members of SPECgpc, SPECapc and SPECwpc do not develop benchmarks in a
vacuum -- they develop the benchmarks based on interaction with user
groups, publications, application developers and others. Benchmarks
go through testing from different vendors working on different
operating systems and environments before they are released.
Contrary to some beliefs, "vendor-driven" benchmarks
are arguably the most objective, since no single party's
biases can dominate. The competitive nature of vendors provides a natural system
of checks and balances that helps ensure objective, repeatable
benchmarks.