
Standard Performance Evaluation Corporation


Submitting OSG Benchmark Results

Why submit results to SPEC?

SPEC's mission is to develop benchmarking standards and the corresponding software tools for fairly evaluating modern computer performance, and to provide an open forum for the results. SPEC invites you to help populate its public database of benchmark results. Licensees of the SPEC benchmarks may run the tests and submit results to SPEC. SPEC reviews the results and, for a processing fee, publishes them on its website. Having more results published on the SPEC website means more information is available to the community for making better comparisons. Help us level the playing field; submit your results to SPEC.

Who may submit results and how much does it cost?

Any individual or group licensed to use SPEC software (see our online product order form) may submit results to SPEC. There is a US$500 publication fee for each result submitted to SPEC, except for a SPEC SFS 2014 or SPECstorage Solution 2020 result submission, where the fee is US$1000. Payment is expected prior to publication of a reviewed result. SPEC OSG members and associates may submit an unlimited number of results at no charge, one of the many benefits afforded by membership in SPEC. If you expect to submit more than 12 results within a one-year period, you should consider joining SPEC as a member or associate. For more information on SPEC membership, contact us!

How do I submit results?

Results are submitted to SPEC via email. Results may be submitted at any time on any day of the week, but they are reviewed and published on a specific schedule. Each benchmark result undergoes a two-week review/publication cycle according to the following calendar.

In general, the result or raw result file should be submitted to SPEC as an attachment to an email sent to the appropriate benchmark submission drop. For a list of result submission email drops, please contact SPEC.
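The attachment-based submission described above can be sketched with Python's standard `email` library. This is a hypothetical illustration only: the drop address passed in is a placeholder, not a real SPEC submission drop, and the subject wording is an assumption.

```python
# Hypothetical sketch of assembling a result-submission email with the
# raw result file attached. The recipient address is a placeholder --
# obtain the correct submission drop address from SPEC.
from email.message import EmailMessage
from pathlib import Path


def build_submission(raw_file: str, drop_address: str) -> EmailMessage:
    """Assemble an email carrying the raw result file as an attachment."""
    msg = EmailMessage()
    msg["To"] = drop_address
    msg["Subject"] = f"Benchmark result submission: {Path(raw_file).name}"
    msg.set_content("Please find the attached raw result file for review.")
    # Attach the raw result file as a generic binary payload.
    msg.add_attachment(
        Path(raw_file).read_bytes(),
        maintype="application",
        subtype="octet-stream",
        filename=Path(raw_file).name,
    )
    return msg
```

The resulting message would then be handed to any SMTP client for delivery.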

Additional and specific information on submitting results can be found in each benchmark's run and reporting rules and user guide documentation.

Submitting results and security mitigations

From time to time there are reports of security mitigation concerns. SPEC's current approach to security mitigations is documented on the SPEC website. Specific questions about submission can be addressed to SPEC.

Publication of Results

Once OSG submissions complete their review and are accepted for publication, they are published (typically within one US business day) on the SPEC website alongside all the other results for the same benchmark. A listing of where results are published is available on the SPEC website.

All results for currently supported benchmarks are tagged with the date the result was first published on the SPEC website. This tagging is commonly implemented as an "Originally published" string in the footer information on each result page. Note: not all result page formats support post-processed footer updates; however, at least the HTML format for each result does support this tagging. If no such date string appears in a specific result file, check the HTML version of the same result. Several SPEC benchmarks have explicit rules based on a "Date Published" definition; in those cases the result file incorporates the "Date Published" into the main content of the result page rather than only including it in the page footer.

More info?

To better understand the SPEC result review process, please consult the SPEC/OSG Policy Document, Section: Guidelines for Result Submission and Review.

SPEC is accepting results from the following benchmarks/versions:

Benchmark                          Versions Accepted
SPEC Cloud IaaS 2018               1.0, 1.1
SPEC CPU 2017                      1.1, 1.1.5, 1.1.7, 1.1.8, 1.1.9
SPECjbb 2015                       1.03
SPECjEnterprise 2018 Web Profile   1.0.0
SPECjEnterprise 2010               1.03
SPECjvm 2008                       1.0
SPECpower_ssj 2008                 1.12
SPECstorage Solution 2020          1.0
SPECvirt Datacenter 2021           1.0
SPEC VIRT_SC 2013                  1.0, 1.1