
SPECjAppServer2002
Frequently Asked Questions

Version 1.01
Last modified: September 24, 2004

Q1: What is SPECjAppServer2002?
Q2: You just released SPECjAppServer2001 in September; why are you releasing SPECjAppServer2002 so soon?
Q3: How is SPECjAppServer2002 different from SPECjAppServer2001?
Q4: Does this benchmark obsolete SPECjAppServer2001?
Q5: Does this benchmark obsolete SPECjvm98 or SPECjbb2000?
Q6: What is the performance metric for SPECjAppServer2002?
Q7: Where can I find published results for SPECjAppServer2002?
Q8: Who developed SPECjAppServer2002?

Q10: Other SPEC benchmarks do not have a price/performance metric. Can you explain why SPECjAppServer2002 has a price/performance metric?
Q11: Can I compare SPECjAppServer2002 results with SPECjAppServer2001 results?
Q12: Can I compare SPECjAppServer2002 results with ECperf 1.1 results?
Q13: Can I compare SPECjAppServer2002 results with TPC-C results or TPC-W results?
Q14: Can I compare SPECjAppServer2002 results to results from other SPEC benchmarks?
Q15: Can I compare SPECjAppServer2002 results in different categories?
Q16: Do you permit benchmark results to be estimated or extrapolated from existing results?

Q17: What does SPECjAppServer2002 actually test?
Q18: What are the significant influences on the performance of the SPECjAppServer2002 benchmark?
Q19: Does this benchmark aim to stress the J2EE server or the database server?
Q20: Can you describe the workload?
Q21: Can I use SPECjAppServer2002 to determine the size of the server I need?
Q22: What hardware is required to run the benchmark?
Q23: What is the minimum configuration necessary to test this benchmark?
Q24: What additional software is required to run the benchmark?
Q25: Do you provide source code for the benchmark?

Q26: Is there a web layer in the SPECjAppServer2002 benchmark?
Q27: Why did you not address SSL (Secure Sockets Layer)?
Q28: Can I report results on a large partitioned system?
Q29: Is the benchmark cluster scalable?
Q30: How scalable is this benchmark?
Q31: Can I report with vendor A hardware, a vendor B J2EE Server, and vendor C database software?
Q32: Can I use Microsoft SQL Server for the database?
Q33: I am using public domain software; can I report results?
Q34: Are the results independently audited?
Q35: Can I announce my results before they are reviewed by the SPEC Java Subcommittee?

Q36: How can I publish SPECjAppServer2002 results?
Q37: How do I obtain the SPECjAppServer2002 benchmark?
Q38: How much does the SPECjAppServer2002 benchmark cost?
Q39: How much does it cost to publish results?
Q40: What if I have questions about running the SPECjAppServer2002 benchmark?
Q41: Where can I go for more information?


Q1: What is SPECjAppServer2002?

SPECjAppServer2002 is an industry standard benchmark designed to measure the performance of J2EE application servers.

Q2: You just released SPECjAppServer2001 in September; why are you releasing SPECjAppServer2002 so soon?

The two benchmarks (SPECjAppServer2001 and SPECjAppServer2002) are very similar except for the EJB (Enterprise JavaBeans) specification used: SPECjAppServer2001 adheres to the EJB 1.1 specification, while SPECjAppServer2002 adheres to the EJB 2.0 specification. Although the EJB 2.0 specification is complete, some vendors cannot yet publish benchmark results using EJB 2.0, while others will not be able to publish results using EJB 1.1. To allow vendors in either situation to publish results, SPEC decided to release two benchmarks, one supporting each specification. The two benchmarks are not comparable because the two EJB specifications present different optimization opportunities and constraints.

Q3: How is SPECjAppServer2002 different from SPECjAppServer2001?

SPECjAppServer2001 adheres to the EJB 1.1 specification while SPECjAppServer2002 adheres to the EJB 2.0 specification. There are four main differences in the implementation of the SPECjAppServer2002 benchmark, illustrated in the sketch following this list:

  1. SPECjAppServer2002 has been converted to use the EJB 2.0 style CMP (Container Managed Persistence) entity beans.
  2. SPECjAppServer2002 takes advantage of the local interface features in EJB 2.0.
  3. SPECjAppServer2002 utilizes CMR (Container Managed Relationships) between the entity beans.
  4. SPECjAppServer2002 uses EJB-QL in the deployment descriptors.

For more details, see Appendix B of the SPECjAppServer2002 Design Document.
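
As an illustration only (this is not benchmark source code, and the bean, interface, and field names are hypothetical), the following sketch shows the EJB 2.0 features listed above: an abstract CMP entity bean, a CMR field, a local interface, and an EJB-QL finder as it would appear in the deployment descriptor.

    // Hypothetical EJB 2.0 style CMP entity bean (a sketch, not benchmark code).
    import java.util.Collection;
    import javax.ejb.EntityBean;
    import javax.ejb.EntityContext;

    public abstract class OrderBean implements EntityBean {

        // CMP fields: abstract accessors whose persistence logic is
        // generated by the EJB 2.0 container.
        public abstract Integer getOrderId();
        public abstract void setOrderId(Integer id);

        // CMR field: a container-managed relationship from this order
        // to its line-item entity beans.
        public abstract Collection getLineItems();
        public abstract void setLineItems(Collection items);

        // Required EntityBean lifecycle callbacks.
        private EntityContext ctx;
        public void setEntityContext(EntityContext c) { ctx = c; }
        public void unsetEntityContext() { ctx = null; }
        public void ejbActivate() {}
        public void ejbPassivate() {}
        public void ejbLoad() {}
        public void ejbStore() {}
        public void ejbRemove() {}
    }

    // In its own source file: the EJB 2.0 local interface, which gives
    // co-located callers access without remote-call overhead.
    public interface OrderLocal extends javax.ejb.EJBLocalObject {
        Integer getOrderId();
        java.util.Collection getLineItems();
    }

    // In the deployment descriptor (ejb-jar.xml), a finder would be
    // declared in EJB-QL, for example:
    //   <ejb-ql>SELECT OBJECT(o) FROM Order o WHERE o.orderId = ?1</ejb-ql>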

Q4: Does this benchmark obsolete SPECjAppServer2001?

No. SPECjAppServer2001 adheres to the EJB 1.1 specification while SPECjAppServer2002 adheres to the EJB 2.0 specification. As both specifications are currently used, both versions of the benchmark provide interesting information and will be supported until the release of the SPECjAppServer2003 benchmark.

Q5: Does this benchmark obsolete SPECjvm98 or SPECjbb2000?

No. SPECjvm98 is a client JVM benchmark. SPECjbb2000 is a server JVM benchmark. SPECjAppServer2002 is a J2EE Application Server benchmark.

Q6: What is the performance metric for SPECjAppServer2002?

SPECjAppServer2002 expresses performance in terms of two metrics: TOPS (Total Operations Per Second), which measures the throughput of the workload, and a price/performance metric, expressed as price per TOPS.
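
For example, using purely hypothetical figures: a configuration with a total system price of $150,000 that sustains 300 TOPS would report a throughput of 300 TOPS and a price/performance of $150,000 / 300 TOPS = $500 per TOPS.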

Q7: Where can I find published results for SPECjAppServer2002?

SPECjAppServer2002 results are available via SPEC’s Web site: http://www.spec.org/.

Q8: Who developed SPECjAppServer2002?

SPECjAppServer2002 was developed by the SPEC OSG Java subcommittee as a port of the SPECjAppServer2001 benchmark to the EJB 2.0 specification.

For information on who originally developed the SPECjAppServer2001 benchmark, see the SPECjAppServer2001 FAQ.


Q10: Other SPEC benchmarks do not have a price/performance metric. Can you explain why SPECjAppServer2002 has a price/performance metric?

SPECjAppServer2002 is descended from ECperf, which was developed under the Java Community Process (JCP). The SPEC committees debated the inclusion of this metric in the SPECjAppServer2001 and SPECjAppServer2002 benchmarks. When the SPECjAppServer2001 benchmark was released, SPEC decided to include the price/performance metric on an experimental basis, with the experiment set to expire at the conclusion of the review cycle ending on 05/03/2003.

05/23/03: At a SPEC OSSC meeting on 04/08/2003, the OSSC voted that the SPECjAppServer2002 benchmark would not be automatically retired on 05/03/2003; rather, the benchmark will continue until 6 months after the release of the follow-on benchmark (SPECjAppServer2003).

09/24/04: The OSSC voted to extend the expiration date of the benchmark; submissions will be accepted for review and publication until January 26, 2005.

Q11: Can I compare SPECjAppServer2002 results with SPECjAppServer2001 results?

No. The two benchmarks are incomparable because there are different optimization opportunities and constraints in the two EJB specifications (SPECjAppServer2001 adheres to the EJB 1.1 specification while SPECjAppServer2002 adheres to the EJB 2.0 specification).

Q12: Can I compare SPECjAppServer2002 results with ECperf 1.1 results?

No, for two reasons. First, ECperf 1.1 results cannot be announced publicly. Second, while SPECjAppServer2002 may use the same workload as ECperf 1.1, it uses different code on the application server, so a direct comparison of results is not appropriate.

Q13: Can I compare SPECjAppServer2002 results with TPC-C results or TPC-W results?

No, absolutely not. SPECjAppServer2002 uses totally different data-set sizes and workload mixes, has a different set of run and reporting rules, uses a different measure of throughput, and reports different metrics.

Q14: Can I compare SPECjAppServer2002 results to results from other SPEC benchmarks?

No. There is no logical way to translate results from one benchmark to another.

Q15: Can I compare SPECjAppServer2002 results in different categories?

No. The Centralized categories (Single Node System, Dual Node System, and Multiple Node System) were established to prevent comparisons between dissimilar hardware configurations. The Distributed category has different performance characteristics than the Centralized categories because it uses multiple resource managers.

Q16: Do you permit benchmark results to be estimated or extrapolated from existing results?

No.


Q17: What does SPECjAppServer2002 actually test?

SPECjAppServer2002 mainly tests the Enterprise JavaBeans (EJB) container in a J2EE 1.3 compatible server. It does not exercise all components of J2EE 1.3. See section 1.1 of the SPECjAppServer2002 Design Document for more information.

Q18: What are the significant influences on the performance of the SPECjAppServer2002 benchmark?

The most significant influences on the performance of the benchmark include the J2EE application server software (in particular the efficiency of the EJB container and its CMP implementation), the JVM, the database server and its JDBC driver, and the hardware resources (CPU, memory, disk I/O, and network) of the systems involved.

Q19: Does this benchmark aim to stress the J2EE server or the database server?

This benchmark was designed to stress the J2EE server. However, because it is a solution-based benchmark, other components (such as the database server) are stressed as well.

Q20: Can you describe the workload?

The benchmark emulates a manufacturing, supply chain management (SCM), and order/inventory system. For additional details, see the SPECjAppServer2002 Design Document.

Q21: Can I use SPECjAppServer2002 to determine the size of the server I need?

SPECjAppServer2002 should not be used to size a J2EE 1.3 server configuration, because it is based on a specific workload and makes numerous assumptions that may or may not apply to other user applications. SPECjAppServer2002 is a tool that provides a level playing field for comparing J2EE 1.3 compatible server products. Users may also run the benchmark for internal stress testing, with the understanding that those results are for internal use only.

Q22: What hardware is required to run the benchmark?

In addition to the hardware for the System Under Test (SUT), one or more client machines are required as well as the network equipment to connect the clients to the SUT. The number and size of client machines required by the benchmark will depend on the injection rate to be applied to the workload.

Q23: What is the minimum configuration necessary to test this benchmark?

A member of SPEC has run the benchmark on a 1 GHz Pentium III laptop system with 1024 MB of RAM and a 30 GB hard drive. The benchmark completed successfully with an injection rate of 5.

Note: This is not a configuration that you can use to report results, as it does not meet the durability requirements of the benchmark.

Q24: What additional software is required to run the benchmark?

SPECjAppServer2002 requires a J2EE 1.3 compatible server as well as a database server. See section 2.2 in the SPECjAppServer2002 Run and Reporting Rules for details on all the products. Also, a Java Runtime Environment (JRE) version 1.3 or later must be installed on the client machines.
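
As a quick way to confirm that a client machine's JRE meets the 1.3 minimum, a trivial check such as the following sketch (not part of the benchmark kit) can be compiled and run with the JRE in question:

    // Prints the version of the JRE running it; it should report 1.3 or later.
    public class JreCheck {
        public static void main(String[] args) {
            System.out.println("JRE version: " + System.getProperty("java.version"));
        }
    }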

Q25: Do you provide source code for the benchmark?

Yes, but if you are publishing results you are required to run with the files provided with the benchmark. As a general rule, modifying the source code is not allowed. Specific items (for example, the Load Program) may be modified to port the application to your environment; the areas where changes are allowed are listed in the SPECjAppServer2002 Run and Reporting Rules. Any changes made must be disclosed in the submission file when submitting results.


Q26: Is there a web layer in the SPECjAppServer2002 benchmark?

No. We will be adding a web layer in the SPECjAppServer2003 benchmark.

Q27: Why did you not address SSL (Secure Sockets Layer)?

SSL is addressed in the SPECweb99_SSL benchmark.

Q28: Can I report results on a large partitioned system?

Yes.

Q29: Is the benchmark cluster scalable?

Yes.

Q30: How scalable is this benchmark?

In our initial tests we saw good scalability with three 4-CPU systems (two for the J2EE server and one for the database server), and we did not explicitly restrict scalability in the benchmark.

Q31: Can I report with vendor A hardware, a vendor B J2EE Server, and vendor C database software?

The SPECjAppServer2002 Run and Reporting Rules do not preclude third-party submission of benchmark results, but submitters must abide by the licensing restrictions of all the products used in the benchmark; SPEC is not responsible for vendor (hardware or software) licensing issues. Many products include a restriction on publishing benchmark results without the express written permission of the vendor.

Q32: Can I use Microsoft SQL Server for the database?

Yes. You can use any relational database that is accessible via JDBC and satisfies the SPECjAppServer2002 Run and Reporting Rules.
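
As an illustration, a minimal JDBC connectivity check looks like the sketch below; the driver class name, connection URL, and credentials are hypothetical placeholders for your database product's actual values:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class JdbcCheck {
        public static void main(String[] args) throws Exception {
            // Register the vendor's JDBC driver (hypothetical class name).
            Class.forName("com.example.jdbc.Driver");
            // Open a connection (hypothetical URL, user name, and password).
            Connection conn = DriverManager.getConnection(
                    "jdbc:example://dbhost:1433/specdb", "user", "password");
            System.out.println("Connected: " + !conn.isClosed());
            conn.close();
        }
    }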

Q33: I am using public domain software; can I report results?

Yes, as long as the product satisfies the SPECjAppServer2002 Run and Reporting Rules.

Q34: Are the results independently audited?

No.

Q35: Can I announce my results before they are reviewed by the SPEC Java Subcommittee?

No.


Q36: How can I publish SPECjAppServer2002 results?

Only SPECjAppServer2002 licensees can publish results. All results are subject to review by SPEC prior to publication.

For more information about submitting results, please contact SPEC.

Q37: How do I obtain the SPECjAppServer2002 benchmark?

To place an order, use the on-line order form or contact SPEC at http://www.spec.org/spec/contact.html.

Q38: How much does the SPECjAppServer2002 benchmark cost?

Current pricing for all the SPEC benchmarks is available from the SPEC on-line order form. SPEC members receive the benchmark at no charge.

Q39: How much does it cost to publish results?

Contact SPEC at http://www.spec.org/spec/contact.html to learn the current cost to publish SPECjAppServer2002 results. SPEC members can submit results free of charge.

Q40: What if I have questions about running the SPECjAppServer2002 benchmark?

The procedures for installing and running the benchmark are contained in the SPECjAppServer2002 User Guide, which is included in the kit and is also available from the SPEC web site.

Q41: Where can I go for more information?

SPECjAppServer2002 documentation consists mainly of four documents: User Guide, Design Document, Run and Reporting Rules, and this FAQ. The documents can be found in the benchmark kit or on SPEC’s Web site: http://www.spec.org/.


Java, J2EE and ECperf are trademarks of Sun Microsystems, Inc.

TPC-C and TPC-W are trademarks of the Transaction Processing Performance Council.

SQL Server is a trademark of Microsoft Corp.