As the Internet and enterprise-wide distributed systems become more prevalent in business IT, numerous advanced COTS (commercial off-the-shelf) middleware technologies have appeared on the market. One leading middleware technology is Sun's Java 2 Enterprise Edition (J2EE). At present there is an abundance of J2EE application server implementations in the marketplace with few discernible differences: every vendor claims to have implemented the highest-performing and most scalable product. The challenge for the potential J2EE application server user is therefore choosing the right product for their purpose. To select a J2EE application server from this confusing market, many organizations have defined their own test suites and evaluated J2EE products using in-house benchmark applications. Unfortunately, evaluating application servers is a costly exercise, and proprietary benchmarks and their results are rarely reusable across organizations. To alleviate this problem, industry J2EE benchmarks that aim to produce reusable results are emerging. In this paper, we provide insights into performance benchmarking of J2EE technologies by critically evaluating two such industry benchmark standards: IBM's Trade2 benchmark application and Sun's ECperf. We first share the experience gathered during empirical experiments with the two benchmarks. The insights obtained will assist future performance benchmark design and help users be more discerning when reading published benchmark results.
Cite as: Zhang, Y., Liu, A. and Qu, W. (2003). Comparing Industry Benchmarks for J2EE Application Server: IBM's Trade2 vs Sun's ECperf. In Proc. Twenty-Sixth Australasian Computer Science Conference (ACSC2003), Adelaide, Australia. CRPIT, 16. Oudshoorn, M. J., Ed. ACS. 199-206.