Back to newsletter 034 contents
This month we interviewed Frank Cohen, creator of TestMaker, an open source load-testing tool aimed at Web Services.
JPT: Can you tell us a bit about yourself and what you do?
I'm the "go to" guy for enterprises that need to understand and solve scalability problems in their information systems, especially in Web Services. I founded PushToTest in 2001 to provide services to enterprises and to be the center of the TestMaker community. TestMaker, an open-source test tool I created in 1997 and continue to maintain, checks Web-enabled applications for scalability, functionality, and performance. The TestMaker community is now 42,000 strong and we regularly send newsletters and surveys to our 3,200 registered users.
JPT: What do you consider to be the biggest Java performance issue currently?
The biggest Java performance issue currently is the uncertainty of how Web Services will impact overall system performance. In my view, J2EE 1.4 is primarily a Web Service-focused release. It tells the J2EE-using audience to expect XML-encoded data in everything they do - Application Programming Interfaces (APIs), communication protocols, Web page flows, security protocols, and data persistence. XML is neither compact nor concise. So we can expect to see Web infrastructures (TCP/IP routed networks moving HTTP traffic containing XML-encoded data) achieve new levels of usage. The CPU bandwidth needed to support XML data serializers and the TCP/IP networks will become the new bottlenecks to good performance. By 2005, with wide adoption of second generation Web Service standards (WS-Security, WS-Reliability, SAML, WSDL 1.1, etc.) we will see performance issues move up into the stack as complex XML datatypes need to be encoded and decoded in the CPU.
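A small sketch makes the "neither compact nor concise" point concrete (the element name and value here are hypothetical, chosen only for illustration): the same numeric value costs far more bytes on the wire once it is XML-encoded than it does as a raw binary primitive, before any parsing cost is even counted.

```java
// Illustrative only: compare the wire size of a raw double with the
// same value wrapped in a (hypothetical) XML element.
public class XmlOverhead {
    public static void main(String[] args) {
        String xml = "<price>19.99</price>";   // XML-encoded representation
        int rawBytes = Double.BYTES;           // 8 bytes as an IEEE 754 double
        int xmlBytes = xml.getBytes().length;  // 20 bytes as XML text
        // The XML form is 2.5x larger, and still needs to be parsed
        // and converted back to a double by the receiver's CPU.
        System.out.println("raw=" + rawBytes + " xml=" + xmlBytes);
    }
}
```

Multiply that overhead across every API call, protocol message, and persisted record, and the serialization CPU cost Cohen describes becomes easy to picture.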
The existing performance enhancement tools and debuggers will need to be extended to show how a J2EE 1.4 or greater application performs under load. And yet, these debuggers have no facility to create the same load a real user creates in a production setting. This paradox will become the biggest Java performance issue in the near future.
JPT: Do you know of any requirements that potentially affect performance significantly enough that you find yourself altering designs to handle them?
In Web-enabled applications that require synchronization of data that comes from multiple sources, I have had to alter system designs to provide for data replication from one data center to another. For example, in my last start-up company, Inclusion Technologies, we had to change our original designs for a secure collaborative extranet application server to use a persistent message store-and-forward protocol, rather than building the system around a single database server.
It was very popular over the past 5 years to think a Java Web-enabled application would instantly scale by putting a load balancer between the end-user and a Web application server. Today, clustering technology is important for large systems. Clusters can share data caches and session identification markers.
JPT: What are the most common performance related mistakes that you have seen projects make when developing Java applications?
Most of the performance related mistakes I see in Java code come when PushToTest conducts a datacenter certification. This is where we certify that the customer has enough servers, software, storage and network bandwidth to handle a defined number of users - for example, a university that installed a new email server to handle 12,000 students returning from their summer break. What I often find in Java code is concurrency problems - the code works fine with a few users but breaks with many users. The problem is usually related to unnecessary global variables that are never released and consequently never garbage collected, threads that never terminate even after their workflow is completed, and thread deadlocks that happen after a dependent thread - such as a log handler - goes into a busy state and never returns.
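The first pattern - global state that is never released - can be sketched in a few lines (class and method names here are hypothetical, not from any real certification): a class-level map accumulates per-user state, nothing ever removes entries, so the objects stay reachable and the garbage collector can never reclaim them. With a few users this is invisible; under load the heap fills up.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the leak pattern: an effectively global cache whose
// entries outlive the "sessions" they belong to.
public class SessionLeak {
    // Class-level map: entries live for the lifetime of the JVM
    // unless something explicitly removes them.
    private static final Map<String, byte[]> SESSIONS = new HashMap<>();

    static void handleRequest(String sessionId) {
        // Each request allocates state keyed by session id...
        SESSIONS.put(sessionId, new byte[1024]);
        // ...but nothing ever calls SESSIONS.remove(sessionId),
        // so the map grows without bound as user count rises.
    }

    static int retainedCount() {
        return SESSIONS.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10_000; i++) {
            handleRequest("user-" + i);
        }
        System.out.println("Retained sessions: " + retainedCount());
    }
}
```

A load test with many simulated users exposes this immediately, where a single-user functional test never would.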
JPT: Which change have you seen applied in a project that gained the largest performance improvement?
The largest performance improvement I found recently was in SOAP encoding styles in a Java Web service. By switching a Java Web service from SOAP RPC encoding to document-literal encoding, the application ran 3100% faster. See: http://dev2dev.bea.com/products/wlworkshop/articles/Cohen.jsp for details and sample Java code.
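To illustrate the difference (a hypothetical sketch, not the sample code from the linked article): an RPC/encoded SOAP body wraps the call in an operation element and tags each parameter with an inline `xsi:type`, which the receiver must interpret at runtime, while a document/literal body is a plain XML document whose structure is fixed ahead of time by a schema, so it can be processed without per-element type interpretation.

```xml
<!-- RPC/encoded: operation wrapper plus per-parameter type attributes -->
<soap:Body>
  <ns:getQuote>
    <symbol xsi:type="xsd:string">SUNW</symbol>
  </ns:getQuote>
</soap:Body>

<!-- Document/literal: a schema-described document, no inline type info -->
<soap:Body>
  <ns:quoteRequest>
    <symbol>SUNW</symbol>
  </ns:quoteRequest>
</soap:Body>
```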
JPT: Have you found any Java performance benchmarks useful?
No. I am still waiting for a benchmark that models real world usage against a user's goals. In the real world, users of a J2EE application use Web pages, email messages, database services and more (all the APIs that come in J2EE), yet I don't see a benchmark that uses all these APIs and shows how well the system performed against the user's goals.
JPT: Do you know of any performance related tools that you would like to share with our readers?
My own open-source test tool (TestMaker, found at http://www.pushtotest.com/ptt) would be nothing without the support of many other tools and libraries, including Jython (the Python language implemented 100% in Java), JDOM (the ultimate Java-friendly way to work with XML data, and also JSR 104), NetBeans (the IDE and application development framework from Sun), MaxQ (a proxy recorder that watches you use a browser to operate a Web application and writes a Jython-based test script for you), and JOpenChart (a cool Java package to visualize the test results in a set of line and bar charts). All of these are integrated into TestMaker.
JPT: Do you have any particular performance tips you would like to tell our readers about?
I have lots of tips that appear in my upcoming book from Prentice Hall. Advance chapters are available for download at http://www.pushtotest.com/ptt/thebook.html.
JPT: Thank you for the interview.
Thanks for the opportunity to answer your questions. -Frank
(End of interview).