Java Performance Tuning

The Interview: Bruce Tate, Bitter Java


Back to newsletter 036 contents

This month we have an interview with Bruce Tate, author of Bitter Java and Bitter EJB.

JPT: Can you tell us a bit about yourself and what you do?

Bruce: I founded a company that specializes in Java consulting, education, and services. I focus in the areas of Java persistence, development and performance process, and education. I speak at Java user groups nationwide, and am a featured speaker at the No Fluff, Just Stuff Java Software Symposium series. I also do expert witness work, and have a series of partnerships that I use to build applications from the ground up. At J2Life, our favorite engagements are design reviews. Since budgets are constrained, I find that many customers can't afford to bring in a high-profile architect. Instead, they can contract for a design review to focus on the riskiest elements of an application. Design reviews are highly popular, and cost effective, because they generate incredible value in a very short time. My most recent design reviews were at an Austin start-up that builds Java performance tools (a role which grew into an interim CTO position), and a design review to jump-start a project for a Fortune 100 distribution company. The second one was fun, because it gave me an opportunity to mentor a team that had no Java experience at all. I think they'll succeed.

JPT: Can you tell us a little bit about the No Fluff, Just Stuff Java Symposium and what your goals are in being involved?

Bruce: I consider the No Fluff symposiums the premier Java conferences based on value and speaker quality. When Jay invited me to attend as a featured speaker, I gladly accepted. Any time that you've got a chance to be associated with some of the top Java developers in the industry (Dave Thomas, Jason Hunter, James Duncan Davidson, Bob Martin, Ted Neward), you've got to listen. It dovetails nicely with the goals of my company. In a tight economy, I've got to concentrate a third to a half of my activities on marketing. A good example is technical writing. I'm not going to get rich writing books or articles, but like the conference, they give me a platform that I've never had before to express my ideas. It's a logical next step in an evolving career.

JPT: In your latest book, Bitter EJB, you seem to focus on Anti-patterns. Why do you feel that Anti-patterns are important?

Bruce: In Bitter Java and Bitter EJB, I tried to drive home the fact that we, as an industry, need to pay more attention to software that fails in systemic ways. A good programmer knows a handful of design patterns. A great programmer knows these, but also understands when *not* to apply them. Today, many programmers do not hang around long enough to see their applications succeed, or fail, with their customer base. They are also ignorant of common programming traps that we call antipatterns. As software engineers, we can't think that way. Look at it this way. We tend to deal with bugs individually. As bugs start to show up in similar circumstances, it would be much more effective to deal with them collectively: by better education, by refactoring a framework, or by building some automated tests. If we were in other industries, we'd have to roll up our sleeves and get involved, looking for systemic ways to solve problems. I mean, manufacturers have done it for several decades through programs like Zero Defects and Quality Circles. Surely we can learn from them?

JPT: It often takes time to understand how something could have been done better and we all know that with the shrinking schedules programmers often don't get the time needed to acquire this understanding. Under these circumstances, how best can a good programmer make the jump to understanding when not to apply a pattern or a coding technique?

Bruce: I love this topic. One of my primary focuses in my business is helping others succeed. I mentor writers, developers, high school students, and other professionals. I also have my own set of mentors. That's an important part of a career development path. Developers have an opportunity to code together. I think that pair-programming is probably the quickest way to learn. I took my most productive jobs to be part of a team with great programmers. The ideas that I formed in those roles served me well.

I also think that Jay Zimmerman has the education model right. It's much easier to break away for a weekend to take several targeted short sessions on a variety of topics than to take full courses on Struts, Ant, persistence, XML, and process.

JPT: Which Anti-pattern do you feel has the largest negative impact on performance?

Bruce: I've given this problem a whole lot of thought. I think that the way we deal with performance is pretty much fundamentally flawed. Right now, we do one of two things: we try to make everything super-fast the first time, or we wait until someone screams in production. Both are dangerous and expensive. The first approach is flawed because developer intuition sucks. We simply guess wrong more than we guess right. Smart people have not been immune, either: the initial models for CORBA and EJB entity beans were fundamentally flawed, because they injected too much communication cost for typical usage models. If you're guessing, then you're either building in too much performance (which is incredibly expensive), or you're missing your performance goals. And we all know what waiting until production does to our future schedules and well-intentioned designs.

Ideally, we should measure our fundamental performance requirements using JUnit test cases (JUnitPerf is a fantastic start). But we simply don't have enough tools to do so today. The ideal tool would be Ant-integrated, automated, and require as few code changes as possible.
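
A performance-budget test of the kind Bruce describes can be sketched in plain Java, with no JUnit dependency; the `searchCatalog()` workload and the 200 ms budget are hypothetical stand-ins for a real requirement:

```java
// Minimal sketch of a JUnitPerf-style performance-budget test.
// The workload and the budget are hypothetical examples.
public class ResponseTimeTest {

    // Stand-in for the operation whose response time is being budgeted.
    static int searchCatalog() {
        int sum = 0;
        for (int i = 0; i < 10_000; i++) sum += i;
        return sum;
    }

    // Returns elapsed wall-clock time of one run, in milliseconds.
    static long timeMillis(Runnable op) {
        long start = System.nanoTime();
        op.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long budgetMillis = 200; // the captured requirement, as a test
        long elapsed = timeMillis(() -> searchCatalog());
        if (elapsed > budgetMillis) {
            throw new AssertionError(
                "took " + elapsed + " ms, budget was " + budgetMillis + " ms");
        }
        System.out.println("within budget");
    }
}
```

Run as part of the build (for instance from an Ant target), a failing budget breaks the build the same way a failing functional test would.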

JPT: Do you see the process of performance tuning starting with requirements gathering? And how many projects do you see actually gathering performance requirements (or expectations) from the customer (or users)?

Bruce: In many cases, it does. Most people have a general idea of how fast a system needs to be. They just don't capture those requirements. For example, when you're rebuilding an existing system, you might hear, "The new system has to do this as fast as the old one." Or, "This screen is not as critical as the call center. We know what each wasted second costs us in the call center." Or, "We're building an e-commerce site to reach customers outside of our organization."

Each one of these statements tells you something. In the case of the last statement, we have an implied performance requirement. We understand that if a user interface doesn't respond in a short time, the customer's likely gone. Many use four seconds as a benchmark. The hardest parts are breaking apart simple performance requirements from scalability requirements, and measuring production-like performance on a development system. There's no silver bullet in this area.

JPT: What is your favorite Anti-pattern?

Bruce: "Favorite" is relative. That's like asking me about my favorite disease, no? ;) I like to talk about an antipattern that formed the foundation for Struts, called the Magic Servlet. That's basically 10K of code hanging off of a Submit button. We see many different versions of it: the onMessage method in JMS, a session bean, a servlet, an XML processor, command processors, request handlers, and the like. It's fun to talk about, because everyone knows it's wrong, but we all get caught doing it anyway.
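
The Struts-style cure for the Magic Servlet can be sketched as a thin front controller that dispatches to small command objects, instead of one giant handler holding all the logic. The action names and commands below are hypothetical, and the servlet API is elided to keep the sketch self-contained:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the front-controller/command refactoring of a "Magic Servlet":
// a thin dispatcher maps each action name to a small, separately testable
// command object, rather than branching through 10K of inline logic.
public class FrontController {

    interface Command {
        String execute(String input);
    }

    private final Map<String, Command> commands = new HashMap<>();

    public FrontController() {
        // Each command holds one unit of business logic.
        commands.put("greet", input -> "Hello, " + input);
        commands.put("shout", input -> input.toUpperCase());
    }

    // The only job of the controller: look up and delegate.
    public String handle(String action, String input) {
        Command cmd = commands.get(action);
        if (cmd == null) return "unknown action: " + action;
        return cmd.execute(input);
    }

    public static void main(String[] args) {
        FrontController fc = new FrontController();
        System.out.println(fc.handle("greet", "Bruce"));
    }
}
```

In a real servlet, `handle` would be the body of `doPost`, with the action name pulled from the request; each command can then be unit-tested without a container.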

JPT: Which anti-pattern do you run into most often?

Bruce: I tend to see many different versions of the Golden Hammer. I've been on a simplicity kick lately. I think that EJB, many persistence frameworks, clustering, many design patterns, and larger frameworks are prone to misuse. I'm glad that people are starting to move back toward simpler solutions, like command-line development (or a very simple IDE) with Ant and JUnit. I also like the movement back to POJOs for database work, and even containers (like the Spring framework).

JPT: What was the most interesting performance problem that you found and how did you solve it?

Bruce: That's one of my favorite interview questions for a new candidate! I've seen a whole lot of problems with roll-your-own persistence frameworks. Usually, you start by looking at the SQL that's generated. Some frameworks build tiny atomic queries for everything. They are fast, but there are 10,000 of them. You've got to be smart about how you roll them up into fewer queries, and delay execution. I've suggested patching some frameworks, and I've suggested ripping them out and starting over with POJOs, a commercial alternative, or an open-source framework. I think that the guys at SolarMetric do a great job with Kodo, and I think Hibernate has an outstanding start with their product as well.
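
The roll-up Bruce describes can be illustrated with a minimal sketch: instead of issuing one query per id (the classic N+1 trap), collapse them into a single IN-clause query and make one round-trip. The table and column names are hypothetical:

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of rolling up tiny atomic queries into one query.
// "orders" and "id" are hypothetical names; real code would use
// a PreparedStatement rather than string concatenation.
public class QueryRollup {

    // One query per id: N round-trips to the database.
    static List<String> atomicQueries(List<Integer> ids) {
        return ids.stream()
                  .map(id -> "SELECT * FROM orders WHERE id = " + id)
                  .collect(Collectors.toList());
    }

    // Rolled up: a single round-trip fetches the same rows.
    static String rolledUpQuery(List<Integer> ids) {
        String inList = ids.stream().map(String::valueOf)
                           .collect(Collectors.joining(", "));
        return "SELECT * FROM orders WHERE id IN (" + inList + ")";
    }

    public static void main(String[] args) {
        List<Integer> ids = List.of(1, 2, 3);
        System.out.println(atomicQueries(ids).size() + " queries vs 1");
        System.out.println(rolledUpQuery(ids));
    }
}
```

The "delay execution" half of the advice is the same idea applied in time: queue up the ids as they are requested, then build the rolled-up query once, just before the results are actually needed.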

JPT: But we both know that these tiny atomic persistence frameworks start with the idea that O/R mapping is an easy problem. And to be honest, "in the small" it is an easy problem. It's only when you try to scale that these problems show up. Do you try to encourage projects to use persistence frameworks early in a project's life-cycle, and if so, how do you go about encouraging their use given that the need may not be visible?

Bruce: To be honest, I think that persistence frameworks are overused. As many as half of the applications I see exist simply to baby-sit the database: presenting, updating and collecting data. Persistence has always been a tradeoff. When you find yourself spending too much time coding persistence-type code, you probably need to automate. Even then, you can usually use something simple like POJOs. But modern persistence alternatives like SolarMetric's Kodo and Hibernate are surprisingly fast and scalable, especially when a generic caching solution will work for you.
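
The "something simple like POJOs" approach can be sketched as a plain data class plus a hand-written row mapper, with no framework at all. Here a `Map` stands in for a JDBC `ResultSet` so the sketch stays self-contained; the `Customer` type and column names are hypothetical:

```java
import java.util.Map;

// Sketch of framework-free POJO persistence: a plain data class and one
// explicit, hand-written mapping per query. The Map stands in for a
// JDBC ResultSet row; Customer and its columns are hypothetical.
public class PojoPersistence {

    static class Customer {
        final int id;
        final String name;
        Customer(int id, String name) { this.id = id; this.name = name; }
    }

    // The whole "mapping layer": explicit, obvious, easy to tune.
    static Customer mapRow(Map<String, Object> row) {
        return new Customer((Integer) row.get("id"), (String) row.get("name"));
    }

    public static void main(String[] args) {
        Customer c = mapRow(Map.of("id", 42, "name", "Acme"));
        System.out.println(c.id + " " + c.name);
    }
}
```

The tradeoff Bruce mentions is visible here: each new query costs a few lines of mapping code, and when that tax grows too large, a generated or framework-based mapping starts to pay for itself.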

JPT: What change have you seen applied in a project that gained the largest performance improvement?

Bruce: We ditched EJB, in favor of a simple POJO solution.

JPT: Have you been in situations where using a persistence framework has been a benefit?

Bruce: Absolutely. I've worked with a company that developed a custom framework, where we generated the data model, the SQL, and the object model from an XML descriptor. The reference list for the major JDO vendors just keeps growing. Persistence Corp has always had an impressive reference list. People are starting to do some real things with persistence frameworks, and I expect that to continue.

JPT: Thank you Bruce for your time and we wish you all the best in your future endeavors.

(End of interview).


Last Updated: 2022-06-29
Copyright © 2000-2022 All Rights Reserved.