Java theory and practice: Performance management -- do you have a plan?

Knowing when to optimize is more important than knowing how to optimize

Where do performance problems come from? There are many types of programming choices that can lead to performance problems -- inefficient algorithms, redundant computation, poor resource allocation and usage, excessive synchronization, or just plain inefficient design. But more prevalent -- and more damaging -- are mistakes of management and approach, rather than of programming. In this installment of Java theory and practice, Brian Goetz discusses some of the most common performance mistakes he's seen in projects using the Java language.

Brian Goetz (brian@quiotix.com), Principal Consultant, Quiotix Corp

Brian Goetz is a software consultant and has been a professional software developer for the past 15 years. He is a Principal Consultant at Quiotix, a software development and consulting firm located in Los Altos, California. See Brian's published and upcoming articles in popular industry publications.



25 March 2003


Performance management is often considered a black art, because performance problems often don't emerge until the application is completely developed. At that point, it can be very hard to identify their source. Once the cause of a performance problem is sufficiently well identified, however, it is often relatively straightforward to fix. Engineers are generally quite creative (sometimes too creative) about finding more efficient ways to perform a particular task. For any given performance issue, the solution might be as simple as replacing an algorithm with a more efficient one, using caching to reduce redundant computation, or just adding more hardware. But it can be difficult to clearly identify the source (or sources) of a performance problem, and even more difficult to design sophisticated programs so that they don't have performance problems in the first place.
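To make the caching fix concrete, here is a minimal sketch of memoizing an expensive computation. The expensiveTransform() method is a hypothetical stand-in for any costly, side-effect-free calculation whose results are worth reusing; the class and method names are assumptions for illustration only:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // A minimal sketch of caching to avoid redundant computation.
    public class MemoizedTransform {
        private final Map<String, String> cache = new ConcurrentHashMap<>();

        public String transform(String input) {
            // computeIfAbsent runs the expensive work only on a cache miss
            return cache.computeIfAbsent(input, MemoizedTransform::expensiveTransform);
        }

        private static String expensiveTransform(String input) {
            // placeholder for real work -- parsing, formatting, encoding, etc.
            return new StringBuilder(input).reverse().toString();
        }

        public static void main(String[] args) {
            MemoizedTransform t = new MemoizedTransform();
            System.out.println(t.transform("performance")); // computed
            System.out.println(t.transform("performance")); // served from cache
        }
    }

Of course, a cache like this pays off only when the same inputs recur often enough to offset the memory it consumes -- which is exactly the kind of claim that should be verified by measurement rather than assumed.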

While programming decisions -- such as a sub-optimal choice of algorithm or data representation, failure to reuse the results of previous calculations, or poor resource management -- are usually deemed to be the proximate cause of a performance problem, most performance problems also have a deeper cause: the failure to integrate performance management, goals, and measurement into the development process in the first place.

Problem? What problem?

How do you even know when you have a performance problem? For most development teams, the answer (to paraphrase U.S. Supreme Court Justice Potter Stewart's characterization of obscenity) is: we know it when we see it. And this is the heart of the problem -- performance goals, metrics, and measurements are often not considered until it's too late.

The most common performance management strategy is . . . nothing, and it usually takes one of two forms:

  • Ignoring performance entirely until application development is complete
  • Optimizing as you go, which usually means focusing only on micro-performance considerations and ignoring the bigger picture

Both of these strategies share the same underlying problem -- they do not treat performance management as an integrated part of the development process.

Flawed performance strategy A: Ignore performance entirely

The first approach, ignoring performance completely, treats performance as something that can be handled at the end of the project, like writing the release notes or building the installer. Basically, this strategy relies on luck, and because computers get faster and faster every year, it works just often enough to keep getting used.

The problem with relying on luck, even when your odds are pretty good, is that when you have a performance problem, you have no framework for approaching it, identifying its source, or addressing it in anything but an ad hoc manner. Nor have you scheduled time in your development plan for performance measurement and tuning. It's kind of like a site that has made no attempt at a coherent security policy beyond installing a firewall with the default configuration, and then discovers it has been hacked. Where do you start?

Flawed performance strategy B: Optimize as you go

The other common -- but even worse -- approach is to let micro-performance considerations drive architectural and design decisions. Developers love to optimize code, and with good reason -- it is satisfying and fun. But knowing when to optimize, both in terms of what code to focus on and when in the development cycle to address performance issues, is far more important. Unfortunately, developers generally have horrible intuition about where the performance problems in an application will actually be. As a result, they waste effort optimizing infrequently executed code paths, or worse, they compromise good design and development practices to optimize a component that doesn't have any performance problems in the first place. When you've got your head down in the code, it's easy to miss the performance forest for the trees.

Making each individual code path as fast as it can possibly be is no guarantee that the final product is going to perform well. No amount of local optimization will make up for a fundamentally inefficient design, even if every component is implemented to be as fast as possible. The optimize-as-you-go strategy substitutes a focus on low-level performance considerations for having a performance strategy for the project as a whole, with the added detriment of convincing yourself that you actually have a performance strategy.

One of the many problems of optimizing as you go is that it ignores the inherent risks in optimization. There are a few optimizations that also happen to result in better design and fewer bugs, but these are the exceptions. For the most part, optimizations involve trade-offs between performance and other considerations like clean design, clarity, flexibility, and functionality. Optimization has costs and risks: it could introduce bugs, limit the functionality of the code, or make it harder to use or maintain. Before you incur these costs, make sure there's a benefit worth having.


Make performance management part of your development process

Performance measurement and planning should be integrated into the process from the beginning, with separate, interleaved iterations for development and performance measurement and tuning. This means setting performance targets and goals, having a performance measurement plan in place, and frequently reviewing the performance of the code as it is developed. Keep your old test results, preferably in a database, so you can easily compare how performance changes as you change the code.
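As one low-tech way of keeping old results around, the sketch below appends each measurement to a CSV file -- a stand-in for the database recommended above. The file name, column layout, and benchmark label are assumptions for illustration, not a prescribed format:

    import java.io.IOException;
    import java.nio.file.*;
    import java.time.Instant;

    // A minimal sketch of recording performance results across iterations.
    public class PerfLog {
        private static final Path LOG = Paths.get("perf-results.csv");

        public static void record(String benchmark, long elapsedNanos) throws IOException {
            // one row per measurement: timestamp, benchmark name, elapsed time
            String row = String.format("%s,%s,%d%n", Instant.now(), benchmark, elapsedNanos);
            Files.write(LOG, row.getBytes(),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }

        public static void main(String[] args) throws IOException {
            long start = System.nanoTime();
            // ... run the code under test here ...
            record("string-concat", System.nanoTime() - start);
        }
    }

Even a log this simple makes regressions visible: sort by benchmark name and timestamp, and any iteration that got slower stands out immediately.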

Having separate iterations for development and performance allows you to focus on writing functional, bug-free code during the development iterations, knowing that you will have an adequate opportunity to improve performance soon enough, if necessary. If you think of a clever trick to make your code faster, put a comment in the code detailing your idea, but don't implement it now! This is not the time for optimization. Return to it when you are focusing on performance, if it turns out to be necessary. Performance optimization should be driven by performance goals and supported by performance measurement. Anything else is just "playing."

Measure twice, and then some

Measurement is a critical element of performance management. Think a given tweak will make the code run faster? Be prepared to prove it. Use your performance measurement tools to test the performance before and after the tweak. What if you can't measure an improvement? Then be prepared to back your tweak out. Why risk breaking code that works if you can't measure a benefit?
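Here is a sketch of what a before-and-after comparison might look like in plain Java, assuming the two variants can be wrapped as Runnables. Keep in mind that System.nanoTime() with a warmup loop is a crude substitute for a real profiler or benchmark harness -- JIT compilation, garbage collection, and OS scheduling can all skew a naive timing:

    // A minimal before/after timing sketch; the variants shown are examples.
    public class BeforeAfter {
        static long timeIt(Runnable task, int warmups, int runs) {
            for (int i = 0; i < warmups; i++) task.run();   // let the JIT warm up
            long start = System.nanoTime();
            for (int i = 0; i < runs; i++) task.run();
            return (System.nanoTime() - start) / runs;      // average nanos per run
        }

        public static void main(String[] args) {
            Runnable original = () -> {
                String s = "";
                for (int i = 0; i < 100; i++) s += i;       // repeated concatenation
            };
            Runnable tweaked = () -> {
                StringBuilder sb = new StringBuilder();
                for (int i = 0; i < 100; i++) sb.append(i); // single buffer
            };
            System.out.println("original: " + timeIt(original, 1000, 10000) + " ns/run");
            System.out.println("tweaked:  " + timeIt(tweaked, 1000, 10000) + " ns/run");
        }
    }

If the numbers for the tweaked version aren't reliably better than the original's, that is the measurement telling you to back the tweak out.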

During performance iterations, measure the performance of the application or its components, and compare them with measurements from previous iterations. Did something get slower? Figure out why. You don't necessarily have to change it if it doesn't fall short of your performance goals, but now you've gained valuable feedback about the performance impact of your changes.

Got goals?

If you don't have quantitative performance goals and a measurement plan to support them, performance tuning is almost pointless. How do you know when you're done? Other phases of development, such as coding, testing, and packaging, have defined goals -- implement this set of features, fix these bugs, and so on. The performance phase should have structure and goals, too.

It is particularly important to have performance goals when performance concerns are imposed externally, whether from a customer or from another department within the company. When someone tells you to make the program faster, you should first ask, "How much faster do I have to make it?" Otherwise, you might invest more resources than necessary in tuning and still not make the customer happy. It's very frustrating to put in a substantial effort to make your program run 30 percent faster, only to be met with a response of "Gee, I was hoping for something like 50 percent faster."


Summary

Performance management is much more than optimization. It's having a framework for deciding when -- and when not -- to optimize. You should make these decisions in light of explicit performance goals, measurement, and planning, rather than on the basis of intuition.

Resources

  • Jack Shirazi's book Java Performance Tuning (O'Reilly & Associates, 2003) contains a wealth of advice, not only on individual optimizations but also on how to measure performance and how to approach performance tuning intelligently.
  • Jack Shirazi's Java Performance Tuning Web site contains links to many performance-related articles and tips.
  • Java Platform Performance: Strategies and Tactics (Addison-Wesley, 2000), by Steve Wilson and Jeff Kesselman, contains some in-depth case studies of actual performance issues encountered during the development of the Java class libraries.
  • The three-part "Design for Performance" series, also by Brian Goetz (JavaWorld), offers some tips on how design decisions can affect performance.
