The charts shown in the Java document-model performance article use the best time measured for each test. The test program also computes average times, but the average times vary considerably more than the best times, depending on the documents used, the number of test passes, and the order in which the documents are processed.
The two figures below compare the average times for one test run with the best-time results from the same run. The average-time values use the default test program settings, which exclude the first pass of each test on each document. The first-pass time is usually much higher than the average, because the HotSpot JVM has not yet compiled and optimized the code being measured.
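The measurement approach described above can be sketched as a simple timing loop: run several passes, discard the first pass (which is skewed by HotSpot warm-up), and track both the best and the average of the remaining passes. This is only an illustrative sketch, not the article's actual test program; the class name, pass count, and placeholder workload are all hypothetical.

```java
public class TimingSketch {

    // Trivial placeholder workload; the real tests build or traverse
    // documents with the model under test.
    static void doWork() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i;
        }
        if (sum == 42) {
            System.out.print("");  // guard against dead-code elimination
        }
    }

    // Times the workload over the given number of passes, skipping the
    // first pass, and returns {averageNanos, bestNanos} for the rest.
    static long[] measure(int passes) {
        long best = Long.MAX_VALUE;
        long total = 0;
        int counted = 0;
        for (int pass = 0; pass < passes; pass++) {
            long start = System.nanoTime();
            doWork();
            long elapsed = System.nanoTime() - start;
            if (pass == 0) {
                continue;  // first pass skewed by JIT compilation
            }
            best = Math.min(best, elapsed);
            total += elapsed;
            counted++;
        }
        return new long[] { total / counted, best };
    }

    public static void main(String[] args) {
        long[] result = measure(10);  // hypothetical pass count
        System.out.printf("average ms = %.3f%n", result[0] / 1e6);
        System.out.printf("best ms    = %.3f%n", result[1] / 1e6);
    }
}
```

Because the best time is the minimum of the same samples that go into the average, it can never exceed the average; the gap between the two reflects run-to-run variation from garbage collection, scheduling, and ongoing JIT activity.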
Figure 1. Average document build time
Figure 2. Best document build time
Note that the scales differ between the two charts, with the average times generally about 20% to 30% higher than the best times. The relative performance of the different models remains roughly the same, however, whether average or best time is used for the comparison.