iMac Performance
Benchmarking and performance

By: David K. Every
©Copyright 1999

PC Magazine responded to this article, so I respond to their response in iMac Performance Redux.

There have been a few articles of late bashing Apple's claim of superior performance over PCs.

For years (like 12), Apple and Macs just crushed the PC in real-world performance (Application benchmarks). The Mac's I/O was faster, the OS much cleaner and tighter, and Application comparisons showed the Mac beating PCs in almost every category. But the processor alone usually wasn't faster (at least according to tainted benchmarks). The press, and most PC bigots, kept saying that benchmarks are what matter and to ignore the Application tests -- or at least that is what they kept doing. It was quite irritating.

Now, the Mac is stomping the PCs in processor benchmarks, and suddenly the press gets it -- NOW it is time to look at Application comparisons and stop looking at the benchmarks. While I don't disagree, I think the timing is suspicious, as are many of their tests -- they seem to want to spin things in the way that makes the Mac look the worst. Let's look into their claims.

Is it true?

Many people have been asking me, "Are these articles true?" The answer is "yes"... and "no".

When you look at just the processor alone, and processor-intensive Applications (or parts of Applications), there is no doubt that Apple's and Byte's claims of PowerPC performance superiority are true. The PowerPC processor is definitely faster than Pentiums -- by a lot. PCs are stomped. Apple's claims are valid.

But computers are complex Systems, with many variables and parts. Tuning these Systems is far more complex than just sticking in a fast processor. Let's go back to car analogies -- if you put a big motor in a big car, it will go slower than the same motor in a small, light car. Not only that, a powerful car needs a good transmission and suspension to support it. Like cars, there are many subsystems in a computer that need to keep up with the fast processor.

In some ways, in the last few years, the PCs have enjoyed an edge in some of those other subsystems (or certain parts of them). So while Apple's claim that they have a better and more reliable motor is true, others' claims that their cars go from 0 to 60 MPH faster can be true as well. What we need to ask ourselves is: what matters to us?

What matters in performance?

When you are measuring usable performance, Application tests are what matter. What are you going to be doing? Is the computer going to be faster for you, or not? Who cares if all the subsystems you aren't stressing are faster or not -- if you are not using them, then it doesn't matter. So doing Application tests is the better way to measure performance for individuals.

The problem is that it is very hard to objectively measure these things -- and very easy to taint the results.
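An application-level test is, at bottom, just a timer wrapped around the operations you actually perform. Here is a minimal sketch in Python of that idea; the workload functions are hypothetical stand-ins I invented for illustration, not anything from the magazines' test suites:

```python
import time

def benchmark(task, repeats=5):
    """Run task several times and return the best wall-clock time in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        task()
        best = min(best, time.perf_counter() - start)
    return best

# Hypothetical stand-ins for the operations a real user performs all day.
def spell_check():
    words = ("recieve teh seperate word " * 5000).split()
    dictionary = {"receive", "the", "separate", "word"}
    return [w for w in words if w not in dictionary]

def search_and_replace():
    text = "the quick brown fox " * 20000
    return text.replace("fox", "dog")

for name, task in [("spell check", spell_check),
                   ("search/replace", search_and_replace)]:
    print(f"{name}: {benchmark(task):.4f}s")
```

The taint problem lives in the choice of `task`: whoever picks the workloads picks the winner, which is exactly the complaint made below about the magazines' selections.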

For example, Microsoft's Macintosh Apps still have some "Windows emulation" code stuffed in there. That emulation layer may be PowerPC native, but to the best of my knowledge it is still in there. It still has to remap Windows calls into Mac calls, and that is overhead. This remapper is an extra layer -- and MS has been notorious for slow Mac Applications for a decade. So while the new version of Office for Mac is far better than older versions, comparing it on Mac to PC is unfair to the Mac, since it can only prove Microsoft's incompetence at writing Mac Apps -- it does not prove that the Mac is faster or slower. If you compare the same exact functions in another application (like ClarisWorks), you can get dramatically different results. In fact, ClarisWorks seems to prove the hypothesis that it is Microsoft that has the problem (and that the Mac is superior). Yet, in the real world, if all that you are going to be doing is Microsoft Office, then those real-world numbers ARE things you should look at.

I run ClarisWorks or FrameMaker (for documents). Comparing Office's performance is decidedly unrepresentative of the real world, for me. For others it may be valid. But these magazines' claims that the iMac is NOT as fast as Apple says are just as deceptive as Apple's claims. In fact ClarisWorks tends to support that the Mac is superior -- and since it is what I use (and the software that comes with the machine), it is more relevant than Office (to me).

In fact, if you compare ClarisWorks (AppleWorks) on the Mac to Office on the PC, you get a dramatically different performance curve, where the Mac is far, far faster. Since that is how many people work on their respective platforms, it may be more representative of true performance than comparing ClarisWorks to ClarisWorks or Office to Office.

The other thing to remember is that there are things you care about performance on, and things you do not. Having my machine scroll a 50-page document in 10 seconds instead of 15 seconds is utterly and completely useless. I don't do that. I use page-down or other tricks if I need to go to the bottom of a document. So while this test may show how fast the graphics chip is, it is not close to representing how I will use the machine. Furthermore, I do not care about 5 seconds on a scroll, since it is not a significant amount of time.

Not only is scroll speed pretty irrelevant, and a measurement of video performance and not processor performance, but there are cases where scroll speed should be slowed down. In fact, Apple has scroll speed limiting for many parts of the Mac OS (and it could have been involved in some of these tests). Remember, if your screen scrolls too fast, you can't see what is going by, and it is hard to hit your target. So Apple intentionally limits speed on smaller windows.

Processor performance may be far more significant. For example, if it takes 5 minutes to do a Photoshop filter on a Mac versus 10 minutes on a PC, that is significant. I do compiles at work -- building an Application requires 20-30 minutes on an iMac, and 3-4 hours on a fast PC. So there are areas where processor performance is more likely to matter (to me) -- and those areas are where the Mac is more likely to have a big advantage. But that isn't everyone's usage.

Yet even there, there are many ways to "taint" the study. For example, MMX optimizes a small fraction of Photoshop filters, in a small subset of sizes and options. If you are using those, the Pentium can appear to be far faster than it really is. If you are ONLY using those filters, in those sizes, with those options, then those tests are valid. But I don't think that is the normal case -- which is why I get annoyed at many of those specialized Photoshop "comparisons." In any case, I could probably sit down at two test machines and, inside of 20 minutes, have a test suite that shows the iMac as being FAR FAR faster than the PC (or the exact opposite of what some testers claim).
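The cherry-picking complaint is easy to demonstrate numerically: with the same raw data, the "winner" depends entirely on which subset of tests you report. Every number below is invented for illustration (lower is better); nothing here comes from any actual filter timings:

```python
from statistics import geometric_mean

# Invented per-filter timings in seconds (lower is better) for two machines.
mac = {"blur": 4.0, "sharpen": 3.5, "rotate": 2.0, "resize": 1.8, "unsharp": 5.0,
       "emboss": 3.0, "despeckle": 2.5, "levels": 1.2, "curves": 1.4, "noise": 3.8}
pc  = {"blur": 6.0, "sharpen": 5.0, "rotate": 3.2, "resize": 1.0, "unsharp": 7.5,
       "emboss": 4.5, "despeckle": 4.0, "levels": 0.9, "curves": 1.1, "noise": 5.5}

def score(results, tests):
    """Geometric mean over a chosen subset -- the subset decides the story."""
    return geometric_mean([results[t] for t in tests])

all_tests = list(mac)
cherry_picked = ["resize", "levels", "curves"]   # the PC's three best tests

print("full suite:    Mac", round(score(mac, all_tests), 2),
      "vs PC", round(score(pc, all_tests), 2))
print("cherry-picked: Mac", round(score(mac, cherry_picked), 2),
      "vs PC", round(score(pc, cherry_picked), 2))
```

On the full (invented) suite the Mac wins handily; restrict the report to the three PC-friendly tests and the PC "wins" -- the 20-minutes-to-prove-anything point in code form.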

Conclusion -- What matters over all?

I would value these hard-hitting comparisons much more if these authors, and the media at large, had been so diligent at investigating Intel's and the PC industry's many fraudulent and deceptive claims over the last decade (or more). They seem to have a double standard -- Apple's claims have to be gone over with a fine-toothed comb, picked apart pedantically, and shown the ways that they are wrong (while minimizing the ways that they are right) -- yet PC manufacturers don't deserve the same scrutiny. So while these comparisons may not be technically wrong, they are not being fair or honest either.

Performance is very specific to what you are doing. The only comparisons that matter are for the things you will be doing. Even then, unless you are doing things in which the computer is really slowing you down, then performance probably does not matter anyway, and it is just a penis measuring contest.

In the end, the iMac certainly performed as well as machines that often cost far more (not including the dramatically better installation, usability, maintenance and support costs for an iMac). If there was any performance advantage on PCs, think of how many 5- or 10-second wins (here and there) it would take to make up for 30 minutes of setup time alone. How about the first time you go to visit configuration hell on a PC -- how many years (decades, centuries) of "faster day-to-day" use would that cost you? Even if PCs were faster (which I don't buy), they are still not the better bargain. I suspect that these authors are focusing on speed and benchmarking as a way to avoid looking at the many benefits of the iMac.

Then there is usability. You can read my entire Interface section for opinions on that. Often the PC makes me do things that I shouldn't have to (me adapting to the computer, instead of the other way around). Often the PC slows me down by HOW it works, not only the speed at which it works. Those things are not mentioned in these articles for good reason -- they are biased pro-PC rags that are trying to get advertising dollars, and want to keep PC buyers convinced that they made the right decision -- whether they did or not. Fortunately many new users are buying iMacs, and "get it", even if these writers do not.

What matters to me is not just speed, or how much horsepower a car has. Apple's claim that the Macs have more horsepower is valid. But that is only one variable in which car/computer we should buy. There are certainly subjective "handling" characteristics that can confuse matters. I could put a V8 in a Pinto (as I helped a buddy do in high school), and I might THINK I have the best car on the planet -- but that doesn't make it so. That high-maintenance, high-risk (self-detonating) vehicle is not the choice I would make as a wizened adult (even if it was fun as a tinkering kid). PCs are often good at fast, bad at everything else. So in some areas, and some tests, I do not doubt that the PC will come across better -- even in some real-world results. But does that matter to me? I care about many aspects of handling, styling, reliability, safety and so on. That is why I own a Mac -- and would recommend an iMac over an 800 MHz Pentium II (if one were available).

PC Magazine's tests

I found many of their tests to be tainted towards things the PC was better at. So much so that I find it suspiciously like INTENTIONAL biasing.


They did two benchmarks -- search and scroll.

  • Search -- I'm not sure how Adobe implements it (so I can't say what it would be tainted towards), but it is probably a fair benchmark (and the Mac lost). Still, it is not a function I do very often, nor do I care about a 3-4 second difference.
  • Scroll -- completely based on the video chips. I bet you could use 10 different PCs and get 10 different results. I find the Mac's scroll performance zippy, so a little faster would be irrelevant. This has nothing to do with the processor.

So where are the dozens of other tests that would make this a valid measurement of this Application?


An Application I use a lot. This one matters to me. While another "too small to be representative" type of test, the Mac did crush the PC in a spell check -- being 3 times (and 13 seconds) faster -- and was comparable in search and replace.


Also something I use, and it showed the Mac comparable to either PC in performance. Or in other words, a 233 was as fast as a 333 or a 400, even in the real world, on things I would care about. Of course this test too was tainted towards I/O-specific things (like imports and exports) -- the Mac may have done far better in other areas. And as always, the tests were too few to be statistically, or realistically, valid.

Excel / Word

As I mentioned before -- I don't use them. The samplings were small, and specifically tainted towards PC things (I suspect). Yet even then, the differences were hardly large enough to matter. As I also mentioned -- I think this supports the claim that Microsoft can't write good code (especially Mac code).


I've done Photoshop tests, and usually the Macs stomp the PCs. There are exceptions... but the fact that this demo chose to show only 3 exceptions (to what I consider the rule) tends to show how biased this magazine is. Again, give me 20 minutes, and I bet I could show you 3 filters that would prove the exact opposite.


Games are such a trick. I'd really want to look at the specifics to see what goes on. I know a friend and I were setting up Diablo on a Mac and a PC to play a network game at work. After 5 tries on various PCs, we couldn't get any of them to use the network (or sound cards) to the point where we could play. The Mac worked on the first try. We duplicated the Mac CD (which is not always a copyright violation, depending on how you use it), installed it on another Mac, and were playing in no time. Did they compare the 5 minutes versus 1 hour of setup time? I play many games on both my machines (I have a Mac and a PC -- the PC is newer, and has the higher-end video card), and I find that my Mac is not only easier to install on, but smoother and better. I haven't turned my PC on in months.


There was a time (before JIT compilers came out) when Apple was just 3-4 times faster than PCs at Java. Then people in the PC world spent a ton of money on optimizing JITs for Java (Just-In-Time, or dynamic-recompiling, compilers). Apple didn't spend as much, and the PC leaped ahead in performance. Of course most of the PC Java implementations became notoriously buggy for a long time, and many are nonstandard (like Microsoft's). Apple spent the time to do it right. They are releasing it with, or immediately after, System 8.5 (in another month or two). From things I've seen it is as fast as, or faster than, the PC versions. Strategically timing WHEN you do which benchmarks is an interesting way to taint results. While it is true that PCs are faster TODAY at Java, it is not true that they always have been, or always will be. Nor is it true that the PC is the best platform to run Java (reliability and so on).
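The JIT idea itself is simple: pay the translation cost once, instead of on every run. Python's built-in compile() and eval() make a rough stand-in for the concept (this is an analogy I constructed, not how a real Java JIT works internally):

```python
import time

source = "sum(i * i for i in range(200))"

def run_interpreted(n):
    # Re-translate the source text on every execution (like a pure interpreter).
    for _ in range(n):
        eval(compile(source, "<bench>", "eval"))

def run_precompiled(n):
    # Translate once up front, then reuse the compiled form (the JIT idea).
    code = compile(source, "<bench>", "eval")
    for _ in range(n):
        eval(code)

for fn in (run_interpreted, run_precompiled):
    start = time.perf_counter()
    fn(2000)
    print(fn.__name__, f"{time.perf_counter() - start:.4f}s")
```

The precompiled version wins because the per-run translation cost drops out -- which is why the PC vendors' JIT spending produced such a leap, and why the comparison looked so different before JITs existed.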


This article made PC Magazine's weak and biased comparison look like hard science. The most obvious point was that they compared the wrong machine. A PowerMac G3/233 is quite a bit slower than the iMac, owing to a different bus speed and I/O chip set, and because the iMac keeps the ROM in RAM (which makes it something like 10-15% faster, at least).

But there are some grains of wisdom in the article. Like when NSTL's testing said, "Apple's G3 processor was faster overall, but trailed the PentiumII's in SOME application tests". I agree. Then they asked "Is speed more important than usability?" -- no! That is why I will use a Mac. The problem is that they tried to imply that the PCs were more usable, and the tone seemed to imply a big fraud on Apple's part (while ignoring the fraud that PC makers have been committing for years).

This whole article jumped around, throwing in a comparison of the AMD processor (and not the Pentium). Then they said it was only 10-30% faster at the same clock rate (something I don't buy, as I've seen far different results on my own). But then they say that "'G3-based PowerMacs' performance, from the user's perspective, is not fantastically superior to that of the comparable Pentium II-based systems". OK, fine. So it is superior, just not fantastically so. I can live with that.

The conclusion was still pretty shoddy, done with little backup, and no way to prove or disprove their claims. It came across more as PR to appease their PC advertisers than any hard comparison.

There was a follow-up, where the author explains how they did not intentionally defraud the numbers (by comparing video performance and implying that it was processor performance), and I understand and accept the mistake and clarification. The point is that most of these authors don't seem to understand hard science or studies, and don't seem to care. It takes way too much time to do it right -- so they do shoddy work (but as good as the others'), and will always get the conclusions they want.

Created: 08/29/98
Updated: 11/09/02
