Innovation: Design and Programming
Who innovated what

By:David K. Every
©Copyright 1999

One of the most innovative parts of the Macintosh (in 1984) was the design. It was a paradigm shift, and really opened my eyes and forced me to rethink everything I knew and understood about computers. This is what made the Mac so different. For years I couldn't explain all the changes -- but the more I thought about why it was different, the more I understood about computers and design in general.


There was some Object Oriented design done way back, and the potential was understood in some academic circles -- but not by most programmers. Xerox brought together a lot of researchers at PARC to combine and further their research into various areas of computers (they borrowed, and they furthered a lot -- that's innovation). This included Networking, User Interface, and OOD. They pushed all of these quite a bit, and integrated them a bit -- but most of what these researchers did was one-off demos and studies and papers -- not really an Operating System or commercial quality anything. They really were into the research, and the philosophy was that of academics. Which isn't bad, but isn't the same as a production environment where you have to work with the pragmatics of a product, the realities of users, and all the workarounds and spit and polish required to make a product.

This is why, when Apple started working on the Lisa and the Mac, so many researchers jumped ship and took the opportunity to create. Steve Jobs used to say, "Real Artists Ship [product]!" It wasn't about just theory, it was about finishing it off and making it useful. Well, Object Oriented Design was one area that Apple furthered -- and it influenced the Mac a lot.

Object Oriented

The basic concepts of Object Oriented Design can be done without an Object Oriented Language -- the Language just makes it easier to "follow the rules", or I should say it is harder to break them. And there are object-like languages (like Ada) which are sort of hybrids that have (had) some features but not all features of an Object Oriented Language. (I think the remaining features have been added since the early 80s -- I'm talking origins, not current implementations.)

The basics of Object Oriented Programming (OOP) are a few things:

  • Encapsulation -- this is localizing program fragments into objects, with each object containing its data and code together. Everything that acts on an object's data SHOULD ask that object's methods (routines/code) to do it, and not alter the data directly. This localizes bugs (since most bugs involve data) as it localizes access (since all access goes through certain controlled points) -- which makes changes easier.
  • Data Hiding -- you want to keep certain parts of a program (object) hidden. There is the public interface (what programmers or the outside world sees and uses), and all the private implementation details (what others shouldn't know about or alter directly). If you only go through the public interface, then all the private implementation details are abstracted -- you can change them without it breaking everything else. Also fewer people need to know about them, so it is easier to learn and understand what is going on.
  • Inheritance and Overriding -- you want to reuse code as much as possible. If an object does something almost exactly like you want, then you want to "inherit" as much behavior (code) from that object as is possible, and only "override" (alter) very small amounts of code/behavior. This prevents you from having to rewrite everything to do something new.

Obviously there is a bit more to it than this, but these are the basics. There are more things (terms) that have evolved (like Polymorphism, Multiple Inheritance, Overloading, and more) -- but most are tools to implement the above concepts.
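The three basics above can be sketched in a few lines of code. This is a minimal illustration in Python, not anything from the Mac Toolbox -- all the class and method names here are invented for the example.

```python
class Window:
    """Encapsulation: the window's data and the code that acts on it live together."""

    def __init__(self, title):
        self._title = title          # data hiding: "_" marks a private implementation detail
        self._visible = False

    # Public interface -- callers go through these and never touch _title directly,
    # so the private details can change without breaking anyone.
    def title(self):
        return self._title

    def show(self):
        self._visible = True
        return self._draw()

    def _draw(self):                 # private implementation detail
        return f"[{self._title}]"


class DialogWindow(Window):
    """Inheritance: reuse everything from Window, override only what differs."""

    def _draw(self):                 # overriding: alter one small piece of behavior
        return f"<<{self._title}>>"  # dialogs get a different border


w = Window("About")
d = DialogWindow("Save?")
print(w.show())   # [About]
print(d.show())   # <<Save?>> -- show() is inherited, only _draw() was overridden
```

Note how DialogWindow rewrites almost nothing: show() and title() come along for free, which is exactly the "inherit much, override little" payoff described above.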

Object Oriented Operating System

The original Alto was programmed in Smalltalk (and other Alto languages like BCPL and Mesa). Smalltalk was a very nice Object Oriented Language -- that had some shortcomings. It was a memory pig (for the 1970s and early 1980s), it was slow, and it required a lot of horsepower (big expensive computers) to really use. Today most of those things are no longer valid -- but remember it took an entire minicomputer (in the 70s) to allow one user to run one program at a time (pretty much). Apple didn't have that luxury. They were trying to make a real system that people could afford. They chose Pascal for the public interface -- since it had a better syntax than C, and was taught in all the schools -- and they used assembly for their private implementations, because it made smaller code that was faster. Neither was a great language for doing OOP -- but there really weren't any other choices in the late 70s and early 80s (that ran well on microcomputers) -- so they made do. Remember, real artists ship -- and they didn't have the luxury of waiting around or rolling their own.

Many of the concepts of the Mac came from people who were familiar with OOP, so they were going to do their best with what they had.

They localized (encapsulated) code into many modules (managers). They had QuickDraw, the Dialog Manager, the Resource Manager, the Toolbox, Controls, the Window Manager, the Event Manager, and so on. They did their best to encapsulate the design wherever they could. Others had grouped a little, but the Mac was just astounding with its richness and logic.

This grouping went beyond just normal procedural modularization -- they localized data, and tried to make many public interfaces for accessing that data -- and tried to keep some private implementation details hidden (data hiding). It was a very good attempt at OOD for 1984, before anyone else outside of academic circles was taking it seriously -- much less in microcomputers. Even concepts like having a Resource Fork and a Data Fork in the Mac's file system can be thought of as a Public Interface (Resources: standardized data) and Private Data (the Data Fork: which others can't alter). I didn't know OO theory back then, but I could taste it all over the Mac design.

The Mac OS had many different mechanisms for inheritance (overriding behavior and data). Many of the data structures had the ability to be added to or overridden. More than that, there was a mechanism for overriding almost any call in the system. Apple could replace them after the fact, and programmers could not only replace them (override), but they could inherit much of the original behavior as well (known as either a head patch or a tail patch). Even extensions and INITs were ways to inherit from and extend the OS itself.
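The head patch / tail patch idea can be shown with a rough analogy. On the real Mac this was done by swapping entries in the 68K trap dispatch table; the sketch below just swaps a function reference in Python, and the function names are invented for illustration.

```python
# An invented stand-in for some OS call that a patch might wrap.
def draw_line(points):
    return f"line through {len(points)} points"

_original = draw_line                # save the original, just as a patch saves the old trap address

def patched_draw_line(points):
    log = ["before"]                 # "head patch": run new code before the original...
    result = _original(points)       # ...then inherit the original behavior by calling through
    log.append("after")              # "tail patch": run new code after the original returns
    return result, log

draw_line = patched_draw_line        # install the patch "after the fact"

print(draw_line([(0, 0), (1, 1)]))
```

Every caller of draw_line now gets the patched behavior without being recompiled -- which is why Apple (and third parties) could override almost any system call this way.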

The whole concept of plug-ins was not only created on Macs -- it was how the MacOS worked to begin with.

So the Mac was really the first Object Oriented Operating System (for microcomputers).

Now don't get me wrong, Apple didn't have the luxury of a good Object Oriented Language at the time -- so in some ways the Mac fell short. There were also scheduling pressures that prevented them from having the time to protect all the data (make it opaque), and have "accessors" to alter any data in the system. They also made things highly optimized (small, assembly language, integrated, etc.) -- which was great for users at the time, but made things harder to change and very interdependent -- which is anti-OOD (or so people figured out later). So it was only a really good attempt -- but a phenomenal one for its time.

Apple took the OO concepts further. For programming the Lisa (and later the Mac), programmers used the Lisa Workshop. Apple went on to create an Application Framework, and expanded the Pascal language to include classes (which they named Clascal). Apple then worked with the creator of Pascal (Niklaus Wirth) to standardize this language (Clascal) and put it in the public domain, and it became Object Pascal. And Apple furthered their Application Framework (a collection of classes in a template Application), which became the MacApp framework -- and third-party frameworks for Macs followed as well (like TCL). All this stuff was revolutionary for its time.

MacApp was later borrowed from heavily by Borland to create OWL (for PC development) -- and of course Microsoft didn't want to be left out, so they made their MFC Framework after that (and borrowed from everyone else). Borland also later used Object Pascal, and kind of tried to take credit for Apple's work there too. In many ways MacApp was the first Application Framework on microcomputers and the foundation for the Windows Class Frameworks as well. (Smalltalk was also an App framework, but it wasn't really being used on micros or for what could be called mainstream commercial Apps -- though it was certainly nice for rapid development and some specialty Apps.) I personally don't know of any Application Frameworks that existed before MacApp (for micros) -- but I'm sure some existed somewhere (Apple just popularized the idea), and there were lots of Class Libraries (frameworks to do other things) -- just not ones that were template Applications, where you just plug in your extra code and go.


Apple wasn't the only company doing Object Oriented stuff by any means.

When Jobs later left Apple to create NeXT, he was still very excited about what the Mac had done with OOD/OOP -- and they wanted to further it. NeXT did further it (a whole lot) with their NeXTSTEP (OpenStep). First they improved on C by working with others on Objective-C (which is really a far better Object Oriented Language than C++) -- and they tried to bring OOD into much more of the Operating System, with some success.

In fact, NeXT's problem was that they were too far ahead. In 1989 the world wasn't ready for what they delivered as far as an OS. It was nice for the academic few -- but was way ahead of the mainstream, and ahead of the hardware capabilities needed to support it. NeXT also mistargeted a bit in a few areas (as if the world needed yet another hardware platform at that point, or as if we were ready to get rid of the floppy a decade ago). But it is a good foundation to start from.

Apple wasn't standing still either -- Apple was working with LISP people on Object LISP (CLOS). In response to NeXTSTEP, Apple started two different next-generation Operating Systems: Pink (which later became Taligent with IBM) and Amber (which later became OpenDoc with IBM and Novell). Apple also started a neat Object Language and development environment called Dylan. And another (cross platform) Framework called Bedrock (with Symantec). Apple was all over OOD, and has innovated, or tried to -- but sadly most of their projects failed for one reason or another, even if their innovations and lessons learned live on.

Dylan was a timing failure. Bedrock/Symantec was a partnership failure (but much of the code was later borrowed for OpenDoc or ODF). Taligent was lost to politics (once spun out they had no influence at Apple, and IBM had lots of control -- so Apple lost interest: both NIH and loss of direction). OpenDoc lost momentum because Novell dropped the ball, IBM got behind and started doing their own thing, and Java got too much hype despite there not being that much overlap in critical areas. But all were good efforts and did far more than people give them credit for.

Others have come around and tried to make more Object Oriented Operating Systems. Most notable is IBM with OS/2. IBM's SOM, and the OOD ideas in all (most) of their OSs, have been trying to push some boundaries forward -- in that slow, plodding, huge, annoyingly over-engineered (but very versatile) IBM way.

Even Be with their BeOS is trying to do something -- which is fast, light, Object Oriented. Too bad they chose the worst possible language for what they were trying to do (C++). To really do what needs to be done, you can't use C++ alone, which is part of the reason why IBM created SOM to fix the problems in OS/2, and why NeXT had co-created (worked with others on) Objective-C in the first place. Even Microsoft created their proprietary, and somewhat pathetic, hack known as COM (DCOM) to try to fix the problems -- now in its 3rd or 4th generation it is almost usable (2 more generations and it will be too bloated to use, if Microsoft follows their usual product evolution). Java is good as well -- and getting better all the time. C++ sucks, both in syntax and in that it is just too damn flexible for good OOD -- it is like trying to build a skyscraper out of barbed wire and papier-mâché. You need more rigidity (less flexibility), and more rules.

Personally, I'm an OOD advocate, but not a purist. I believe in OOD at all the high levels (and most public, high-level interfaces) -- but I also believe in nice, tight, C (Procedural) APIs underneath. I don't know that drivers or plug-ins should HAVE to be OOD, or that some low-level stuff doesn't work just fine as procedural code -- as long as it is compartmentalized (encapsulated) by its own purpose. A fast foundation, with some more flexibility where needed (full control over memory), and where most people won't have to interact with it. I think the Carbon Libs and YellowBox have a pretty good chance to migrate in the right directions (one downwards, the other upwards) and do a pretty good job of both -- in a layered architecture.
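The layered idea -- a tight procedural core with an object-oriented public face -- can be sketched briefly. This is an invented example (the names are made up, not any real Carbon or YellowBox API), just to show the shape of the architecture.

```python
# --- procedural "C-style" foundation: plain functions, explicit state ---
def buffer_create(size):
    return {"size": size, "data": bytearray(size)}

def buffer_write(buf, offset, byte):
    buf["data"][offset] = byte

# --- object-oriented layer that most programmers would actually use ---
class Buffer:
    """Public OO interface; the procedural core underneath stays hidden."""

    def __init__(self, size):
        self._buf = buffer_create(size)   # delegate to the fast foundation

    def write(self, offset, byte):
        buffer_write(self._buf, offset, byte)

    def __len__(self):
        return self._buf["size"]


b = Buffer(4)
b.write(0, 0xFF)
print(len(b))   # 4
```

The point is that the low-level functions can stay lean (and be called directly by drivers or performance-critical code), while everyone else gets the safer, encapsulated object on top.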

Microsoft of course jumped on board the bandwagon in 1994 (5 - 10 years late) and promised a fully Object Oriented version of NT (sort of a follow on to NT) by like 1995 -- it was called Cairo (back when Windows95 was called Chicago). Other than those being a rip-off of the original Mac font names, I thought both cities were amusingly appropriate since Chicago was the home of gangsters and riots and Cairo is known for pickpockets, disease, and making bad deals with street vendors. Of course Microsoft hyped up Cairo (to developers) and promised the world (you should have heard the features they were ready to deliver "any day now") -- and then once they scared off the competitors it was never heard from again. I don't think it ever got beyond the campaign promise sort of marketing, and wonder if there was ever a line of code written. But enough with the vaporware.

So far, NeXT has done the most to actually deliver an OO-OS -- but Be is trying. NeXTSTEP is much richer and more finished -- Be is lighter and a little faster and more modern (but has the burden of C++). Sadly, I think the idea of an OO-OS is still ahead of its time (we've got a few more years to go). Java as a syntax is as good as Objective-C (only 10 years later); in fact, I think Java is actually a bit better -- and in a few years there will be enough there to actually use (for an OO-OS; there is already enough to use it for other things). But syntax is only part of the battle -- the framework matters more, and OpenStep is so much better than Java's various frameworks that Java is a pathetic shadow in comparison. I think Apple/NeXT again are in the best position to exploit Java and OOD with YB (YellowBox) -- but IBM isn't too far behind, and Sun makes Apple look humble and doesn't even realize what they need to do. (Sun brings NIH to new levels of arrogance.) Microsoft will wait until someone else proves things, then come in and copy and take credit for it all (like they are known for). So it will be interesting -- but all this is wandering off topic.

More Mac concepts

There were so many new concepts done on the Mac it was just astounding.

  • Not only was it very Object Oriented, but it was event driven. The program waited for events (for the user to do something) and then responded to them -- in an era where programs were usually mode or command driven. Now event loops weren't that new, I used them and thought of them myself before the Mac (and I wasn't that unique) -- but to have them be part of the OS, and work that way for everything (not just some game program), and to have so much detail to these loops and richness of events was astounding. The events were even standardized! Wow!
  • There were graphic libraries before Mac as well -- but nothing that came close to QuickDraw. I mean many had features -- but not as tightly integrated, small, fast, and so on. And most libraries weren't built into the OS but were bolt on extras (nonstandard).
  • Microcomputers before the Mac weren't always in bitmapped (graphics) mode to really use those graphic libraries. You usually worked in text mode (characters only) -- and only went to graphics for games, painting, or to show what something might look like on the printer.
  • WYSIWYG imaging existed before Macs in Xerox PARC Labs, in some test programs -- but it wasn't part of every application in an OS. It was the exception and not the way things worked -- until the Mac. WYSIWYG still works better on the original Mac than it does on PCs today (in many subtle ways).
  • Minicomputers had nice software abstraction layers between the Applications and the Hardware -- so that the hardware could change and evolve without breaking all the programs -- but the Mac was once again the first microcomputer to do it, and it did it amazingly well (better than many of those $100,000+ minicomputers). This is why when years later Apple came out with Color Macs, not only did most software still run, but some Applications actually took advantage of new features automatically!
  • Sound existed before the Mac -- but no machine had the rich digitized sound and sound libraries built into the machine, and it came with speech synthesis libraries and so much more.
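The event-driven model in the first bullet above looks roughly like this. It is a toy sketch with invented event types and handler names -- a real Mac program would block in a loop on the Event Manager (GetNextEvent/WaitNextEvent) rather than walk a list.

```python
from collections import namedtuple

# A standardized event record: every program sees the same shape.
Event = namedtuple("Event", ["what", "where"])

def run_event_loop(events, handlers):
    """Wait for events and dispatch each one on its type."""
    handled = []
    for event in events:                  # stand-in for blocking on the OS event queue
        handler = handlers.get(event.what, lambda e: "ignored")
        handled.append(handler(event))
    return handled

# The program declares how it responds; it never drives the user.
handlers = {
    "mouseDown": lambda e: f"click at {e.where}",
    "keyDown":   lambda e: f"key {e.where}",
}

queue = [Event("mouseDown", (10, 20)), Event("update", None)]
print(run_event_loop(queue, handlers))
```

The inversion is the point: instead of the program prompting for commands (mode driven), the user acts and the program reacts -- and because the event records were standardized by the OS, every application handled them the same way.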

There are so many software and programming changes to microcomputers made by the Mac that it is just astounding. Apple released more changes to the way people thought about programming and designing computers on that one day than Microsoft has in 25 years -- probably 100 times more. And I'm not just playing advocate -- I know both companies and what they've done.


When the PC finally got Windows it was like a really bad rip-off of the Mac. Andy Hertzfeld looked at the design and programming of Windows and said "it looks like the Mac Toolbox was hit by a Bizarro-ray!" Microsoft didn't just steal a few loose design elements from the Mac, they were stealing programming architecture (and implementing it really weirdly and poorly, and hacking it up). Apple innovated and created new concepts and furthered many more. They weren't using the same language as Xerox, and they didn't have anything vaguely the same in design. Microsoft stole the Mona Lisa (or Mona Mac), and then "improved it" with finger paint, spray paint, string art, and a little calves' blood splattered across it for effect. Which is bad enough -- but then to hear Windows users talk about the superiority of those improvements (usually from a position of supreme ignorance) is really frustrating.

I'm not saying that Microsoft didn't improve anything -- they did. But small improvements in the face of all that mindless vandalism is not an improvement over all. It was one step forward, ten steps backward, twenty to the side, and then running around in circles until you were dizzy and had to throw up!

There were so many changes to the paradigms when the Mac came out that some people claimed "it is hard to program Macs" -- which is a fallacy that exists to this day. Sure, if you were an old-thinker, the new ways of thinking were tricky. But if you wanted to do the same things the Mac was doing on a PC, it took 100 times more work. First you had to implement the millions of lines of code Apple had created for you, and so on. But they weren't doing that -- instead of graphics, they were spewing out text only -- instead of being event driven, they were using archaic modes and commands. You could still program backwards on the Mac as well (there have been standard I/O libraries on the Mac for a decade+), and it is just as easy to write bad code as on the other machines. But the Mac allowed new things, demanded them -- and people don't like change. The Mac was new, and brought so many improvements to microcomputer design and programming that it still astounds me just trying to edit out enough to fit things into a series of articles.

Yes, the Mac did borrow some interface concepts that Xerox had done (after Xerox had borrowed and furthered research in many areas). But Apple was out there furthering, improving and innovating on everything they touched -- many times they took elements that Xerox had furthered by a meter, and Apple furthered them by a mile -- and sometimes it was the other way. But Apple took us further than we had gone before. The Design and Programming of micros is one of those areas -- and the results are that a lot of Windows (and all GUI programming) looks a lot more like a bad version of the Mac, than it looks anything like Smalltalk, or the Star.

In a few years, that might change -- as Smalltalk-like concepts are finally catching on today. Smalltalk wasn't bad either, just way before its time (before the hardware, and even the software, was powerful enough). Apple's Dylan was basically an Object Oriented LISP (with Smalltalk roots). NewtonOS was a little Smalltalk-like -- and for a while almost became more so. IBM's VisualAge for Java is a marriage of a Smalltalk-like dynamic environment and Java syntax, and is pretty cool stuff. And YellowBox has the possibility of also becoming a more Smalltalk-like dynamic environment.

It took something like 10 years for Windows to get an Application Framework that was a fairly poor imitation of the multiple frameworks that already existed on the Mac. Now I hear Windows programmers talking about MFC like it is some great contribution to the world, and I don't know if I should laugh or cry at their ignorance. I hear them talk about some subtlety of programming on the PC as if it is "great" and a Microsoft gift -- and then they ridicule the Mac for not having it, when they should be thanking Apple for creating it a decade before Microsoft copied it (poorly). PC weenies talk about OOD and how the Mac is backwards, and they don't realize how far ahead the Mac was. It is often others' ignorance that makes them think that Mac programmers are zealots.

I know that Apple isn't perfect. Apple has sort of pissed away a gargantuan lead in concepts and technology (in many areas) -- but it is always easier to follow a path (and move faster) when someone else has blazed and marked the trail for you. In many areas other companies have furthered things beyond Apple -- but usually in very specific areas, while ignoring major areas, and ignoring the ramifications and subtleties that are so important. In other ways, legacy always holds companies back -- and Apple had more legacy in these areas than anyone else. Apple has also made some mistakes that have slowed them down -- but even most of those mistakes came from trying to do something new. OpenDoc, Taligent, and Bedrock were all risks that really tried to make a real difference to users and programmers -- I believe some were false starts or "learning opportunities" that will help us (and them) in the future. OpenDoc concepts brought forward to YellowBox could be great. WebObjects or YellowBox Apps running cross platform as commercial-quality Apps are possible (probable), and that is everything Bedrock promised (and more). OS X (w/YellowBox) and Taligent are basically the same thing (an Object Oriented Operating System -- built on UNIX). So all the developments have taken a little more time than expected and have had some false starts -- that's OK, they can still deliver, and still be innovative. What has Microsoft given us (innovation) besides Cairo (empty promises)?

Some people think that I (or others like me) are changing our tune all the time. I thought the Bedrock idea was a great idea. I thought that OpenDoc and Taligent were great ideas. Now I'm advocating Java and OS X / OpenStep as the right concepts (I'm more cautious about committing resources to ideas until people "show me the goods", but they are the right direction, even if they fail to be the right implementation). People don't realize that the ideas have stayed the same -- just the implementation keeps getting tuned. I still want the same things -- a more object oriented, cross platform, distributed-processing, MP-aware Operating System / Runtime Environment / RAD development tool, preferably with a more component-based architecture.

Innovation is done in spurts. The industry has been pretty stagnant for a while. The little nice evolutionary things are cute and important -- but not always to the level that I call "innovative" (in the large scheme). The real innovation may come again -- if not by Apple by someone else.

Sorry, BeOS people, but Be ain't it yet, so stop telling me it is. BeOS is faster and lighter, but there is little that I've seen that is truly innovative, even as a programming model. I can't think of anything it is doing that hasn't been done before -- though their MP support was close. If they were to fix the FBC (fragile base class) issue, reimplement everything in Java (compiled), make everything a true cross platform library (that could run on any machine), and make things run across the Internet (distributed processing) -- then I'd say Be is really innovative. For now they are just sort of the return of the Amiga -- a neat, fast, hacker's tool -- that may empower someone to create some new and unique solutions. But I have yet to see any truly new solutions come from Be, or anyone using Be, yet.

I think the seeds are sown for some real improvements in the programming model and the way computers are designed. Java. OpenStep (YellowBox / WebObjects). Dynamic Compilation. Fat-Apps (multiple binaries and embedded source). Ubiquitous MP and distributed processing. Document Centric. Cross Platform. Agents. Speech Recognition (as one very small element). Vector Processing (SIMD). Universal Scripting. New imaging. 2D + 3D performance. Networking. Streaming. There are a lot of potentials for some real changes -- and Apple is in a good position for all of them -- but no one is exploiting them well yet -- and not enough of them have come together into an integrated whole to change how we think. I'm hoping WWDC can blow my socks off, and that Mac OS X can do for computer programming/design what the Mac did 15+ years ago -- realistically I think legacy, market pragmatics, and many other things are going to rein in breakouts, and keep things very evolutionary for a while longer. Still, I see a lot of nice evolution potential in the Mac -- and I'm sorry, but more proprietary, half-finished, hacked-in features from Microsoft just don't do it for me.

Thanks to Bruce Horn for some helpful editing.

Created: 05/06/99
Updated: 11/09/02
