What is an OS
Sounds like a simple question -- but is it?

By: David K. Every
©Copyright 1999


I've been debating with some UNIX types of late -- and with them there is some serious debate about "What is an OS (Operating System)?" See, to many UNIX users/programmers it is only the kernel and maybe POSIX (the lowest levels of an OS) -- what the programmers see -- but that definition is way too narrow for me. Rather than argue the same thing a few hundred more times, I figured I'd just write an article, put it up once, and (hopefully) explain these things to a lot of people along the way.

The olde days - riding the iron

Back in ye olde ancient days of computing (1960s for big computers, 1970s for microcomputers), programmers had to do everything. There was a bunch of chips (or tubes), and programmers had to tell the computer EVERYTHING it needed to do a job. You had a processor that could run instructions, and individual support chips that could do things (like send characters to the screen, or read characters from a keyboard port, etc.), and you had to write little routines to talk to the keyboard chip, the display chip, the serial chip, and so on to do your most basic I/O (Input/Output) functions. All the lowest-level code (the stuff that talked to the support chips) was hugely redundant, since everyone who was writing a program had to do the exact same things all the time.

Coding to the chips directly is known (in technical slang) as "riding on the iron" -- you were talking to the hardware (iron) directly. This makes you very dependent on each chip chosen for use in a computer, and the computer maker can't change chip models, since doing so is likely to break many programs (that were counting on the old chip).
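
To make that concrete, here's a minimal sketch (in C, with entirely made-up register addresses and status bits -- every real chip had its own) of what riding the iron looked like. Every program that wanted to print a character had to carry something like this around:

    /* "Riding the iron": talking to a (hypothetical) serial chip directly
       through memory-mapped registers. Addresses and bit meanings are
       invented for illustration. */
    #define UART_STATUS ((volatile unsigned char *)0xF0000000) /* hypothetical */
    #define UART_DATA   ((volatile unsigned char *)0xF0000001) /* hypothetical */
    #define TX_READY    0x01   /* "ready to transmit" bit -- also hypothetical */

    void putc_raw(char c)
    {
        while ((*UART_STATUS & TX_READY) == 0)
            ;                           /* spin until the chip can take a byte */
        *UART_DATA = (unsigned char)c;  /* hand the byte to the chip */
    }

Change the chip, and every program carrying code like this breaks.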

Drivers

Fortunately time marches on, and computers (and software) progressed. Computer manufacturers figured out that it was stupid to make everyone write the same code (or add the same code to all their programs) -- so they made hardware abstraction layers (very primitive ones by today's standards) or libraries. These chunks of code (routines) do the most common things that most programmers needed these chips to do for them. Instead of writing all that code themselves, programmers could ask the "driver" to do things (drive the chip for them) and give them the results. This saved a lot of work.

It also had some other benefits. If the computer maker wrote the driver well enough, they could change the model of chip they were using to do something. They just updated their driver to talk to the new chip, and all the old programs would still work. So drivers "abstracted" the hardware (put a layer between it and most programmers), and allowed for more diversity in computer design, and more evolution in hardware.
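
A minimal sketch of the driver idea, in C (the names are illustrative, not any real system's): programs call through a table of routines instead of touching the chip, so swapping the table swaps the hardware without touching the programs:

    #include <stdio.h>

    /* Programs call through this table of routines instead of poking
       the chip. Swap in a different table, and the same programs drive
       different hardware. */
    struct char_driver {
        void (*put_char)(char c);
    };

    /* One "chip" implementation -- faked here with the C library. */
    static void chip_a_put_char(char c) { putchar(c); }

    static struct char_driver console = { chip_a_put_char };

    int main(void)
    {
        const char *msg = "hello via the driver\n";
        while (*msg)
            console.put_char(*msg++);   /* the program never touches the chip */
        return 0;
    }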

BIOS

Soon after a few drivers were written, they were collected into a group that did all the basics. In microcomputers this got called the BIOS (Basic Input/Output System). There are other terms, but basically this was just all the basic routines that a programmer (and programs) would need to talk to the hardware.

In the early days, the BIOS was the "Operating System" -- it was a bunch of code that let programmers talk to the hardware (and made the hardware more usable). If all your "users" were programmers (and back then they were), then that made sense -- that was the proper way they operated a computer (through programs they wrote).

Command Lines / Shells

Of course, things matured. Eventually not all users had to be programmers, and there were many common things that everyone wanted to do with a computer that went beyond just calling an I/O chip to figure out the last key pressed, or to spew text onto the screen (or line printer). Many people wanted to write files (streams of data) to storage devices, and get them back (for their programs). And once they wrote Applications (programs) that would do all sorts of things (and create lots of files), they needed to manage the files. So programmers created "command lines" (or the CLI, command-line interface), which would allow users to enter commands to tell the computer what to do -- like copy a file, move a file, delete a file, print a file, or execute a program.

Before command lines, running a program was either automatic (you booted the computer into your program), or you manually loaded the program with another small program (called a bootstrap), and it ran.

Of course programmers could also call these higher-level routines to do things -- so instead of talking to a driver, to talk to a chip, to tell a device (like a disk or tape drive) to write a single character at position X of tape Y, the programmers talked to these higher-level routines to tell the OS to write a whole file to the disk. This was much easier for programmers.
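
Here's a rough sketch of what that buys you, using C's standard I/O as a stand-in for those higher-level filing routines:

    #include <stdio.h>

    /* Instead of poking a disk controller byte by byte, the program asks
       the OS's filing routines to handle the whole file. */
    int main(void)
    {
        FILE *f = fopen("example.txt", "w");  /* OS finds space, tracks blocks */
        if (f == NULL)
            return 1;
        fputs("the OS handles the device details\n", f);
        fclose(f);                            /* OS flushes and updates the disk */
        return 0;
    }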

The CLI allowed users to type simple commands to run programs or to manage files. But people always want more. Eventually people added scripting to the CLI, so that you could automate many of these commands into one "macro" (a series of commands tied to a single name). This improved the more tedious parts of operating a CLI (you could change cryptic commands to suit your needs, and so on). The command line is a "shell" -- an interface for users (and many programmers) to operate a computer. Most users think of the shell as the OS (or part of the OS), since it is the way they operate the system.
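
At heart a CLI shell is a simple loop: read a command, dispatch it. A toy sketch in C (the command names here are made up; real shells just had many more of them, with many more options):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char line[256];
        for (;;) {
            printf("> ");
            if (fgets(line, sizeof line, stdin) == NULL)
                break;
            line[strcspn(line, "\n")] = '\0';      /* strip the newline */
            if (strcmp(line, "exit") == 0)
                break;
            else if (strncmp(line, "echo ", 5) == 0)
                puts(line + 5);                    /* print the argument */
            else if (strncmp(line, "del ", 4) == 0)
                remove(line + 4);                  /* delete the named file */
            else
                printf("unknown command: %s\n", line);
        }
        return 0;
    }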

Depending on the implementation, the programmer's interface and the user's interface can be two separate things -- the programmer's side being called the Filing System and the user part being called the shell -- but they are not always clearly separated (often they are different views into the same thing). Usually the user interface side (the CLI shell) is stacked on, and dependent on, the Filing System -- just like the Filing System is dependent on the lower-level drivers, and so on.

Because of this sometimes-separation, some UNIX types do not consider the shell part of the OS (but they do usually include the programmer's side -- the filing system). They claim that only the programmer's interfaces count (not the user's interface) -- since the implementation of the CLI can change slightly (or you can write your OWN CLI on top of the Filing System if you want). I think that is a pretty silly differentiation, since you could write your own filing system if you wanted as well -- and many do.

The predefined code that allows a person (programmer or user) to operate a computer is part of the OS. It all does the same thing -- it allows people to input and output things from the computer more easily, and it comes with the computer (or OS) to facilitate that input/output. Many programs and OS's REQUIRE a shell to operate, since they talk to the shell to get results they need (or to create files, or automate things, and so on) -- so even from a programmer's point of view, a shell and filing system can often be intertwined or interdependent -- and they are just collections of routines that sit between the programmer and the hardware. (On the Mac a programmer can ask the filing system to do low-level things, like open a file and read a character, or they can talk to the Finder directly and ask it to do higher-level things, like duplicate, print, or delete a file or folder.)
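
A small sketch of those two levels, in C (using a Unix-style shell command as the stand-in for the "higher level" -- on the Mac the analogue would be asking the Finder):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* Low level: call the filing system ourselves, one character at a time. */
        FILE *f = fopen("example.txt", "r");
        if (f != NULL) {
            int c = fgetc(f);
            printf("first byte code: %d\n", c);
            fclose(f);
        }

        /* Higher level: hand the whole job to the shell. */
        system("cp example.txt example-copy.txt");
        return 0;
    }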

GUI's (Graphical User Interfaces)

Of course command lines are still programmer-centric, geeky ways to access the computer. You had to remember a huge list of commands, and symbols to modify the commands, and you typed a lot to tell the computer what to do. Every command line was different (for different computers), so you often had to remember hundreds of commands, and variants, in many different flavors (for different machines). Because of the complexity of the commands you could do some nice things -- like copy all files whose names contained the letter 'J' to another directory, and so on -- but usually it was a lot of work for not much result, or you spent a lot of time writing your own commands (scripts) to make it more powerful.

Evolution progressed, and people learned that while command lines are better than nothing (and OK for programmers and some things), there was an easier way for most people. We borrowed concepts of Human Interface (Man-Machine Interface) by creating "metaphors", giving people graphic representations of files (icons), and creating a direct-manipulation interface. It made it far easier to teach people to use computers -- and more people could use them more efficiently. GUIs allowed you to present windows on the screen, and have menus of commands (so you didn't have to have them all memorized), and so on. Far more people have accepted computers and use them today because of GUIs and improvements in Human Interface.

Creating a Graphical User Interface meant a huge number of libraries (routines) for programmers, and a GUI shell (the Application users use to manipulate files -- like the "Finder" on the Mac). Most programmers consider the GUI APIs (Application Programming Interfaces) part of the OS, and most users consider the GUI shell (Finder) part of the OS as well. After all, they do the same things as an OS -- they sit between the user (or programmer) and the hardware, and they make the computer easier to use and extend its functionality. Most programs require the graphics routines, or the shell itself, to send messages to the user, get results back, and do actions. Most users require a GUI shell. It is all the same thing -- prebuilt functionality added to your computer to make it easier to interface with (either as a programmer or user).
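
A sketch of what programming against a GUI looks like: instead of polling the keyboard chip, the program asks the OS for "events" and reacts. The event routine below is a stub so the sketch actually runs -- real GUI APIs (the Mac Toolbox, Win32, etc.) supply their own versions of this idea under their own names:

    #include <stdio.h>

    enum event_kind { EV_NONE, EV_KEY, EV_QUIT };

    struct event { enum event_kind kind; char key; };

    /* Stub standing in for the OS's event routine: it fakes three
       keystrokes, then a quit. */
    static struct event get_next_event(void)
    {
        static int n = 0;
        struct event e = { EV_NONE, 0 };
        if (n < 3) { e.kind = EV_KEY; e.key = (char)('a' + n); }
        else       { e.kind = EV_QUIT; }
        n++;
        return e;
    }

    int main(void)
    {
        for (;;) {
            struct event e = get_next_event();  /* OS decodes the hardware */
            if (e.kind == EV_QUIT)
                break;
            if (e.kind == EV_KEY)
                printf("key pressed: %c\n", e.key);  /* app reacts to events */
        }
        return 0;
    }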

Still, some narrow-minded programmers don't want to consider anyone else's way of interfacing with the computer part of the OS. So to some programmers, only the APIs are the OS. Some low-level programmers want to go further and call only the lowest-level routines (the BIOS) the "OS", and try to exclude all that high-level "shell" stuff as "not part of the OS". Users, of course, think of the shell (graphical or command line) as the OS, because it is their interface into the machine, and forget about all that programmer stuff. It is naive to be exclusionary and assume that ONLY one level of interface is the OS -- the OS is the collection of all those things (user, programmer, and low-level programmer interfaces).

Kernels

Now as computers progressed, people wanted to do more than one thing at a time. So they created ways to "schedule" multiple things. If you give two programs a small amount of time each, very quickly (say each program gets 1/30th of a second, then control is given to the other program for 1/30th of a second), it will seem to users that both programs are running at the same time. This is called "time-slicing", and is a common way to do "multitasking". Just give small slices to many different things, and they will all seem to run at once. Since users operate so much more slowly than computers, it is easy for computers to be doing lots of things and still respond to your actions. These scheduling routines are called, appropriately, "schedulers".
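
A toy round-robin scheduler in C shows the shape of the idea (a real scheduler uses a timer interrupt to force the switch every slice; here each "task" politely returns on its own):

    #include <stdio.h>

    /* Two "tasks": each call is one time slice. */
    static void task_a(void) { printf("A gets a slice\n"); }
    static void task_b(void) { printf("B gets a slice\n"); }

    int main(void)
    {
        void (*tasks[])(void) = { task_a, task_b };
        int i, round;
        for (round = 0; round < 3; round++)  /* three rounds of slices */
            for (i = 0; i < 2; i++)
                tasks[i]();                  /* next task gets its turn */
        return 0;
    }

Interleave the slices fast enough, and both tasks appear to run at once.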

Now, as long as you have all these processes/tasks (things) running at the same time, you often want them to talk to each other. This way you can divide a program into multiple parts, and have all the parts doing something and cooperating (communicating) to achieve an end result -- very powerful stuff (it allows for more powerful programs, with things localized so multiple people can program their own section at once, and so on). This communication is one form of messaging, and is often called IPC (Inter-Process Communication).
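
Here's a minimal sketch of one classic IPC mechanism, a pipe between a parent and child process (this uses POSIX calls, so it won't run everywhere):

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int fd[2];
        char buf[32];

        if (pipe(fd) != 0)          /* ask the OS for a one-way channel */
            return 1;
        if (fork() == 0) {          /* child process */
            read(fd[0], buf, sizeof buf);
            printf("child got: %s\n", buf);
            return 0;
        }
        write(fd[1], "hello", 6);   /* parent sends a message */
        return 0;
    }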

A slight problem with having multiple programs running at once is that one bad program can write outside its own memory, and step on another program (break it). So we need to protect against that. So people wrote memory protection -- which guards one App from the others, by only letting each program see its own memory (and no one else's).
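
The check itself is simple in concept -- something like this sketch, which simulates in plain C what the memory-management hardware (with the OS's help) does on every access:

    #include <stdio.h>

    /* One program's allowed range of memory addresses. */
    struct region { unsigned long start, end; };

    static int allowed(struct region r, unsigned long addr)
    {
        return addr >= r.start && addr < r.end;
    }

    int main(void)
    {
        struct region prog_a = { 0x1000, 0x2000 };
        printf("0x1800 ok? %d\n", allowed(prog_a, 0x1800)); /* 1: inside  */
        printf("0x3000 ok? %d\n", allowed(prog_a, 0x3000)); /* 0: trapped */
        return 0;
    }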

These three things (Scheduling, Messaging, Protection) are usually the basic elements of any kernel (core) of an Operating System. So they got "broken out" in name and functionality, and made the core (kernel) of an OS. The concepts weren't all developed at the same time, but they make sense as the "basics".

Some narrow-minded low-level programmers want to call only the Kernel the OS. Most programmers call ALL levels of the OS the OS (anything they "call" to get work done). It has to do with perception. Of course Kernels were created long after "OS's" were -- so how you can call only the Kernel the OS is beyond me -- but some do, or think that way.

Layering

In fact, as OS's added more and more functionality, and did more and more things, it made sense to layer the parts on top of one another. So the lowest layer is the kernel, and above that you have file I/O, device access, and drivers (the older BIOS-type stuff), and above that you may have graphics libraries and GUI services, and so on. You stack all these things on top of one another, until you have everything you need to write programs for that computer.
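
The layering shows up directly in code -- each layer calls only the one below it. A sketch (the names are illustrative, not any real OS's entry points):

    #include <stdio.h>

    static void driver_write(char c)     { putchar(c); }    /* driver/BIOS layer */
    static void fs_write(const char *s)  {                  /* file I/O layer    */
        while (*s)
            driver_write(*s++);
    }
    static void app_save(void)           {                  /* application       */
        fs_write("saved!\n");
    }

    int main(void)
    {
        app_save();   /* app -> filing layer -> driver -> hardware */
        return 0;
    }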

More services got added on top. For doing speech synthesis or speech recognition, some OS's (like the Mac OS) come with libraries. For doing multimedia (graphics, sounds, video, time-based events) you have multimedia libraries -- like QuickTime. For doing 3D you have 3D libraries (like OpenGL or QuickDraw 3D). For networking you may have OpenTransport (or other network stacks). And the list of services goes on.

Conclusion

As you can imagine, time marches on, and we keep adding more and more layers to Operating Systems, and putting more in each layer. We keep adding on top, adding higher and higher level functionality.

Many people want to call only certain things the OS: just the programmer's interface, or just the user's interface. Some programmers go further -- they only want to call some level of programming interface "the OS", and exclude everything else. A common one is to say that QuickTime is not part of the OS -- yet many programs require it to actually run, and it comes with the Operating System you install (if you buy a Mac) -- and many apps require it or a similar library to be installed (like DirectShow on Windows, or what used to be called VfW, Video for Windows) before they can run. These libraries do the same thing as other parts of the OS -- allow for easier input/output of different types of data, to interface with the user or programmer. So in my definition, all the required libraries that come with an OS, and that ease the interface between user/programmer and hardware, are part of the OS.

UNIX people do have some valid points. They get annoyed, and ask "where do you stop?" Macs come with many control panels to control different things, and desk accessories, and even Applications and Utilities. You have drive-formatting utilities, text editing, calculators, and Post-It notes (Stickies). Also there are help systems and QuickTime (with the ability to play movies). Now even a web browser and email are included with Operating Systems -- are they part of the OS too? Microsoft goes further and tries to force (encourage through illegal product tying) companies to put Microsoft Office in the same box (always) as the OS. Does that mean that Office is becoming part of the OS? It gets ugly. But I think that if, in the future, using your computer (as both user and programmer) REQUIRES Microsoft Office -- many Applications need it to run, most files need it to be accessed, it comes with the Operating System (and is installed), and everyone is dependent on it -- then at some point, even Microsoft Office (or similar functionality) will be part of the OS.

This argument actually helps Microsoft in some of its Justice Department lawsuits, since they claim that the browser is part of the OS -- and like it or not, that will be true some day. Apple already has an HTML-based help system, and NeXTSTEP came with a dictionary and an email system standard nearly 10 years ago. In that one case, Microsoft is right -- the OS will grow to include many things that today we call "Productivity Apps" -- just as it has grown to consume more and more things that programmers used to have to write themselves.

It gets gray and confusing, and there is a bit of a philosophical war going on. UNIX only has standards up to a certain level (pretty low-level compared to other OS's) -- and many more things are "add-ons" and nonstandard as compared to other OS's -- so everything above their "defined" level (POSIX) they don't always want to call the "OS". You get comments like "Oh, those are just services you add on top", and bias like that -- it can get ugly to define anything, because people want to have religious wars over stupid trivialities.

What is worse is when laypeople have heard that something is the OS and something isn't, and they argue without understanding. That's what this article is for: a little understanding. Hopefully at least now you understand what the "What is an OS" debate is all about.


Created: 01/24/98
Updated: 11/09/02

