The history of Linux properly starts in 1984. In that year, Richard Stallman, a programmer working at the Massachusetts Institute of Technology (MIT), received a new version of the UNIX flavor his lab was using. In contrast to previous versions, this one did not come with the source code of the operating system, and the source could not be obtained separately without signing a Non-Disclosure Agreement (NDA). Stallman was therefore unable to implement an additional feature which his users had come to like.
Richard Stallman became so upset with these developments that he vowed to write a new UNIX-like operating system from scratch. That new operating system was to be free (as in free speech): everybody would have the right to use and adapt the software for their own needs, and to distribute it to others. This project was called GNU, which stands for GNU's Not Unix. To fund the GNU project and to advocate the use of free software in general, the Free Software Foundation (FSF) was founded.
The first steps taken by the GNU project were to re-implement various essential utilities of a UNIX operating system. Although hundreds of little tools were written, four stand out:
- The GNU C compiler (gcc), which was essential for compiling all software, including the kernel and the C compiler itself.
- The GNU C library (glibc), which implements a large set of standardized system calls.
- emacs, a full-featured, world-class editor which can be extended into a sort of application development environment.
- bash (Bourne Again Shell), a command interpreter and programming environment.
Having a shell is essential on a UNIX system, since the shell interprets and executes the commands you type. Later on, the GNU project also started development of a UNIX-like kernel, called Hurd.
This kernel has never been important for Linux, however, since it was first released at the end of the 1990s, when Linux was already thriving.
In 1991, Linus Torvalds, a student at the University of Helsinki, started a small research project into the workings of the Intel 80386 processor, which was state-of-the-art at the time. He was interested in exploring a feature that was new to Intel's processor line: a Memory Management Unit (MMU). The MMU offers hardware support for running multiple processes simultaneously, each in its own protected area of memory. With an MMU, a process cannot access memory areas owned by other processes, which effectively means that if one process crashes, it cannot take the whole system down with it.
The operating systems that were available for the 386 (Windows for Workgroups and Minix) did not use this feature and were therefore very prone to crashing.
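This kind of memory protection can be demonstrated on any modern Linux system. The following is only an illustrative sketch (in Python, not period-accurate code): a child process writes through an invalid pointer and is killed by the kernel, while the parent process continues unaffected.

```python
# Illustrative sketch: with MMU-backed memory protection, a crashing
# process cannot take other processes down with it. The child below
# writes through an invalid pointer and is killed by the kernel;
# the parent carries on unharmed.
import subprocess
import sys

# Child: trigger a memory-protection fault by writing to address 0.
CHILD_CODE = """
import ctypes
ctypes.memset(0, 0, 1)  # write to address 0 -> the kernel kills this process
"""

result = subprocess.run([sys.executable, "-c", CHILD_CODE])
print("child exit status:", result.returncode)  # non-zero: the child crashed
print("parent still running")                   # isolation: we were unaffected
```

Without an MMU, that stray write could have landed in memory belonging to another process, or to the operating system itself.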
Linus started out writing three small programs:
- A small program which continuously printed the letter A on the screen.
- A small program which continuously printed the letter B on the screen.
- A slightly larger program which switched the processor to "protected mode" and scheduled the other two programs to take turns.
When Linus finally managed to see the output of both programs on his screen, in turn (ABABABAB...), he knew he had the beginnings of a kernel of a multitasking operating system.
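The idea behind that experiment can be caricatured in a few lines of Python (purely illustrative; the real thing was 386 code using hardware task switching): two cooperative tasks each print their letter, and a tiny round-robin scheduler switches between them.

```python
# A toy round-robin scheduler: two cooperative "tasks" take turns,
# each printing its letter once per time slice -- the ABAB... pattern.
def task(letter):
    while True:
        print(letter, end="")
        yield            # give the processor back to the scheduler

def schedule(tasks, slices):
    for _ in range(slices):
        for t in tasks:  # round-robin: every task gets one slice per round
            next(t)

schedule([task("A"), task("B")], slices=4)
print()  # prints ABABABAB
```

Here the tasks yield control voluntarily; the essential extra step in Linus's version was that the scheduler could forcibly take the processor away, which is what makes true multitasking possible.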
Linus continued to improve and refine the kernel, and at the end of 1991, he was able to run the GNU C compiler and the Bash shell under his kernel, which by then was dubbed Linux, for Linus' UNIX.
Linus then decided to upload his kernel to the internet (which at the time was still largely an academic network) for others to use:
From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki
Hello everybody out there using minix -
I'm doing a (free) operating system (just a hobby, won't be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since April, and is starting to get ready. I'd like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things). I've currently ported bash(1.08) and gcc(1.40), and
things seem to work. This implies that I'll get something practical within a
few months, and I'd like to know what features most people would want. Any
suggestions are welcome, but I won't promise I'll implement them :-)
PS. Yes - it's free of any minix code, and it has a multi-threaded fs.
It is NOT portable (uses 386 task switching etc), and it probably never
will support anything other than AT-harddisks, as that's
all I have :-(.
Other people on the internet started picking up this software, started using it, and refined it. Their patches were sent to Linus, who incorporated them into the main kernel tree. Starting to use Linux was a major undertaking, however. The Linux kernel, the C compiler, the shell and all the other tools you need to make a complete operating system were all distributed in source code form. Before you could make use of them, you had to compile them, which required a C compiler, which itself also needed to be compiled first... To break through this vicious circle, people started creating "distributions", which contain a precompiled kernel, a precompiled C compiler, various precompiled tools and some sort of installation program. All this is stored in a convenient format for installation, such as the CD-ROM images used today.
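That patch-based workflow still works today with the classic diff and patch tools. The sketch below is a made-up example (the file names and contents are invented for illustration): a contributor changes a copy of a source tree, produces a unified diff, and the maintainer applies it to his pristine tree.

```shell
# Create a "pristine" tree and a hacked copy (invented example files).
mkdir -p linux-orig linux-hacked
echo 'printk("hello");' > linux-orig/init.c
echo 'printk("hello, world");' > linux-hacked/init.c   # our improvement

# Produce a unified diff -- this is what a contributor would mail in.
# (diff exits non-zero when the files differ, hence the "|| true".)
diff -u linux-orig/init.c linux-hacked/init.c > init.patch || true

# The maintainer applies the patch to his own copy of the tree.
patch linux-orig/init.c < init.patch
cat linux-orig/init.c    # now contains the improved line
```

Because a unified diff contains context around each change, the maintainer can usually apply it even when his tree has drifted slightly from the contributor's copy.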
To understand what's so special about Linux, it is necessary to quickly look at international copyright laws. The principle of copyright is very simple: When an author creates a unique piece of work, such as a computer program, then he is the owner of all rights to that piece of work. He may decide what others can and cannot do with it.
What others may do with that piece of work is usually written down in a License Statement, a contract between the creator and the user which describes the rights that the user has. These rights may be granted for free, but in most cases the user has to pay for them.
A typical license in the world of computer software entitles the user to run the binary program on the number of machines that the license was purchased for. The user is not allowed to make more copies of the software than are needed to run it, plus one extra backup copy.
Furthermore, the user cannot claim any rights to the source code and is not allowed to disassemble the binary code to learn and/or alter its inner workings. In short, a typical copyright statement does not give you the right to copy.
In contrast, the GNU General Public License, or GPL for short, turns this around. The aim of the GPL is to keep all software "free", so that everybody can adapt the software to their own needs, without being dependent on the goodwill of the author. This means that any piece of software that has been placed under the GPL by the original author gives the user the following rights:
- The user may copy the (binary) software as often as he or she wishes.
- The user has the right to obtain the source code.
- The user has the right to alter the source code and recompile the source code into binary form.
- The user may distribute the sources and the binaries.
- The user may charge money for all of this.
Basically, the only restriction that the GPL imposes is that the license statement may not be changed. This means that all your customers have the same rights to the software as you do. As a practical aside, this means that it is generally impossible to make money from selling the software itself (apart from a nominal fee for media and distribution).
The GPL is the most-often used license statement in the Linux world, but other open source licenses, such as the BSD license, are also being used.
The effects of this license model are far reaching:
The first effect is that, since everybody has access to the source code, everybody interested can improve the code, or add new features. This means that software development is very rapid, with potentially hundreds of developers working on the same piece of code. People in the Linux community understand the inherent risk of a code fork with a development model like this, and a lot of effort is spent in coordinating the work of various developers. This usually comes down to two things:
- A volunteer or group of volunteers who take up the coordination of the development. Linus Torvalds, for instance, hardly writes any code anymore, but spends most of his time coordinating others who write code for the kernel. Other people coordinate the development of other programs.
- Some sort of automated support for tracking and integrating the contributions of developers. Most often, CVS (the Concurrent Versions System) is used.
As an example, the sourceforge.net website hosts thousands of projects, all of which are managed using CVS.
The second effect of having the source available is that peer reviews are possible. It is easy for people to look through the code and
identify any performance or security problems.
In fact, there is currently a "Linux Janitor" project underway which aims at auditing the Linux code automatically, searching for typical, well-known programming errors. This is most likely the first time that large-scale white-box testing is being automated.
The third effect of the license model is that if you make any changes or add a feature, then that feature is owned by you, and not by the original author of the software. This means that your name (as part of the copyright statement for that feature) stays in the source forever. This is usually a great motivating factor for people.
For a large number of people, Linux is not just another operating system; it has become a way of life. It is something they believe in, and they want to express that belief. Nowadays, Linux supports a wide range of hardware.
Free software for the free world :)