Internet Technical Bookshop

Books and Resources for the Sciences and the Arts

History and Importance of C

In the Scribend Programming as a Second Career - Embedded Systems and the Internet of Things I pointed out that C is a very important programming language for implementing IoT and embedded systems applications, and that mastery of C has great career potential. Although C and Unix were developed in a corporate environment, the culture in which they arose was very much "hackerish in spirit". There is much interest nowadays in the Internet of Things (IoT). Most of the small gadgets and widgets that populate the IoT space are embedded systems. Unlike workstations and the larger server machines that run things such as databases and web servers, embedded systems have small on-board processors called microcontrollers, and relatively small amounts of memory (tens to hundreds of kilobytes (KB)). By and large the code that runs on them is written in the C programming language, which is why learning C can be the start of a very interesting programming career.

C was originally developed by Dennis Ritchie between 1969 and 1973 at Bell Labs, and was used to re-implement the Unix operating system. It was designed as a language that was much easier to code in than assembler, and yet one that was "quite close to the hardware" that would run the compiled code. It has since become one of the most widely used programming languages of all time, a testimony to the "genius quality" of the research community at AT&T Bell Labs at that time. The original Unix at Bell Labs was written in assembler, and writing code in assembler, for most of us, is a painful and difficult process. If you want to get a taste of writing code in assembler you might take a look at the book Little Man Computer Programming: For The Perplexed From The Ground Up, which I wrote to help teachers using it in A Level Computer Science courses, and also because I used it in a distance learning Introduction to Computing course I taught for UMUC (University of Maryland University College). The whole point of C was that migrating Unix from code written entirely in assembler to code written mostly in C, with a small amount of assembler for the hardware-dependent parts, resulted in a system that was both easier to read and more easily ported to other machines. The last bit was not entirely trivial, as you needed a compiler to compile the C code into the target machine's assembler, from which it could be converted into machine code instructions. And this is the amazing part: the people at Bell Labs developed not only a C compiler, but also tools for building compilers, such as Lex (Lexical Analyzer - Mike Lesk) and YACC (Yet Another Compiler Compiler - Stephen C. Johnson). Truly amazing teamwork. But then, what else would you expect from a research lab that invented the transistor and discovered the cosmic background radiation that provided evidence in support of the big bang theory?

The UK can actually claim some credit for the C language, as many of the key ideas of C stemmed from the language BCPL, developed by Martin Richards, who studied mathematics as an undergraduate at the University of Cambridge and then took the Cambridge Diploma in Computer Science. His PhD was on programming language design and implementation. The influence of BCPL on C was in fact via a language called B, which was written by Ken Thompson in 1970 at Bell Labs for the first Unix system, which ran on a DEC PDP-7. Interestingly, the original BBC Microcomputer could run BCPL as an extra language, in addition to BBC Basic.

Bell Labs was part of a large telecoms corporation that was later broken up into a number of smaller companies referred to as "Baby Bells". Briefly, the Regional Bell Operating Companies (RBOCs), nicknamed Baby Bells, were the result of United States v. AT&T, the U.S. Department of Justice antitrust suit against the American Telephone & Telegraph Company (later known as AT&T Corp.). On January 8, 1982, AT&T Corp. settled the suit and agreed to divest its local exchange service operating companies. Effective January 1, 1984, AT&T Corp.'s local operations were split into seven independent Regional Bell Operating Companies, the Baby Bells. Before the suit AT&T was not allowed to develop a software business. The legal context was that, under a 1956 consent decree settling an earlier antitrust case, the Bell System (the parent organization of Bell Labs) was forbidden from entering any business other than "common carrier communications services", and was required to license any patents it held upon request. At a symposium on operating systems organised by Purdue University in 1973, Ken Thompson and Dennis Ritchie presented a paper on Unix. Professor Bob Fabry of the University of California, Berkeley asked Thompson and Ritchie if he could have a copy of their operating system, and they were happy to oblige. In January 1974 a tape containing Unix Version 4 arrived at Berkeley. A lot of early Unix systems ran on DEC (Digital Equipment Corporation) PDP-11s. Getting the system at Berkeley to work reliably required a good deal of cooperation between Berkeley and AT&T, including, for example, Ken Thompson doing some remote debugging over a (slow) modem line from the East Coast of the USA, where Bell Labs was located.

Although AT&T Bell could not market Unix, people at Berkeley did. The first Berkeley Software Distribution (BSD) appeared in 1977. It was made up of the Unix operating system and a Pascal system, perfect for undergraduate programming courses. Version 2 of BSD followed shortly afterwards (1978) and included "goodies" such as termcap (a universal screen driver) and the editor vi. However, the Unix operating system was the property of AT&T, and all BSD users had to acquire a license for it from AT&T. BSD's widespread adoption within the university system did much to cement the popularity of Unix, but a decision made by DARPA (the Defense Advanced Research Projects Agency, an agency within the US Department of Defense) was to contribute further to that popularity. Another string to Unix's bow was the rise of networking, in the guise of the DARPA project to develop a computer network to replace the original ARPANET. For practical reasons a decision was made to standardise on a single operating system, and the one chosen was Unix. In 1983 the US Department of Defense split the ARPANET in two: MILNET for military sites, and ARPANET for civilian research sites. The ARPANET was the precursor of today's internet. In 1986 the IETF (Internet Engineering Task Force) was established, and in 1987 the ARPANET began moving over to private civilian networks, and thus the internet was born. Concurrently, TCP/IP had been evolving as a networking protocol. Designed in 1974 by Vinton G. Cerf and Robert E. Kahn, and maturing into IPv4 by 1979, TCP/IP was supplied to universities as part of BSD Unix; it was adopted in parts of the original ARPANET from 1980 onwards and became required for all computers connected to the ARPANET in 1983.

Although, until the divestiture, AT&T was prevented by the consent decree from developing Unix commercially, it could, and did, license the code to other organizations and corporations. In the mid-1980s AT&T did enter the commercial market, and this led to System V. Modern Unixes, and also Linux, incorporate features and tools from both BSD Unix and System V Unix. From the mid 1980s to the early 1990s Unix powered workstations such as those from Sun (whose flavour of Unix was called Solaris), Silicon Graphics (SGI), with a version of Unix running on their MIPS machines (called IRIX), IBM (whose Unix was called AIX), and Hewlett Packard (HP-UX). As well as workstations running Unix there were Unix server machines, such as those from IBM, HP and Sun, to which multiple terminals could be connected. Workstations running Unix could also connect to these servers over Ethernet (in those days running over coax), or, in the case of IBM, over Token Ring. During the early and mid 80s personal computers running MS-DOS grew in importance. Microsoft itself, founded in 1975, was a small hobbyist-oriented business selling a BASIC interpreter for machines such as the Altair 8800. Microsoft's first operating system, released in 1980, was a variant of Unix acquired from AT&T through a distribution license, which Microsoft called Xenix. The porting of Xenix to various platforms was carried out by the Santa Cruz Operation (SCO). The first version of Microsoft's word processor, Microsoft Word, ran on Xenix. Then Microsoft developed the Windows operating system and became very successful indeed. Microsoft's foray into C compilers began with the Lattice C compiler in 1983, followed in 1985 by Microsoft's own C compiler, to which Windows support was added in 1992, and which was packaged with Visual C++ (1993).

The first truly open source C compiler, without which the development of Linux would probably not have occurred, was the GNU C compiler. Its "birth" was announced by Richard Stallman in the following (edited) email:
"Date: Sun, 22 Mar 87 10:56:56 EST From: rms (Richard M. Stallman)
The GNU C compiler is now available for ftp from the file /u2/emacs/gcc.tar on This includes machine descriptions for vax and sun, 60 pages of documentation on writing machine descriptions (internals.texinfo, internals.dvi and Info file internals).
This also contains the ANSI standard (Nov 86) C preprocessor and 30 pages of reference manual for it.
This compiler compiles itself correctly on the 68020 and did so recently on the vax. It recently compiled Emacs correctly on the 68020, and has also compiled tex-in-C and Kyoto Common Lisp. However, it probably still has numerous bugs that I hope you will find for me.
I will be away for a month, so bugs reported now will not be handled until then ... Free Software Foundation 1000 Mass Ave Cambridge, MA 02138"

Richard Stallman had tried to obtain the details of the Free University Compiler Kit (developed by Andrew Tanenbaum and colleagues) but was, rather brusquely, rebuffed, as he noted:
" Shortly before beginning the GNU project, I heard about the Free University Compiler Kit, also known as VUCK. (The Dutch word for "free" is written with a V.) This was a compiler designed to handle multiple languages, including C and Pascal, and to support multiple target machines. I wrote to its author asking if GNU could use it.
He responded derisively, stating that the university was free but the compiler was not. I therefore decided that my first program for the GNU project would be a multi-language, multi-platform compiler.
Hoping to avoid the need to write the whole compiler myself, I obtained the source code for the Pastel compiler, which was a multi-platform compiler developed at Lawrence Livermore Lab. It supported, and was written in, an extended version of Pascal, designed to be a system-programming language. I added a C front end, and began porting it to the Motorola 68000 computer. But I had to give that up when I discovered that the compiler needed many megabytes of stack space, and the available 68000 Unix system would only allow 64k.
I then realized that the Pastel compiler functioned by parsing the entire input file into a syntax tree, converting the whole syntax tree into a chain of "instructions", and then generating the whole output file, without ever freeing any storage. At this point, I concluded I would have to write a new compiler from scratch. That new compiler is now known as GCC; none of the Pastel compiler is used in it, but I managed to adapt and use the C front end that I had written."

By a strange series of coincidences, Andrew Tanenbaum was also instrumental in the Minix project, an operating system developed for teaching purposes. Linus Torvalds had been thinking of porting Minix to an Intel PC platform, but decided to develop a Unix-like operating system instead. That operating system was Linux, and its development depended to a large extent on the GCC compiler and other software developed by the Free Software Foundation, and, as they say, the rest is history.

In the early days of GCC various offshoots were forked, and later merged back in. An important offshoot was EGCS (Experimental/Enhanced GNU Compiler System), pronounced "eggs", started in 1997. EGCS development proved to be more creative and energetic than GCC development, so much so that in April 1999 the FSF halted development on its GCC 2.x compiler, blessed EGCS as the official version of GCC, and made the EGCS project the maintainer of GCC. With the release of GCC 2.95 in July 1999 the two projects were once again united.

The development of C compilers for small 8 bit microcontrollers is associated with companies such as Keil and IAR. These companies started life in the early 1980s and began by developing assemblers for 8 bit microcontrollers such as the 8051. By the early 1990s C compilers for the 8051 and other 8 bit microcontrollers, together with rudimentary IDEs, were available. These early development environments were fairly expensive, but the cost could be justified as coding in C was considerably more productive than coding in assembler. In practice, many embedded systems projects developed using either the Keil or IAR toolchains consisted of a mixture of C and assembler. Keil was purchased by ARM in 2005. With the increasing importance of 32 bit ARM microcontrollers, both Keil and IAR developed C compilers for ARM processors. In the last decade or so C++ compiler capabilities have also been added. These days open source GCC compilers for "bare metal" programming of "small" ARM processor systems are available, and can be used to develop applications for e.g. ARM Cortex M0, M3 and M4 devices.

Over at First Technology Transfer we have been developing and running embedded C programming courses for over 20 years and have witnessed the considerable developments in this field. As well as the use of C for programming 8 bit microcontrollers (8051, Atmel AVR and PIC16/PIC18), we have been involved with teaching the art of embedded C programming for 16 bit microcontrollers such as the PIC24 and dsPIC, and 32 bit microcontrollers such as the PIC32 and the ARM Cortex-M families. More recently, as 32 bit microcontrollers have become feature rich and powerful, there has been a move to develop applications for them in C++.

The original standard for C was that defined by the first edition of "The C Programming Language", written by Brian Kernighan and Dennis Ritchie. C was standardised as an ANSI standard in 1989, and the second edition of Kernighan and Ritchie's book describes this ANSI C. That standard went through a number of revisions which led to the next major C standard, C99, which added various features such as inline functions, new data types such as long long int, an explicit boolean type, and a complex type for representing complex numbers. Practically all modern C compilers for embedded systems support the C99 standard. The latest C standard is C11, now an ISO standard (ISO/IEC 9899:2011); the ANSI C11 standard is an ANSI adoption of the corresponding ISO standard.

FTT embedded C programming courses adopt an approach of developing C programs for a PC, using e.g. Microsoft Visual Studio or Eclipse as an IDE, and, additionally, an IDE for the target architecture of the particular family of processors, e.g. MPLAB X and XC8 for Microchip 8 bit processors, or Keil or IAR for ARM Cortex-M processors. For embedded Linux C programming courses we typically use the gcc compiler running on the target platform, or a cross compiler running on a PC running Linux, either natively or in a virtual machine.

Good, gentle introductions to C for those who are relatively new to programming are:

  • C Programming Absolute Beginner's Guide by Greg Perry
  • Beginning Programming with C For Dummies by Dan Gookin
  • C Programming with Arduino by Warwick Smith
C Programming with Arduino is a very good choice if you wish to start programming embedded systems straight away, and can serve as a good starting point for exploring C programming for IoT application development.

Hopefully this has "whetted your appetite for C". In later sections I plan to review books targeted at particular microcontroller families, data structures and algorithms in C, and programming with small real time operating systems such as FreeRTOS.

Here is a list of the books that have been mentioned/reviewed, together with links to purchase them on Amazon. Remember, you can always plan an expedition to North Street, Carshalton and have a rummage in the Internet Technical Bookshop. If you are coming from afar then send us an email, so that we can make sure there is someone in the shop to greet you.