UNIX: A History and a Memoir
Author: Brian Kernighan
Overview
This book explores the history of Unix, intertwined with my personal experiences at Bell Labs. It’s aimed at anyone interested in computing or innovation. Beyond the technical details, the book reveals how a two-person project became a cornerstone of modern technology. It covers the key technical features of Unix and its influence on software development, along with the creation of C and other languages. Equally important is the story of Bell Labs, a remarkable environment that fostered groundbreaking research, and whose organizational structure offers lessons for other organizations and software projects. Unix’s history raises a fundamental question: how can we foster an environment that consistently produces innovative, even world-changing results? While Unix’s success seems like a singular event, dependent on numerous factors that may be hard to replicate, there are lessons to be learned from the technical choices, the organizational structure, and the interactions among exceptional people. Unix also demonstrates that good ideas come from individuals, that a rich environment can amplify those good ideas, and that success often requires technical excellence, strong tools, effective communication, collaboration, and luck. I hope that my personal experiences and reflections will help others to understand how an idea can change the world.
Book Outline
1. Bell Labs
Bell Labs, AT&T’s research and development arm, provided a unique environment that fostered innovation. Researchers enjoyed substantial freedom, stable funding, and a rich supply of problems related to improving communications, which led to breakthroughs like the transistor. This environment of open exploration is a key ingredient of successful research. I arrived at Bell Labs in 1967, and proximity to great minds, together with the shared “Unix room” that came later, proved essential.
Key concept: One Policy, One System, Universal Service - AT&T’s mission statement from 1907 perfectly encapsulates the environment at Bell Labs: a singular focus on telephony and the freedom for researchers to pursue their own interests within that broad scope.
2. Proto-Unix (1969)
Proto-Unix emerged from the experiences of Bell Labs researchers with CTSS and Multics. CTSS introduced time-sharing, while Multics aimed to be an “information utility” but proved too complex. Following Bell Labs’ withdrawal from Multics, Ken Thompson began Unix as a small personal project.
Key concept: At some point I realized that I was three weeks from an operating system. - Ken Thompson’s quote illustrates the rapid development of early Unix.
3. First Edition (1971)
The first edition of Unix ran on a PDP-11 obtained under the guise of supporting patent applications. This resource constraint proved beneficial, forcing efficiency and creativity. The online manual, with its now-familiar format and concise style, was a significant early contribution.
Key concept: BUGS: rm probably should ask whether a read-only file is really to be removed. - This early manual entry reveals the informality and rapid evolution of Unix in its early stages.
4. Sixth Edition (1975)
The 6th edition of Unix marked its expansion beyond Bell Labs. Key features included a hierarchical file system, simple system calls, a user-level shell, pipes for connecting programs, and the grep tool. Software Tools, my book with Bill Plauger, promoted the Unix tools philosophy by providing Ratfor implementations for non-Unix systems.
Key concept: The number of Unix installations has grown to 10, with more expected. - From the 2nd edition manual, this quote emphasizes the system’s organic growth and adoption.
5. Seventh Edition (1976-1979)
The 7th edition was portable, running on several architectures. Steve Bourne’s new shell, sh, offered much-improved scripting capabilities. Language development flourished with tools like Yacc and Lex; notable outcomes included my work on Eqn, other document-preparation tools, and Awk.
Key concept: It was really with v7 that the system fledged and left the research nest. - Doug McIlroy’s observation marks the 7th edition as Unix’s transition to portability and wider impact.
6. Beyond Research
Unix spread beyond research, finding uses in AT&T’s operations support systems via the Programmer’s Workbench (PWB), which led to advancements like an improved shell by John Mashey and the Source Code Control System (SCCS). The system’s spread into universities under trade-secret licenses fostered a vibrant user community and innovations.
Key concept: The Unix operating system presently is used at 1400 universities and colleges around the world… - A contemporary snapshot of Unix’s growing influence.
7. Commercialization
AT&T’s divestiture in 1984, while disruptive to Bell Labs, paved the way for Unix’s commercialization by Unix System Laboratories (USL). USL’s push for standardization, culminating in System V Release 4, proved vital, despite various “Unix wars.” Bell Labs also focused on protecting the “Unix” trademark.
Key concept: Unix and C are the ultimate computer viruses. - This quote captures the spirit of Unix’s rapid, self-replicating spread.
8. Descendants
The 7th edition sparked two main Unix lineages: the Berkeley Software Distribution (BSD), fostered by Bill Joy’s work, and AT&T’s System V. BSD, with its liberal licensing, became the basis of systems like FreeBSD, while Unix’s commercial variants ultimately lost ground to Linux.
Key concept: “…from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.” - Darwin’s words perfectly capture the spirit of Unix’s evolution and diverse descendants.
9. Legacy
Unix’s legacy includes technical innovations like the file system, pipes, and text processing tools; it fostered a culture of collaboration, technical management, and a focus on good tools. While an exact repeat is unlikely, the history demonstrates the impact that focused individuals and a good environment can have.
Key concept: Unix swept into popularity with an industry-wide change from central mainframes to autonomous minis. - Ken Thompson’s observation in his Turing Award lecture explains the impact of Unix.
Essential Questions
1. What were the key factors that contributed to the success of Unix?
Unix’s success stemmed from a confluence of factors: the technical brilliance of its creators (Ken Thompson and Dennis Ritchie) who prioritized simplicity and uniformity; the collaborative and open environment at Bell Labs; the freedom and resources afforded to researchers; the timely availability of smaller and more affordable hardware; and a bit of luck. The technical underpinnings, like the hierarchical file system, pipes, and the user-level shell, were essential, but equally important were the organizational dynamics that prioritized technical competence in management, long-term research goals, and a cooperative culture.
2. How did the Bell Labs environment contribute to Unix’s development?
Bell Labs was a unique environment. Researchers had the freedom to pursue their interests, long-term funding shielded them from short-term pressures, and the close-knit research community fostered collaboration and knowledge sharing. The “Unix room” served as a hub for informal discussions and idea generation. The flat organizational structure and technical expertise of managers created a culture of mentorship and peer review, unlike many corporate or academic settings.
3. What are the core principles of the Unix philosophy, and how do they remain relevant today?
The Unix philosophy emphasizes building simple, modular programs that can be combined in powerful ways using pipes. It promotes using tools to automate tasks, a preference for text as the universal data format, and a culture of rapid prototyping and iteration. These principles continue to influence modern software development practices, including in distributed systems and cloud computing.
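As a concrete, purely illustrative sketch of that philosophy, the following Python snippet builds small, single-purpose filters and chains them the way a shell pipeline chains programs; the file name notes.txt and the pattern are hypothetical placeholders, not examples from the book.

```python
import re
from typing import Iterable, Iterator

def cat(path: str) -> Iterator[str]:
    """Yield the lines of a file, like the Unix cat filter."""
    with open(path) as f:
        for line in f:
            yield line.rstrip("\n")

def grep(pattern: str, lines: Iterable[str]) -> Iterator[str]:
    """Pass through only the lines matching a regular expression, like grep."""
    regex = re.compile(pattern)
    return (line for line in lines if regex.search(line))

def wc_l(lines: Iterable[str]) -> int:
    """Count lines, like wc -l."""
    return sum(1 for _ in lines)

# Equivalent in spirit to: cat notes.txt | grep error | wc -l
print(wc_l(grep("error", cat("notes.txt"))))
```

Each stage knows nothing about the others; lines of text are the only interface, which is what makes the pieces freely recombinable.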
4. What is the lasting impact and legacy of Unix on modern computing?
Beyond direct descendants like BSD and System V, Unix’s influence is visible in virtually all modern operating systems, including Linux, macOS, iOS, and Android. Its technical ideas, particularly the hierarchical file system and the notion that “everything is a file,” have been widely adopted. Even more deeply, its philosophy of small, composable tools has shaped software design across languages and platforms.
Key Takeaways
1. Modular Design
Unix popularized the modular approach by designing small, specialized programs that could be combined using pipes. This approach allows for flexibility, reusability, and easier debugging compared to large monolithic programs. This principle has become deeply embedded in successful software design, particularly with the use of libraries and frameworks.
Practical Application:
When developing a complex AI model or a software library, applying the modular design principle, and providing useful tools for combining the modules in a variety of ways, is crucial. For instance, creating an AI-powered recommendation engine can involve separate modules for data preprocessing, feature extraction, model training, and prediction generation. Clear interfaces and tools that can connect these modules enhance flexibility and reusability, allowing AI engineers to quickly experiment with various architectures.
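A minimal sketch of that structure in Python (the module names and the toy “model” are hypothetical, chosen only to show the interfaces, not a specific framework):

```python
from typing import Dict, List

def preprocess(raw_events: List[dict]) -> List[dict]:
    """Data preprocessing: drop malformed records."""
    return [e for e in raw_events if "user" in e and "item" in e]

def extract_features(events: List[dict]) -> Dict[str, List[str]]:
    """Feature extraction: build a user -> items-seen mapping."""
    features: Dict[str, List[str]] = {}
    for e in events:
        features.setdefault(e["user"], []).append(e["item"])
    return features

def train(features: Dict[str, List[str]]) -> Dict[str, List[str]]:
    """Model training: a stand-in that simply remembers what each user saw."""
    return features

def recommend(model: Dict[str, List[str]], user: str) -> List[str]:
    """Prediction: suggest items other users saw that this user has not."""
    seen = set(model.get(user, []))
    pool = {item for items in model.values() for item in items}
    return sorted(pool - seen)

# The driver wires the stages together; any stage can be replaced independently.
events = [{"user": "a", "item": "x"}, {"user": "b", "item": "y"}]
model = train(extract_features(preprocess(events)))
print(recommend(model, "a"))  # ['y']
```

Because each stage has a narrow, explicit interface, swapping in a real trainer or a different feature extractor does not disturb the rest of the pipeline.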
2. Collaboration and Openness
Unix’s development and adoption were greatly facilitated by open communication and collaboration, both within Bell Labs and among universities. The shared “Unix room” was instrumental in fostering this. Tools like UUCP and later email and the Internet further facilitated communication and knowledge sharing within the Unix community. Tools for collaborative work have become indispensable.
Practical Application:
In the realm of AI research, where collaboration is paramount, sharing research code and data is essential. Version-control tools like Git, heirs to a line of work that began with Unix-era systems such as SCCS, enable distributed development and seamless collaboration among AI researchers. Reproducibility of results and rapid iteration are essential in AI research, and tools for managing code and data make both possible.
3. Iterative Development
Unix followed an iterative design process, starting small and then evolving based on user feedback and practical experience. This is essential for dealing with complex problems where it’s not possible to get everything right up front. The ‘worse is better’ approach applies to operating systems too: iterate quickly, fix obvious flaws, get lots of user feedback, then iterate again, as fast as possible.
Practical Application:
In the world of AI, where complex systems and models are constantly developed, an effective technique for creating reliable software is to iterate rapidly and focus on continual improvement rather than striving for perfection on the first attempt. This iterative process involves prototyping, testing, and refining models and algorithms, borrowing heavily from Unix’s cultural approach.
4. Language and Tool Development
Unix nurtured an ecosystem of programming languages and tools. The development of tools like Yacc and Lex greatly simplified the creation of specialized languages, or domain-specific languages (DSLs). This allowed for tailoring languages to specific tasks, improving expressiveness and efficiency. DSLs are increasingly important in many parts of AI.
Practical Application:
In developing and deploying AI models, the language used for defining models can have significant effects on efficiency and ease of use. Creating domain-specific languages (DSLs) for specific AI tasks can simplify development and improve performance by matching the language to how AI engineers think and by providing efficient implementations. For example, when designing neural networks for image recognition, a DSL-like library such as Keras lets engineers define network structures and hyperparameters in a natural, intuitive way.
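As a brief illustration (using the standard tf.keras API; the layer sizes and input shape are arbitrary choices, not taken from the book), a Keras model definition reads almost like a specification of the network:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small convolutional classifier; the declarations read like a description of the network.
model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),          # grayscale images (arbitrary size)
    layers.Conv2D(32, 3, activation="relu"),  # 32 filters, 3x3 kernels
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),   # 10 output classes (arbitrary)
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```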
Memorable Quotes
Preface, p. 9
“One of the comforting things about old memories is their tendency to take on a rosy glow. The memory fixes on what was good and what lasted, and on the joy of helping to create the improvements that made life better.”
2. Proto-Unix (1969), p. 27
“At some point I realized that I was three weeks from an operating system.”
2. Proto-Unix (1969), p. 32
“We give them a dictionary and grammar rules, and we say, ‘Kid, you’re now a great programmer.’”
4. Sixth Edition (1975), p. 67
“Anything you have to do repeatedly may be ripe for automation.”
5. Seventh Edition (1976-1979), p. 125
“You can’t trust code that you did not totally create yourself.”
Comparative Analysis
Compared to Peter Salus’ “A Quarter Century of Unix,” which provides a broader historical context of the computing landscape, my book offers a more personal and intimate perspective, focusing on the culture and people at Bell Labs. Like “The Unix Programming Environment” that I co-authored with Rob Pike, this book emphasizes the practical philosophy of Unix, but adds the historical context of its development. Unlike “The Design and Evolution of C++” by Bjarne Stroustrup, which delves into the technical intricacies of a specific language, I focus on the broader language-creation ecosystem enabled by Unix and tools like Yacc.
Reflection
While Unix’s story is often romanticized, it’s important to acknowledge the limitations of the initial systems, especially with respect to security. The early Unix systems were small and simple, designed for a small community of trusted users, so security concerns were not paramount. Today’s systems are vastly more complex and operate in a hostile, adversarial environment, so robust security is essential. Furthermore, the freedom and resources provided to researchers at Bell Labs, a consequence of the regulatory environment and AT&T’s quasi-monopoly status, are difficult to replicate. Despite these caveats, Unix’s emphasis on modularity, simplicity, and the power of tools remain highly relevant today. Its focus on developer productivity and on tools that empower programmers continues to inspire modern software design and development practices in the age of cloud computing, mobile apps, and artificial intelligence.
Flashcards
What was Bell Labs?
AT&T’s research and development arm, known for its open research environment and key innovations like the transistor.
Who were the key figures in the development of Unix?
Ken Thompson, Dennis Ritchie, and Doug McIlroy.
What is a pipe in Unix?
A mechanism that connects the output of one program to the input of another.
What is grep?
A pattern-searching program.
What is Sed?
A stream editor used for text transformations.
What is Awk?
A pattern-action language for text processing and data extraction.
What is Make?
A tool for automating software builds based on file dependencies.
What is BSD?
The Berkeley Software Distribution, a family of Unix systems developed at the University of California, Berkeley.
What is Linux?
An open-source Unix-like operating system begun by Linus Torvalds, initially inspired by Minix.