Software People

From Eli's Software Encyclopedia
Revision as of 04:24, June 20, 2025 by Eli
Software People
Title: Software People: An Insider's Look at the Personal Computer Software Industry
Author: Douglas G. Carlston
Publisher: Simon & Schuster, Inc.
Release Date: 1985
Genre: History
ISBN: 0-671-50971-3
Format: Hardcover
Country: United States of America
Language: English

Description from the Book Jacket

Entrepreneurs, eccentrics, prodigies, and flim-flam men - the software people emerged from hobby shops and backwoods cabins to create the explosive, lucrative business of personal-computer software publishing. Now, insider Douglas G. Carlston chronicles the birth of the industry and tells the tales of the small group of extraordinary young people who started out pursuing an obscure hobby and ended up spearheading a new campaign in the information revolution.

Back in the glory days of the late '70s and early '80s, bright young entrepreneurs like Doug Carlston shook the change out of their families' and friends' pockets to finance start-up companies, and found themselves millionaires overnight - the Gold Rush era. But just as suddenly, the software publishing business went bust - the Shakeout - leaving a lot of people wondering what happened.

Doug Carlston, whose own company, Broderbund Software, Inc., is thriving, has been in the middle of this volatile industry from the start. In Software People, he takes a personal look at the programmers, adventurers, and home-brew tinkerers who provided the fuel for the personal-computer revolution:

  • Teenagers Bill Gates and Paul Allen started a small software company that topped the $100 million sales mark before Gates turned thirty.
  • Paul Lutus, hermit, self-educated dropout, and former street person, built himself a primitive cabin in Southern Oregon where he wrote the program that earned him over a million dollars in royalties every year of the Gold Rush - nearly $4 million in his best year.
  • Terry Bradley and Jerry Jewell were running a Radio Shack franchise in Sacramento, California, in the spring of 1980, when a confident young programmer named Nasir Gebelli walked in with a couple of microcomputer programs. Less than a year later, Gebelli's royalties were in six figures and Bradley and Jewell were in charge of a multimillion-dollar company that was on its way to dominating the industry. By 1984 that company was bankrupt.

There was Bill Budge, the young programmer whose publisher promoted him as a pop star; Joyce Hakansson, self-proclaimed "denmother" of the educational software developers; Margot Tommervik, who won $15,000 on a television game show and used it to start the computer magazine boom; Ken and Roberta Williams, who created microcomputer fantasy games, hired a crew of teenage programmers, and built an empire in the Sierra foothills; and the Japanese software people, whose software boom is just beginning. Carlston tells about today's survivors, and analyzes the factors that led to their success, discusses what happened to those who didn't survive the Shakeout, and speculates on what might have brought some of them down.

Harvard graduate Doug Carlston abandoned his law career in 1979 because he was making almost as much money - and having a lot more fun - writing his Galactic Saga programs. In 1980, at age 32, Doug enlisted his younger brother Gary to start a software publishing business out of their Eugene, Oregon, home. By 1983, their annual gross sales topped $10 million. Along with their sister Cathy, Doug and Gary continue to run a healthy company in an industry littered with the remains of former software giants. Software People is the real story behind today's digital boom and bust, and the people who made it happen, written by one of the drama's most prominent players.

Copyright

To:
Gary and Cathy Carlston,
without whom Broderbund would never have existed.

Copyright (c) 1985 by Doug Carlston

All rights reserved including the right of reproduction in whole or in part in any form.

Design by Shirley Covington
Jacket Design by Lorraine Louie

Library of Congress Cataloging-in-Publication Data
Carlston, Douglas G.

Software People
Includes index.
1. Computer software industry - United States.

I. Title
HD9696.C63U51487 1985 338.4'700536'0973 85-18355 ISBN 0-671-50971-3

Acknowledgments

I want to thank John Brockman, who conceived of this book, and Frank Schwartz, who believed in the project and gave me the confidence to tackle the job. I'd also like to thank Howard Rheingold, who helped me write much of this book and whose knowledge of his craft made the rest as intelligible as it is. Without his gentle prodding, I never would have succeeded in committing anything to paper. I'd like to extend my appreciation to my assistant, Janetta Shanks, whose humor and organizational skills helped bring the book together. Many software industry people contributed their time and let me pick their brains for the book: Paul Lutus, Ken Williams, Ed Auer, Margot Tommervik, Bill Budge, Messrs. Son, Hoshi, Kudo, and Gunji, and Bill Baker. Thank you all. My gratitude extends to all the Broderbunders who kept our company on such a steady course that I could spend weekends working on the book: Gary Carlston and Cathy Carlston, Ed Bernstein, Bill McDonagh, Stu Berman, Debbie Hipple, Jane Risser, Jon Loveless, Brian Eheler, Brian Lee, Al Sonntag, Allan Kausch, and all the others. The three outside groups whom we most credit for our early success are Dave Wagman and Bob Leff of Softsel, Al and Margot Tommervik of Softalk magazine, and Minoru Nakazawa of Star Craft. Finally, I thank my wife, Mary, for her patience and support on this project during our first year of marriage.

Introduction

When my brother and I started Broderbund Software in 1980, we had no idea that it would become one of the largest home computer software companies in the world. In fact, we originally entered the software business by accident. We had no business plan, no scheme to make our fortunes. We were just trying to come up with a way to pay our next month's rent.

In most ways, we were unlikely candidates for the roles we assumed. Neither of us had any business experience to speak of, neither of us knew very much about computers, and neither of us lived anywhere near those centers of innovation where so many high-technology firms were springing up. Before we started our company, I was a lawyer, practicing my trade in rural Maine. My brother Gary had just returned from Sweden, where he had spent five years working as a coach for a women's basketball team. He was now living in Oregon, where, after a stint as field director for the March of Dimes, he became involved in an importing business that proved to be unsuccessful.

What we had was computer fever - a malady we shared with all the other entrepreneurs who were forming similar companies. Of the two of us, I was the one who was more heavily stricken. Programming can be an addiction - those who get drawn into it often forget jobs, family, and friends in their absorption with these fascinating machines. My own addiction began in 1978 when I took the fateful step of entering a Radio Shack store in Waterville, Maine, in order to take a closer look at the computer that was displayed in the window. I ended up walking out with a TRS-80 Model 1 tucked underneath my arm. My life has not been the same since.

I wasn't a complete stranger to computers, however. In the mid-1960s, as a teenager, I had taken a summer course on computers at Northwestern University. In the following years I found a few programming jobs, first at the University of Iowa in Iowa City (where my family lived when I was in high school) and later at the Aiken Computation Lab of Harvard University, where I was an undergraduate. My fellow programming fanatics and I used to jam chewing gum into the locks on the doors of the chemistry building just so we could sneak in after midnight and play with the big IBM 1620. But college was an exciting place for me, and there were lots of other distractions, so my interest in computers waned. By the time I saw that computer in the Radio Shack window ten years later, I had forgotten everything I once knew about computers-except how much fun they were.

When I obtained my TRS-80, I was a lawyer in Newport, Maine, a small town that had fewer than 5000 residents and was close to my parents' summer place. Having grown tired of practicing corporate law in Chicago, I had retreated to Maine in 1977, opened a law practice with a friend, and divided my time between lawyering, building houses, and skiing. All of this had been fun at first, but rural life was starting to bore me, and I began looking for a distraction.

I bought the computer because I thought it would be fun to use. I also had a notion that I could computerize a lot of the routine work around my law office. At that time I knew of a lawyer in Northern Maine who traveled around the area in a Winnebago camper that was fully equipped as an office and that included a microcomputer system; he was able to crank out wills, trusts, and deeds in a fraction of the time normally required for such work and at a fraction of the cost that most lawyers charged. Our law office needed to be able to compete with him, I thought. We needed to computerize.

I know now that those thoughts were purely rationalizations. As I started to play with the computer, all my old fascination with the technology returned. These tiny machines could do almost as much as the huge, expensive models I had first encountered! My interest in the law business declined, and I spent more and more of my spare time learning the tricks of programming. I did eventually write the legal software for my firm, but we never really used it. At the same time I also wrote a game. Although I saw it as a weekend amusement, that game was actually the beginning of the end of my law career.

The game was a simulation - a science fiction fantasy called Galactic Empire. I wrote the program in a couple of weekends for my personal enjoyment. And when I say a couple of weekends, I mean a programming marathon that started Friday afternoon and wrapped around to Monday morning, relieved only by occasional catnaps and snacks. When the game was finished, it turned out to be a lot of fun to play, and so I started adding more and more features to it, until I finally ran out of space in the computer's memory. Even these powerful new microcomputers can hold only so much in their electronic memories before they cry uncle and refuse to run a program.

I then began to look for ways to make my programming code more compact so that I could add just one more feature. Like minuets and mathematical equations, programs should be elegant as well as formally correct, and it takes a very skilled, experienced programmer to tinker with a program without destroying its elegance. My own program structure ended up looking like a tangled ball of spaghetti, but only another programmer would have noticed how ungainly it was. For those who were just playing the game, the programming code was invisible.

Imagine you're playing the game. What you see on the screen is the cockpit of a spaceship. You're at the helm, where you see, in the upper left corner of the screen, a window that looks out into interstellar space. If a planet comes into view, your onboard computer identifies it for you. Below the viewport you see a computer screen (after all, spaceships are bound to have computers on board, so I included a "computer within the computer"), and off to the right is your fleet detail that tells you at a glance how many fighters, transports, and scout ships you have at your disposal. The objective of the game is to conquer a cluster of twenty planets with unlikely names like "Javiny" and "Ootsi." Accomplishing your goal within the 1000 years you are allotted (people live longer in the future) requires considerable logistical sophistication.
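The screen layout described above can be sketched in a few lines of modern code. This is purely illustrative - not Carlston's actual TRS-80 BASIC - and every name in it (Fleet, render_cockpit, the sample numbers) is invented for the sketch:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fleet:
    """The fleet detail shown at the right of the cockpit."""
    fighters: int
    transports: int
    scouts: int

def render_cockpit(planet_in_view: Optional[str], fleet: Fleet, year: int) -> str:
    """Render the three regions described in the text as a text-mode screen:
    a viewport, the onboard 'computer within the computer', and the fleet detail."""
    viewport = f"[VIEWPORT] {planet_in_view or '(empty space)'}"
    computer = f"[ONBOARD COMPUTER] Year {year} of 1000"
    detail = (f"[FLEET] fighters: {fleet.fighters}  "
              f"transports: {fleet.transports}  scouts: {fleet.scouts}")
    return "\n".join([viewport, computer, detail])

print(render_cockpit("Javiny", Fleet(fighters=40, transports=12, scouts=3), year=250))
```

The real game, of course, had to fit this display and all its logistics into a few kilobytes of TRS-80 memory, which is what makes the compaction struggle described below so vivid.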

In fact, the game was a fascinating intellectual exercise. It was far more fun to play than I had ever thought it would be. What made it so special was that I had never been able to play anything like it before. Without computers it would have been impossible to create such a simulation. In other words, a whole new area of entertainment had just been created. It's hard to describe how excited I felt. I love games of all sorts. The idea that the world might suddenly be filled with hundreds of brand-new games was unbelievably thrilling.

At that time, however, I owned very little software, and Radio Shack carried almost none. It was then that the store manager told me about a wonderful chess program published by a Boston-based company called Personal Software; eventually, I drove all the way to Boston to get a copy. But one program simply wasn't enough to satisfy my appetite. I then discovered 80-NW, a home-printed four-page computer magazine that was dedicated to users of the TRS-80. I bought a subscription to the magazine and was treated every other month to an ever thicker book filled with programming tips and advertisements for microcomputer-related products. Dozens of programs were available, and I could have bankrupted myself in a week simply by ordering all the software that struck my fancy. So one day I thought up a scheme to get more games.

I sent a copy of Galactic Empire to four companies that had run ads in 80-NW. Would they be interested in publishing my program? I asked. And, by the way, would they consider sending me their line of software, gratis, as part of the deal? To me, creating a game and trading it for other games promised to be an intensely satisfying transaction. It would be great fun if I could get away with it.

The scheme succeeded beyond my wildest dreams. Scott Adams of Adventure International sent me his whole line of adventure games from Florida. Art Canfil of Cybernautics sent me his program Taipan from San Francisco. The Software Exchange (TSE) in Milford, New Hampshire, sent me a huge pile of games. And in return, everybody wanted to publish my program, a procedure that was a little different in the late 1970s than it is now. Then, programs were stored on cassette tape - a fairly slow and clumsy medium that was eventually replaced by floppy disks. Moreover, in those days no one thought to ask for exclusive publishing rights to a program, so with my permission Adventure International, TSE, and Cybernautics all published my game, with varying degrees of success. I was in seventh heaven, having never imagined that writing software could be profitable!

Compared with the effort of writing software, the activity of selling it was not only profitable but also remarkably fast-paced, right from the beginning. Four days after I sent Galactic Empire to Scott Adams in Florida, I received a call from Scott's wife, Alexis. Yes, they loved the program, she said. In fact, they already had orders for it, and if I would give her oral permission to publish, they would start shipping later that day. The contract would follow later. Scott and Alexis were as good as their word. I received my first royalty check, for a couple of hundred dollars, two weeks later. I was astonished. People were actually paying me to have fun!

I started to take the whole programming business a little more seriously, and in time my obsession with the TRS-80 began to destroy my law practice. I couldn't help myself. Even in the courtroom, I'd suddenly find myself thinking of a more efficient way to write a particular piece of code, or I'd realize that there was a logical defect in a program that wasn't doing what I wanted it to do. My legal briefs ended up with bits and pieces of programming code scribbled in the margins, and I could hardly wait to get back to the office to key my ideas into the machine to see whether they worked.

Finally, in October 1979, I dissolved my law practice. I was having a lot more fun writing computer games than I was drawing up wills. The fact that I was also making a modest but steadily increasing income from my programming efforts had something to do with my decision, but at the time it wasn't at all clear that this was a prudent career move. I had no idea whether the freelance programming business was going to continue to be financially viable, but I abandoned my law career anyway because the microcomputer software world drew me in a way that I found irresistible.

It immediately struck me - when I realized just how possible it was to make a living at my kind of programming - that I had an opportunity to lead an altogether different way of life. It took me a while to accept that I had stumbled upon such a beautiful loophole in the rules of life, but once I did I knew that my job for the immediate future was to create fantasies and translate them into computer programs. If you think that sounds a lot more like play than work, you know how I reacted to the prospect of this new career. The kind of fascinating sci-fi sagas that had occupied my spare hours - flying interstellar craft to a thousand strange planets - was now my profession as well as my avocation.

It didn't take long for my new career to change the way I lived my life. Something very different from everything I had previously planned for myself suddenly became possible, and I was still young enough to be tempted by the prospect of a romantic journey into an uncertain future. So I went along with the opportunity to become an electronic-age vagabond. I didn't need an office or more equipment than I could fit into the trunk of my car. In fact, I could write my fantasy programs from wherever I could plug in my computer, so I started traveling across America. With my dog in the front seat and my computer and a few other possessions in the back seat, I headed in the general direction of Oregon. I stopped along the way to visit friends and relatives, play with the computer, and shed the three years of Harvard Law School and four years of legal practice. I was free, for the first time in years.

Three thousand miles later, I arrived in Eugene, Oregon, where my brother Gary lived. He had given up his job with the March of Dimes and was now investing all his time and energy in his ill-fated importing business. One day when he was feeling particularly broke, I suggested that he try to sell some of my programs; after all, everyone else seemed to be making money doing it. By this time, I had followed up my first simulation program with another, Galactic Trader; eventually, I finished four programs in the Galactic Saga series. On the morning of February 20, 1980, Gary called a fellow named Ray Daly, owner of The Program Store in Washington, D.C., and talked him into ordering $300 worth of our products. We then officially formed a company, and, using a name from one of my science fiction simulations, we called ourselves Broderbund. A software company was born.

That evening, Gary and I had a celebration dinner at a local restaurant to fortify us for the arduous task of filling Daly's order. Computer software was still sold in the form of cassette tapes, and so we spent most of the next day with three cassette tape recorders, dozens of cassettes, plastic packing bags, and staplers strewn all over the living room floor as we frantically tried to copy enough programs to fill the order on time. Our efforts were successful. We packed the cassettes into the plastic bags and sent them off. At the top of each bag was our (hastily produced) business card and a punched hole that retailers used to hang the bags on the pegboard racks that passed for point-of-purchase displays in those days.

Things moved very quickly from that point. We had some financial problems in our first year, but by the third year of operation, we had moved from Eugene to San Rafael, California, a community in Marin County, twenty minutes north of the Golden Gate Bridge. We had hired more than forty people to help us and were occupying a fairly large building. Our company was selling millions of dollars' worth of software annually. Software pioneers who had been only names in magazines or the heroes of hobbyist legends were now my colleagues, competitors, and, in some cases, friends.

Broderbund is now around the tenth largest software publisher in the microcomputer industry, while the software industry itself has become a significant slice of the gross national product. Indeed, the software business, and particularly software people, seems to have attracted a disproportionate amount of attention from the general public.

Most people are not particularly interested in investment bankers or manufacturers of pantyhose. But the readers of magazines as different as Time and Ms., Fortune and Playboy, Forbes and Cosmopolitan have been eagerly following the tales of Adam Osborne and Steve Jobs, Bill Gates and Mitch Kapor. Perhaps the sudden fascination with microcomputer Wunderkinder is a result of the youth of these entrepreneurs. Perhaps it is because people sense (or are told) that this mysterious, intangible, and volatile new commodity promises to have an unprecedented impact on America and the world. It also could be due to an interest in Horatio Alger stories like ours and those of people who have made a lot more money than Gary and I have.

Or perhaps it is because people are always intrigued by extraordinary characters who do what they do because they love doing it and, almost unintentionally, end up changing the face of our society in the process. I know that I continue to be fascinated with software people, many of whom happen to be my relatives, my friends, my employees, and my business associates. Some of them think and behave in ways that have to be labeled eccentric. Some of them are no more eccentric than an insurance salesman. Many of them are extremely bright, even geniuses, when it comes to thinking up the intricate codes that cause computers to serve as video games, software tutors, or electronic accountants. Some of them know little about programming but a great deal about marketing products. All of these people share, in varying degrees, an obsession with personal computer software- an obsession that in fact led to the birth and phenomenal growth of an entire industry.

The primary reason for this book is to tell the stories of the remarkable people who created and have been the driving force behind the microcomputer software industry. Although I make no claim to being an official historian of the industry, I hope to convey something of its unique nature by describing the people I know or know about who have played major roles in the evolution of the software business from its infancy to its coming of age.

Time is highly compressed in the software business because computer technology changes so quickly. The period during which the events in this book took place, from the age of the first hobbyist computer, the Altair, to the present era of Apple and IBM, lasted only around ten years. The first, strictly hobbyist phase of the personal computer industry began in 1975, when a few intensely devoted hobbyists began to put together their first Altair kits. By 1978, the hobbyists were putting together companies to sell the first generation of home computers that didn't have to be assembled from kits. By 1980, the newborn video game and personal computer companies had grown at a dizzying speed into a billion-dollar industry.

For a time, it seemed that all who dipped their pans in the software stream came up with a few nuggets, if not an entire lode. The software gold rush began in 1980, the first of several years in which teenage programmers and software entrepreneurs who were still in their early twenties made personal fortunes.

Until the middle of 1983, companies continued to proliferate and prosper, riding the unprecedented annual rates of growth in the computer industry. Then the personal computer market began to level off, and people who had been making fortunes for years suddenly found themselves losing fortunes in months. This period, which extended through 1984, is the era generally referred to as "the shakeout." In examining the shakeout, which was at least as important to the history of the software industry as was the gold rush (albeit less glamorous), I have attempted to point out some of the underlying causes of several of the business disasters that occurred during that period.

In this book are the stories of people who lived the events of these various software eras. Although the individual stories overlap, the overall order of their presentation is roughly chronological, progressing from the hobbyist days through the early years, the gold rush period, and beyond the shakeout to the present.

Each of the people profiled in this book helped to shape the extraordinary character of the software industry. Intense, volatile, creative, lucrative, adventurous, and regularly eccentric - it is an industry that, in terms of its spirit and complexity, is quite unlike most other contemporary American businesses. Moreover, it is dominated not by a single type of individual but by a variety of people. The hobbyist-programmers might have started the whole thing, but the sudden blossoming of the home computer software industry came about as a result of the efforts of many different kinds of people who played very different roles - programmers, entrepreneurs, publishers, developers, and marketers. And the nature of their products varied just as widely across different software genres that addressed very different markets, from games to business productivity tools to educational programs. If anything, the software community is an eclectic collection of different interests, linked only by the personal computer that makes the market possible.

Some of the people I've written about here were included because of their importance to the software industry. Some people are included because they exemplify a certain kind of software legend. Some of them are my friends or acquaintances whose stories are closely related to mine. There are many stories I did not tell, including those of many friends. To them I apologize - my intent was more to give a feeling for the industry than to provide the definitive history.

Many of the principals in the industry, whom I did not know personally when the events described here happened, were interviewed for the purposes of this book. In other cases, where I did not interview the subject in person, I have done my best to sift the most likely true stories from the vast and contradictory lore of software legends, which are already becoming embellished with each retelling as the age of the Altair recedes into history.

Bill Gates and Paul Allen are foremost among the people who are included here. Their names cannot be omitted from any history of the software industry - partially because of the continued success of their company, Microsoft, and partially because they were present at the beginning of the microcomputer era, during the pioneering Altair days. Gates was nineteen years old when he left Harvard to join Allen in New Mexico to create software for the first hobbyist microcomputer. Less than ten years later, the company they founded topped $100 million in sales.

Other major figures in the founding of the microcomputer software industry include Dan Bricklin and Bob Frankston, who came up with the first microcomputer spreadsheet, VisiCalc, and Dan Fylstra and Peter Jennings, whose company, VisiCorp, marketed Bricklin and Frankston's product, making it the first phenomenal best-seller in the micro market. Their program made the four principals millionaires, established personal computers in the business world, and ensured the early success of Apple because so many business people bought Apples to run VisiCalc.

Without programmers, there would be no software industry. The legendary programmers of the gold-rush years number in the dozens. I chose three for this book. One of them, Bill Budge, is not only an example of the new breed of programmer as fine artisan, but also an old friend of mine. His programs Raster Blaster and Pinball Construction Set were milestones in software history, acclaimed for their artistry as well as the sheer dollar volume of their sales.

Then there's Paul Lutus, the fabled programmer-hermit of the Oregon wilderness. Paul exemplifies the legend of the eccentric character with a knack for programming who made himself a millionaire by writing a best-selling program while living in his backwoods cabin.

The third programmer profiled here is another one of those who can't be excluded from any history of the software industry, although I don't know him personally. John Draper didn't make himself a millionaire like Lutus, or create a masterpiece of programming elegance like Bill Budge, but he was perhaps the first of the maverick techno-wizards because of his past career as a colorful anti-hero known as "Captain Crunch," king of the "phone phreaks" (until he was busted for playing with the phone company's switching network without paying for the privilege). Years later, a program he wrote led to one of the first and biggest entrepreneurial coups of the software gold rush.

Indeed, aggressive entrepreneurship has been one of the major forces behind the phenomenal growth of the software industry. Most of the early software entrepreneurs were programmers who discovered that they could make a lot of money marketing their own programs. Others saw the opportunity to make fortunes by marketing other people's products. Bob Leff and Dave Wagman, for example, founded a software distribution business that went from a shoestring budget to $150 million annually in a little over four years. They distributed Broderbund's products when we first started publishing, and they even bought us disks when we couldn't afford to fill their orders.

Unlike me, Bob and Dave are the kind of successful entrepreneurs who take advantage of all the high-life perks of their occupation- from the champagne they gave away to their suppliers to the matched Porsches they bought for themselves. Like all the most successful entrepreneurs in the software Wild West show, they also work twelve to eighteen hours a day.

My friend Ken Williams has a different kind of entrepreneurial story altogether. Still living out his own brand of fantasy up in the Sierra foothills, he and his wife/partner, Roberta, and their tribe of well-heeled programmers make up the single largest component of the workforce in Oakhurst, California, and are the dominant cultural element in a territory where the last big action was the gold rush of 1849. Less than four years after Roberta convinced Ken to program the adventure-fantasy game she had designed, their company, "Sierra Online" (now called "Sierra"), reached a level of more than $6 million in annual sales.

Ken and Roberta's company was one of the first and most successful of the software publishers - companies like Microsoft, Broderbund, Sirius, Synapse, and a dozen others profiled here, that concentrated on marketing products created by in-house or freelance programmers. A small number of these companies, most of them associated with Apple-oriented products, most of them located in California, were, along with Broderbund, part of a loose group of friendly competitors I've called the Brotherhood.

Then there are the developers, who came along a little later than the first publishers. Developers come up with the ideas for new programs and hire programmers to create these products, which will then be sold or licensed to software publishers for marketing. Some of these developers, like Joyce Hakansson, concentrated on a specialized segment of the industry, such as educational software. Others specialized in games or productivity software. Some developers were either started by or backed by venture capitalists - groups of investors who often guided (and occasionally took over) the management of the companies they invested in.

Not all the software entrepreneurs were programmers, distributors, publishers, or developers. There were those like Al and Margot Tommervik, founders of Softalk magazine, whose focus was on the personal computer culture. In the case of the Tommerviks, their market and their community encompassed that segment of the computer subculture who were devoted to the use of Apple computers. The Tommerviks, who started their first magazine with the money Margot won on a television game show, were among the more prominent casualties of the software shakeout.

Because the computer revolution is a worldwide phenomenon, and because my own company in particular has had a long history of dealing with Japanese software companies, I have also written about Japan's software community and industry. Even as the companies mentioned in the foregoing paragraphs struggled to create an industry in the United States, a parallel struggle was taking place in Japan. There, hobbyists followed the development of microcomputers with every bit as much interest and enthusiasm as their American counterparts. And in response to the growing market in Japan for products made with or for microcomputers, a small group of Japanese entrepreneurs emerged to build a fledgling industry in their country.

Computer addiction knows no national boundaries, it seems, and it appears that Japanese hobbyists are no more immune than Americans to the lure of entrepreneurship. Consider Masaaki Hoshi, who founded I/O, now the largest microcomputer magazine in Japan, strictly as a part-time enterprise to help him keep in touch with other hobbyists. As so many of the cottage entrepreneurs did here in the United States, he started small in 1976 and got caught up in a wholly unexpected wave of consumer enthusiasm for what had, until then, been interesting only to a small group of hobbyists.

Or consider Akio Gunji and Kazuhiko "Kay" Nishi, who worked with Hoshi until they saw an opportunity to compete with him. In 1977, they started their own magazine, ASCII, which they then used as a base to turn their operation into an empire that included one of the largest software publishers and distributors in Japan, and several of the most successful magazines in the industry as well.

One of the people who occasionally wrote articles for I/O and ASCII was Yuji Kudo, an amateur photographer and avid collector of model steam locomotives. When he started his own software company, he named it after his favorite locomotive and turned Hudson Soft into Japan's largest microcomputer software publisher. Another successful entrepreneur in Japan's software world was Jung-Eui Son, a software distributor and publisher who started his own magazine when his competitor's magazines wouldn't take his advertising. His distribution company, Soft Bank, which he started when he was a teenager, ended up as the largest microcomputer software distributor in Japan.

More than a few other software people have not yet been introduced, although their stories are told in later chapters: Among these are Mary Carol Smith and Don Fudge of Avant Garde Productions; Nasir Gebelli, the first superstar programmer; Scott Adams of Adventure International, my own first publisher, whose business has been eclipsed by those companies founded by many of his former employees; and Bill Baker, the twenty-one-year-old deal maker who built a company on the basis of one of John Draper's creations, then sold the company for $10 million on the eve of the shakeout.

There are still more whose stories we'll encounter along the way. For now, we'll start at the beginning of the personal computer era, way back in the "ancient" days of the mid-1970s, when the first microcomputer kits were assembled by many of the people who were to become the leaders of today's microcomputer software industry.

The Birth of an Industry

The Age of the Altair

In the beginning was the IBM mainframe. The microworld was formless and devoid of software. On the first day, Intel brought forth the 4004, fashioned from the Silicon of the Valley. MITS said 'Let there be Altair,' and the microcomputer was created. Then Microsoft, created in the image of Gates, begat the microcomputer software industry. The hackers were the first prophets, and the homebrewers were the patriarchs, but the children of Intel remained in bondage until the Woz led them to the promised Apple with one bite missing...

The idea of a Scripture of the Microcult is not entirely a joke. The origins of the microcomputer industry are indeed spoken of in quasi-mythological tones by many people in the personal computer culture, even though the events upon which the myths are based occurred no more than a decade ago. Gates and the Woz are real people who happen to have created multimillion-dollar companies before they were thirty. MITS was a real place that symbolizes to computer freaks what Kitty Hawk means to aviation fanatics. And there is a definite evangelical streak to be found beneath the entrepreneurial surface of the founders of the earliest microcomputer businesses.

Many of my colleagues started out as hobbyists and ended up as industrialists, and although their fortunes have diverged in a dozen unlikely directions over the past ten years, many share a reverent nostalgia for the 1975-76 era- the Age of the Altair. Indeed, for many, that era represents a kind of magical time, but in reality it was directly experienced by only a rare few survivors, who still tell the tales of Altair to the multitudes of recent converts.

Altair was the name of the first widely used hobbyist computer based on the new microprocessor technology. It was a do-it-yourself kit that preceded the factory-assembled Apples and Commodores and IBM PCs. Compared with today's personal computers, the Altair was computationally puny and unbelievably difficult to program. But it inspired a group of people who believed it was possible to have computers for their own personal use. They were solitary, garage-based computer tinkerers who called their avocation "home-brewing" and who were all surprised when they discovered how many others were fiddling with Altairs, or patiently waiting for Altair parts to be shipped to them.

But these Altair users were more than the forerunners of the personal computer enthusiasts who were to buy Apples and PCs five or six years later. They were the spearhead of the microcomputer revolution, and although some of them sensed it at the time, none of that first wave could have predicted how much money, power, and attention would come their way over the course of a decade.

Out of that hobbyist network of a few thousand people, a dozen or so ended up creating the personal computer technology we see today in millions of homes and offices. Some of those people are still tinkering- happily or not. A few of them are personally worth tens or hundreds of millions of dollars. Not all of them are on speaking terms with one another any longer, but in the beginning, all were unified by their one common interest: the microcomputer. In fact, the microcomputer industry is the only major one in the world that started out as a club for teenage enthusiasts.

Both the hardware and software branches of the industry were directly influenced by these enthusiasts who, as relatively small groups of hobbyists, gathered in the mid-1970s to share ideas. Of those groups, two are most notable. The first included the now-legendary amateur computer builders in the San Francisco Bay Area who called themselves the Homebrew Computer Club and who ended up being the founding fathers of the microcomputer hardware industry. The second group consisted of a pair of very young but decidedly professional programmer-entrepreneurs from Seattle. It was their creation that was the beginning of what has become today's microcomputer software industry.

Until recently, the better-known story behind the microcomputer revolution has been on the hardware side of things, and yet, as a mathematically minded programmer would say, hardware is necessary for making a computer, but it isn't sufficient. You need more than circuitry to make a computer do anything useful. You need software- coded instructions that turn a computer into a word processor or spreadsheet, telecommunication terminal or video game. The lesser-known chapters of this story, then, are about software, and they're chapters where people like me step into the scenario.

Still, it wasn't until the waning days of the homebrew era that software people started to become important. When the Age of Altair gave way to the eras of Atari and Apple, the software epoch had only just begun. Its dawning marks the point in history where the physical components of computers became less important than the human ability to think of new things to do with these machines. Nevertheless, no book about microcomputer software people can exclude the Altair story or a discussion of microcomputer hardware, nor can it leave out an early tribute to Ed Roberts, Paul Allen and Bill Gates, Steve Wozniak, and the other legendary homebrewers whose hobby unexpectedly gave birth to a new and unprecedented kind of industry.

Although I know several of these founding fathers, my own roots in the industry do not go as far back as the Altair Era. I was a young attorney working in Chicago when the era began and the first microcomputer kits were marketed by a company located between a laundromat and a massage parlor in a shopping center in Albuquerque. I didn't write my first line of microcomputer code until three years after the now-famous January 1975 issue of Popular Electronics told of a wondrous new toy for electronic enthusiasts. That new toy was an affordable computer, and the company that sold it by mail order was called Micro Instrumentation and Telemetry Systems- fondly remembered as "MITS."

A brief discussion about the technology and history of the microprocessor and microcomputer is necessary in order to explain what was so important about the mail-order microcomputers sold in the 1970s by a small company that no longer exists. Computers are not as hard to understand as they have been made out to be. You might need to know esoteric details of electronic circuit design if you intend to build a computer, and a healthy knowledge of how the computer operates is helpful if you want to successfully program one. But you need to know only a few simple, general principles to understand how computers work.

The first principle has more to do with economics than electronics, and it is also the hardest to believe: Computers get smaller, more powerful, and less expensive as time passes. Computers and software change very quickly because the electronic technology on which computers are based also changes quickly. Most of these changes are triggered by the continuing miniaturization of computer components. In fact, the computer revolution has been strongly influenced by the electronic miniaturization revolution, the importance of which lies in the relationship between the size of electronic components and the efficiency of computers built from those components.

To understand the power of software, you need to know only two essential facts about computing machinery. First, a computer is a machine for interpreting instructions, especially instructions that tell it how to imitate other machines. Second, both the machine that interprets the instructions and the instructions themselves are built from very simple elements- electrical switches that can be turned on and off. In essence, a computer is a collection of switches, which can be vacuum tubes, transistors, integrated circuits, or any other technology that can create a network of devices that are either on or off.

The real power of a computer lies in how and in what patterns you turn those switches on and off- the software. When you want to create switching patterns that can accomplish complex tasks like calculating the results of physics equations or storing census statistics or creating pictures on a screen, you need lots of switches- very fast ones.
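
The idea that a computer is nothing but a network of two-state switches can be made concrete with a small sketch (written in Python, a modern language used purely for illustration; none of this code comes from the book). A single switch-like primitive, here a NAND gate, is enough to compose every other logical operation and even a one-bit adder:

```python
# Illustration only: building computation out of two-state switches.
# One primitive "switch" operation, NAND, is functionally complete:
# every other logical operation can be composed from it.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor(a, b):
    return or_(and_(a, not_(b)), and_(not_(a), b))

def half_adder(a, b):
    """Add two one-bit numbers; returns (sum bit, carry bit)."""
    return xor(a, b), and_(a, b)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```

Chain enough of these elements together- millions of them, switching millions of times a second- and you have the physics equations, census statistics, and screen pictures the text describes.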

The faster those switches can operate, and the more switches you can put into a computer, the more things you can do with the computer. And smaller switches can operate at higher speeds than large ones. At the heart of the microprocessor revolution is the fact that the switches got smaller and faster. As the speed of the computer's fundamental elements increased, so did the sheer informational volume-the "memory," or the amount of coded on-and-off information that the computer could store. Over the four decades since the first electronic digital computer was invented, computing power increased more than a millionfold.

But the speed and number of switches are not the only aspects of computers that are affected by the size of the basic components. The heat generated by all that switching is another, and it is a major problem in computer design. Big components tend to get hot, especially when packed together in large numbers in an enclosed space, and that heat limits a computer's size and power. The first computers were built out of vacuum tubes, which were very hot and very big. No vacuum-tube computer could be built beyond a certain threshold of computing power; such a machine would melt before anything substantially useful could be done with it.

Not long after the first tube computers reached their heat limit, however, a discovery in the field of solid-state physics made considerably more efficient switching elements possible. In the late 1940s the transistor was invented, and that meant that computers could be constructed from elements that were much smaller and much cooler than the old tube technology. As a consequence, "smarter" computers could be built- computers that could follow more complex strings of instructions.

Besides their cool and rapid manner of operating, transistors had another advantage over the old computer elements- they were cheaper than tubes. Indeed, a paradoxical phenomenon has governed the evolution of computer technology: As computer components became smaller, cooler, and more powerful, they also became cheaper. They also happened to come along at the perfect time. The pressure to develop electronics and computer technology to their furthest limits, and substantial financial resources to support large-scale research and development efforts, were provided by two of the most powerful institutions in history- the Defense Department of the United States government and IBM. In the years that followed World War II, this fortuitous combination of fundamental scientific breakthroughs, breakneck engineering, and unprecedented economic benefits eventually made computer technology an integral part of all the levels of society.

Computers are valuable because they multiply the power of the human brain, just as levers multiply the power of the human arm. A lever, however, can only move a large object; a computer can do much more than that. It can command machines to move large objects, or it can perform mathematical calculations, or it can put words on a screen. It is an all-purpose tool that empowers anyone who can afford to use it. In fact, the sudden empowerment made possible by widely available computers has happened so fast that our society has barely begun to feel its impact. As computers have become cheaper, in terms of computations per dollar, the computer-using population has expanded dramatically.

In the 1950s, computers more powerful than those used previously by the Defense Department were being installed by large institutions like banks and corporations. By the 1970s, computers of even greater power were being used by small businesses. Now, in the 1980s, middle-class households can afford computers that only the national government could afford to build thirty years ago, and it is reasonable to expect that people thirty years from now will be able to afford computers that are as powerful as the mightiest supercomputers being used today.

While computers kept getting smaller, more powerful, and cheaper throughout the 1950s and 1960s, they were still too expensive to dedicate to the exclusive use of one person. By 1970, a computer still couldn't be called an affordable device for individuals, but its cost had fallen from millions of dollars to thousands. By the early and mid-1970s, another series of breakthroughs in miniaturization was underway. Researchers for electronics companies had already discovered ways to put thousands of components into ultra-miniature circuits known as "integrated circuits"- or, as they came to be known, "chips." These chips made all kinds of electronic devices possible- satellite communications, cheap color televisions, stereos, and radios, as well as personal computers.

In 1969, engineers at Intel Corporation designed a chip that had all the switching elements needed for a computer's central processing unit-the historic 4004 chip. In 1972, a somewhat more powerful version- the 8008-was developed by the same engineers. While the 4004 could handle information only in 4-bit chunks, the 8008 was a true 8-bit processor, and this boosted the device's potential applications from the realm of calculators to the world of true computers.

The 4004 and the 8008 were the first microprocessors- electronic devices capable of processing information- but they were not quite computers, which are information-processing machines that must possess specific capabilities. The 8008 had the basic information-processing capability and the built-in "language" of instructions that could enable it to become a computer, but other devices had to be connected to the chip in order for people to actually create and use programs. This wasn't a simple matter; you had to have a pretty advanced knowledge of electronics to assemble the different parts of the computer.

Still, a subtle but crucial shift in the course of events was triggered by these devices, although only a few people recognized their significance when they were created. In fact, neither the world at large nor the electronics world in particular heralded the arrival of the Intel 4004. Intel was just looking for a new kind of chip that the company could sell to all the other companies that make consumer devices out of microelectronic chips.

In any case, at this point in the story we are still talking about hardware expertise, but now we are beginning to talk about computer designers, not just electronic component manufacturers, for the microprocessor was the first electronic computer technology cheap enough to make it possible for ordinary people to afford relatively powerful computers (although, as we shall soon see, the first people to use these homebrew computers were far from ordinary).

The microprocessor has often been called "a computer on a chip," which is slightly misleading, since it isn't possible to use one of these chips as a real working computer without connecting it to additional electronic equipment. That is where MITS and the homebrewers came in. Ed Roberts, the owner of MITS, entered the annals of computer legend when he decided to build a kit for putting a microprocessor together with all the other necessary components. Little did he know that there was a vast, previously unknown market for these devices. Hundreds of young computer enthusiasts across the country were fiercely determined to get their hands on real working personal computers. The year was 1974.

Roberts hadn't started out to be a computer entrepreneur. He had originally wanted to be a doctor, and last I heard, about a year ago, he actually was in medical school in Florida. But he received electronics training in the Air Force, and in the late 1960s he started his own company and sold radio equipment to model airplane hobbyists- hence the name "Micro Instrumentation and Telemetry Systems."

Before the Altair kit came along, MITS faced some rocky times, especially when Roberts decided to get into the calculator business at precisely the wrong time to compete with the Texas Instruments juggernaut. But he moved on to microprocessors and shopped around for a better chip than the 8008. The problem with the 8008 was the way its instruction set hampered the efforts of programmers. He finally purchased a quantity of Intel's successor to that chip, the 8080, for $75 apiece. The price was right, and the 8080 instruction set was far more amenable to computer software design.

A man named Les Solomon, who was the technical editor of Popular Electronics magazine, heard about Roberts's devices and convinced MITS to provide the original working model for a cover article on the first affordable computer kit. In January 1975, the article appeared. The mail-order kits sold for $397, and Roberts was hoping for a few dozen orders so that he could keep the business going. The first day he checked his mail after the article appeared, he found more orders than he had hoped for in a year.

Nobody could have predicted how many people were eager to spend $400 on a computer kit. Roberts was swamped. Within a few weeks, MITS' bank balance went from nearly half a million dollars in the red to a quarter of a million in the black. But his small company couldn't ship the kits fast enough to satisfy some of Roberts's most fanatical early customers. Some of them actually went to Albuquerque, prepared to camp out on his doorstep until their Altair was ready!

The origin of the name Altair is also a microcomputer legend. According to Roberts and Solomon, they were speculating on the phone about possible names for the kit before the article appeared. Solomon asked his daughter, who was watching "Star Trek" at the time, what Roberts ought to call the device. She replied that in that evening's episode the starship Enterprise was heading for a star called Altair. So Roberts put the word Altair on the cover of the machine, in those hard-to-read "computer letters" that were considered "futuristic" in the 1970s.

Still, programming an Altair was an almost inconceivably tedious business at first. Nowadays, people program by using their keyboards and video screens to write the symbols for a program in what is known as a "high-level language" (for example, BASIC). After they write their program, they enter it into the computer, and then another program known as a compiler or interpreter translates the high-level program into the kind of language- the patterns of on-and-off impulses- that the machine understands. Back in 1975, an Altair owner had to create even the simplest program by laboriously turning switches on and off by hand.

The lack of a high-level language was a big handicap, but there were other problems with the Altair besides that. You can't run complicated software unless the hardware has a certain information-handling capacity. The memory of the earliest model was infinitesimal. The first Altair held 256 bytes of information- approximately 2000 on-or-off impulses. By contrast, most home computers today have 256 kilobytes of information- a memory that is a thousand times as large as the Altair's. There was no way to feed information to the processor other than by setting the switches, one at a time, by hand- and if you made a mistake you had to start from the beginning.

A volunteer army of garage tinkerers set out to solve these problems by creating software, memory expansion devices, and input-output devices. From all accounts, and from the many innovations that came forth from the homebrew reign, it was an open, enthusiastic, brilliant, intense, esoteric, fun, and exuberant effort- the finest days of the hacker tradition.

Although all the homebrewers of the mid-1970s started out as orthodox members of the Altair cult, they quickly developed their own patriarchs, their own legends, their own shrines. The homebrew mythology started shortly after the birth of the Altair and centered on northern California, rather than New Mexico. The Apple empire, the ill-fated but revolutionary Osborne Computer Corporation, and almost all of the earliest microcomputer-related companies trace their origins back to an anarchistic, ragtag group of computer zealots, the Homebrew Computer Club, who started meeting in the auditorium of Stanford University's Linear Accelerator building in the spring of 1975.

Lee Felsenstein, a veteran of the Free Speech Movement, a former reporter for The Berkeley Barb, a lifelong electronics freak, and the "anarche" who presided over the early Homebrew Computer Club meetings, later became famous, if not rich, by designing the Osborne I- the first "portable" personal computer. Steve Wozniak, barely out of high school, was another homebrewer whose attempts to outdo the Altair led to a company that grew from a garage to a billion-dollar operation in a few swift years. Dozens of other members of the Homebrew Computer Club started their own companies in the post-Altair, pre-Apple era-with varying degrees of success.

Important as they might have been to the personal computer revolution, the entrepreneurial success and engineering brilliance of the homebrewers is not as directly relevant to our story as is the history of two other Altair enthusiasts who launched the microcomputer software industry. These two homebrewers-turned-industrialists were teenagers at the time, as were many of the early Altair fans. They were far from inexperienced in either the computer world or the business arena, however, and their creations marked the transition of microcomputer programming from a freewheeling amateur affair to a full-fledged business enterprise.

Paul Allen and Bill Gates were their names, and when the Altair came along they were already professional programmers. In the 1960s, when they first met at Seattle's exclusive Lakeside School and began their long, profitable partnership, Paul Allen was fifteen and Bill Gates was thirteen. They rode bicycles to work years before they owned a company car. Despite their youth, their ability to find the flaws in adult-sized minicomputer programs got them their first job.

A company in Seattle had just received a new minicomputer from Digital Equipment Corporation (known as DEC). The Seattle firm, a company called the Computer Center Corporation (known to the young hackers as "C cubed"), made a deal with DEC: As long as C cubed could uncover bugs in the new computer's system software, it wouldn't have to start paying for its use of the computer. This was a mutually profitable arrangement, since both the inventor and the user of the computer had a practical need to track down and eliminate all the programming errors that could cause the system to "crash" and stop working. Gates and Allen were therefore employed by C cubed, which offered them an equivalent deal: These exceptionally bright kids would be allowed to play with the computer, free of charge, for as long as they could come up with new bugs. They did their job so well that they were soon earning real pay.

After several months, DEC, fearing that these hot young bug hunters might find flaws in the system indefinitely, backed down on its original arrangement with C cubed and demanded payment for use of the computer. Meanwhile, Gates had grown so adept at the black art of computer crashing that he had learned to defeat the security procedures of several well-known computer systems. Crashing systems was something of an accomplishment back then, since it proved that one could out-think the people who had designed the system security. The practice had not yet earned the notoriety it found fifteen years later, when the movie WarGames brought the stereotype of the mischievous hacker to public awareness.

By 1971, when Bill Gates had finished his sophomore year of high school and Paul Allen had graduated, their reputation as bug hunters had spread. They were soon hired by a major company, TRW, which was in need of troubleshooters who knew how to find software flaws in exactly the same kind of DEC system that Allen and Gates knew so well. TRW had been contracted to develop the complex and critically important computer system that would control the electrical power generated by the Bonneville dam on the Columbia River. The reliability standards for such a system were, understandably, extremely high. The young troubleshooters ended up making a significant contribution to the project.

Eager to capitalize on their expertise, Gates and Allen developed a computer program for analyzing the traffic-flow data collected by those rubber tubes that transportation departments stretch across highways. Under the company name Traf-O-Data, the young entrepreneurs tried to sell their service to various municipalities; these efforts failed to make them rich. By this time, Bill was in his last year of high school while Paul was studying at Washington State. They got together again during their summer vacation, when they secured summer jobs at another computer giant- Honeywell.

Meanwhile, the miniaturization revolution was proceeding at such a rapid pace that Allen and Gates both knew that affordable computers were going to arrive sooner or later. When they did, these two young entrepreneurs wanted to get in on the ground floor of what they suspected would be a major revolution within the computer industry. In fact, Allen tried to convince Gates that they should write a BASIC interpreter for Intel's 8008 microprocessor, but Gates felt that the chip's built-in language was too clumsy.

Their entrepreneurial speculations were based on their knowledge of the way microprocessor chips were set up to receive instructions from programmers. When a microprocessor chip is built, certain circuits are put into it to perform elementary information-processing operations. One such circuit would take two inputs and add them together, for example. Another such circuit could perform an elementary logical operation such as opening a circuit when either one of the two inputs was on. These wired-in elementary commands are known as the instruction set of that chip, and they constitute the "words" of any higher languages that communicate instructions to that kind of chip. Each chip has its own instruction set, but all instruction sets are written in the same code, known as machine language, which is based on an "alphabet" of on-and-off switches.

A BASIC interpreter for the Intel 8008, then, would be a program written in the language of the chip's hardwired instruction set. The purpose of such a program is to make life easier for programmers. In the machine's native language of ones and zeroes (the numerical equivalent of the on-or-off states of the switching elements) it would take literally dozens of machine instructions to perform a simple arithmetic operation like multiplying two times three. First, the numbers each have to be assigned specific positions in the processor's memory. Then the built-in multiplication procedure has to be directed to first multiply the contents of one memory location by the contents of another memory location and then to put the product of the operation in yet another specific memory location.

Programming anything of significant complexity with such a nit-picking and endlessly specific code is like writing a novel with alphabet blocks. The interpreter program would enable the BASIC programmer to write a command like "PRINT 2*3," then enter it into the computer along with the BASIC interpreter program, which would translate the command into the proper 8008 instructions, apply it to the data, and return the correct answer on the screen or printout.
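
The gap between "PRINT 2*3" and the dozens of underlying machine steps can be sketched as follows (a toy illustration in Python, not Gates and Allen's actual interpreter; the instruction names, memory layout, and helper functions are invented for this example). A one-line BASIC-style command is translated into a short program of made-up machine-style instructions, which a tiny "processor" loop then executes:

```python
# Toy illustration of what a BASIC interpreter hides from the programmer:
# a high-level command becomes several low-level, memory-location-specific
# instructions. The instruction set here is invented, not the 8080's.

def translate(command):
    """Translate 'PRINT <int>*<int>' into pseudo machine instructions."""
    assert command.startswith("PRINT ")
    a, b = (int(n) for n in command[len("PRINT "):].split("*"))
    return [
        ("STORE", 0, a),     # put the first operand in memory location 0
        ("STORE", 1, b),     # put the second operand in memory location 1
        ("MUL", 2, 0, 1),    # location 2 <- location 0 * location 1
        ("OUT", 2),          # output the contents of location 2
    ]

def execute(program):
    """A minimal 'processor': step through the instructions one by one."""
    memory = {}
    output = []
    for instr in program:
        op = instr[0]
        if op == "STORE":
            memory[instr[1]] = instr[2]
        elif op == "MUL":
            memory[instr[1]] = memory[instr[2]] * memory[instr[3]]
        elif op == "OUT":
            output.append(memory[instr[1]])
    return output

print(execute(translate("PRINT 2*3")))  # [6]
```

The programmer types one readable line; the interpreter worries about which memory locations hold which operands and where the product goes- exactly the bookkeeping the paragraph above describes.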

After Allen and Gates decided to pass up the 8008, Allen quit Washington State to continue working with Honeywell and moved east to Honeywell's Boston office. In the fall of 1974, Gates was at Harvard, just across the Charles River. The next time they contemplated the state of microprocessor-based software was the afternoon Allen saw the historic Altair issue of Popular Electronics at a Harvard Square newsstand. This new computer kit was based on the Intel 8080 microprocessor, an improved version of the 8008 chip they had originally rejected as the target for a commercial software effort. This was it- their opportunity to expand their entrepreneurial venture, begun with Traf-O-Data, into the software business.

Although they didn't even have an Altair, they called Ed Roberts and asked if he was interested in a BASIC interpreter. Roberts told them that several other programmers had already made the same proposal, and as far as he was concerned he would buy the first BASIC that would actually run on an Altair. Gates and Allen promised him delivery of their interpreter in three weeks. They then programmed a larger computer to simulate an Altair, and by using the simulation rather than the actual hardware, they created the BASIC interpreter they promised. It took twice as long as they had expected- not an unusual turn of events in the software business.

Six weeks after their conversation with Roberts, Allen finally delivered a paper tape that contained, in a code consisting of a pattern of holes punched into the paper, his and Gates's version of BASIC for the Altair. This was before disk drives were cheap enough for small computers, so punched paper tape (as ancient as that technology sounds) had to be used to feed the program to the computer. After a tape reader converted the pattern of holes into on-and-off impulses that automatically set the Altair's memory-location switches, the Altair was ready to receive BASIC commands. The remarkable thing about that tape is that it worked the first time- a virtually unheard-of event in the bug-prone world of software. It was an especially noteworthy feat, considering the fact that they didn't have an Altair to test it on!

Within a month, Roberts had hired Paul Allen away from Honeywell. Bill Gates stayed at Harvard while Paul Allen went to New Mexico to become MITS' software director. In the meantime, they had renamed Traf-O-Data Micro-Soft, later to be shortened to Microsoft. Then Gates took a leave of absence from Harvard (in fact he never returned) and moved to Albuquerque. He and Allen hired a programmer by the name of Monte Davidoff to help them enhance their BASIC interpreter. MITS was in ferment. The Altair freaks were already drifting in to check on their orders and become involved in the operation. Gates and Allen, perfectionists since their days as consultants for TRW, didn't exactly mesh with Roberts, who was strictly a seat-of-the-pants guy. But they were on to something hot.

Gates wrote the disk operating system program for the first Altair disk drives in a famous marathon session in February 1976. According to one of the legends that have since become a prerequisite for homebrew software immortality, he apparently sequestered himself in a motel room with a computer, pencils, and notebooks until the task was complete. The same story is told of Wozniak and the creation of Apple's first disk operating system. In any case, 1976 was also the year that Roberts's competitors began to appear, in the form of companies like IMSAI, Processor Technology, and Cromemco- all founded by members of the Homebrew Computer Club. Given the increasing competition, Allen and Gates's BASIC interpreter became a major selling point for the Altair.

But in the meantime, it had also become a source of conflict between MITS, Microsoft, and the homebrewers. A lot of people objected to the idea of selling software- especially for $500, which was the price of Microsoft BASIC. Some people refused to pay for it. Others made copies of the punched paper tape and distributed them for free. A $500 BASIC interpreter was simply a philosophical affront to the inner circles of the personal computer cult. The hacker tradition went back to the early 1960s, when similarly obsessed computer enthusiasts at MIT created the software for the kind of interactive computers that led to personal computers. And hacker tradition dictated that software was supposed to be free.

Gates didn't agree. In February 1976, at the same time he was creating the Altair disk operating system, he also wrote a now-infamous manifesto titled "Open Letter to Hobbyists," which was published in the Homebrew Computer Club newsletter. He pointed out that "as the majority of hobbyists must be aware, most of you steal your software. Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid?" A lot of the homebrewers didn't react kindly to this bluntly worded accusation by a nineteen-year-old programmer who was clearly interested in becoming a successful entrepreneur.

In retrospect, that rift between two factions of an obscure group of highly technical young amateurs appears to have been the birth pang of an infant industry. But very few people, other than maybe Paul Allen and Bill Gates, were thinking of industries or fantasizing about software empires back in 1976 and 1977. Most were content at the time to keep their hobby- and whatever differences of opinion they might have about that hobby-to themselves. Their arguments were still in the family, for they were all part of a single community. But in a very short time, these amateurs would find that they were no longer obscure.

From VisiCalc to Activision: The Beginning of the Software Industry

The year 1977 marked the beginning of an entirely new era in the computer revolution. In the spring of 1977, both the Apple II and the Commodore PET were born in the same place- the first West Coast Computer Faire. The Faire was an unexpectedly popular, totally enthusiastic convergence and celebration by thousands of disciples of the new doctrine of the personal computer. The man responsible for convening the Faire was one of the most mischievous and skillful social organizers of the personal computer culture, a somewhat older, definitely more radical associate of the homebrewers- Jim Warren.

Warren already had a reputation as a flamboyant social organizer by the time he got hooked on computing and decided that everybody else ought to get in on the thrill. Before that he had been chairman of the mathematics department at a Catholic women's college on the fringe of Silicon Valley. After five years, he was fired in the midst of a scandal. It seems that the college administration was not amused by the reports from Time, the BBC, and Playboy that had documented the wild parties that he was having at his home. In Warren's words, these parties "... were rather sedate by any common standards, except people didn't have to have clothes on."

Following his dismissal from the college, he looked around for new employment and happened upon a job at Stanford University that required programming skills. He was not particularly qualified for the position. But he learned quickly while on the job, and like many others, he found that he had a talent and an enthusiasm for the art and science of programming. He was soon involved in a crusade to "bring computing to the people." He edited an enthusiasts' magazine, one with a social conscience and sense of humor as well as pages of technical jargon. Its title, Dr. Dobb's Journal of Computer Calisthenics and Orthodontia, was a programmers' joke based on the magazine's subtitle, Running Light without Overbyte, which referred to the limited memory capacity of early microcomputers.

By 1976, computer hobbyists were beginning to attend national shows in New Jersey, Denver, and Detroit. Warren, who had attended several of these shows, thought that the organizers had not done justice to the spirit of the movement and that the shows lacked what he felt was an essential element: a party-like atmosphere. One thing they know how to do out in California is throw a party, and with Warren's own slightly scandalous credentials as a socializer, he became determined to throw his own bash for computer enthusiasts. He saw his show as a potentially joyous event, a celebration of the liberating advent of personal computing. He envisioned organizing something like the Renaissance Faires that at the time were immensely popular in the San Francisco Bay Area. He then formed a company, signed up exhibitors at $300 apiece, and, renting the San Francisco Civic Auditorium for the event, convened the first West Coast Computer Faire in April 1977.

The event was a milestone, remembered with reverence and glee by those who participated- a Woodstock for the personal computer generation. People who would be running multimillion-dollar operations a few years later were manning their booths and talking about their products. The Apple II made its debut. Companies were born. Introductions were made. Visionaries traded crazy predictions about how there would soon be millions instead of thousands of people in this personal computer community.

In the aftermath of the Faire, and by one of those twists of fate that are called "history" long after they happen, Microsoft grew overwhelmingly in its importance while MITS disappeared. Roberts's company, whose Altair had been worth $13 million in gross sales in 1976, was sold in May 1977 to a company called Pertec for what Roberts said was "essentially 6 million dollars." Gates and Allen decided that their BASIC interpreter would not be part of the acquisition- thus signaling the end of their relationship with Ed Roberts. Pertec management brought in some three-piece-suit types who immediately alienated the fanatically devoted MITS employees. The Altair was facing stiff competition in the market at just the time MITS' original crew was departing.

By late 1977, the first wave of home computers like the Altair, the Sol, and the IMSAI were sharing the marketplace with Commodore's PET and a model put together by a couple of homebrewers, both named Steve, who decided to give their elegantly garage-engineered product the unlikely name of Apple. The hardware hobbyists were building an industry right in the backyard of the mainframe giants. And the software entrepreneurs were coming up with programs for those new, small, affordable computers-programs that would quickly change the way offices operated, businesses were run, and teenagers entertained themselves.

Microsoft in particular continued to turn out different versions of BASIC as the different brands of personal computer grew in number, diversity, and power. Allen and Gates eventually moved to Bellevue, an affluent, woodsy suburb of Seattle, to expand a company that would in time become a major force in the software business.

From 1975 through 1977, personal computers were mostly for hobbyists who were willing to learn how to program in BASIC or even get down to circuit boards and soldering irons to put their equipment together. Homebrew computers were exciting only to that tiny minority who loved to work with their hands as well as their heads. The far vaster audience of people, however, didn't care how computers were put together. They became owners primarily because they were drawn to what computers could do for them. And by the late 1970s, personal computers could do more than ever before, thanks mainly to the development of video displays and of better software. It was, in other words, the creation of spreadsheets, word processors, and adventure games- programs- in conjunction with improved hardware technology that turned computers into consumer items.

In the late 1970s, prior to my own entry into the software industry, three different programs virtually created the mass market for personal computers. First, there was VisiCalc, which convinced businesspeople and everyone who worked with numbers that the potential pragmatic gains of being able to experiment with financial projections and to prepare business presentations made buying an Apple a worthwhile investment. Then WordStar showed typists, writers, and anyone who worked with words that a small computer could enable them to accomplish tasks that were either tedious or impossible to do with an old-fashioned typewriter. Finally, Adventureland and other "fantasy simulation" games showed people of all ages that the small computer could be used as a fantastic toy and a means of amplifying the power of imagination. These games also advanced the notion that computers could be used in the home.

In many ways, VisiCalc created the Apple empire and gave it both a toehold in the small-business market for computers and a fanatic following in the hobbyist world. This program, the first electronic spreadsheet for microcomputers, was written in 1978 and 1979 by Bob Frankston and Dan Bricklin, who years before had been roommates at MIT. It originated when Bricklin, as a graduate student at Harvard Business School, was given a particularly onerous homework assignment- to prepare spreadsheet analyses of various hypothetical companies. The professor gave the students a mass of financial data and lists of assumptions, and the students had to make intricately cross-referenced calculations in order to project those companies' fortunes at specified times in the future.

Bricklin, a former professional programmer, suspected that the tedium of preparing these projections would be greatly relieved if the required calculations and cross-referencing were turned over to a microcomputer. He then took his idea to Frankston, an even more accomplished programmer. They worked on the program together, and the result, in October 1978, was the first prototype of VisiCalc. Their enormously successful software development company, Software Arts, was born.

As it turns out, one of Bricklin's classmates at the Business School had his own small software publishing firm. His name was Dan Fylstra, and he had recently started Personal Software as a means of selling the Micro Chess program written by his partner, Peter Jennings. When Fylstra saw Bricklin and Frankston's creation, he invited them to a business meeting at a Chinese restaurant in Cambridge. There the cofounders of Software Arts agreed to produce an Apple-compatible version of VisiCalc that would be marketed by Fylstra and Jennings's firm.

Several years later, the two companies found themselves in a bitter conflict, the substance of which will be discussed later. But in the meantime, the early 1980s were exceptionally lucrative years for all parties involved. Apple, although initially skeptical about the usefulness of VisiCalc, ended up selling millions of computers to businesspeople who wanted to perform spreadsheet analysis and were interested in a tool that could help with their hardest task- predicting the future. Frankston and Bricklin made close to $10 million as a result of that agreement in a Chinese restaurant. And microcomputer software as a whole was becoming a far bigger business every week, largely as a result of their program.

At about the same time that the spreadsheet entrepreneurs were programming in Massachusetts, another enormously influential and lucrative program was created in California. Seymour Rubinstein, one of the earliest hardware entrepreneurs, had recently left the ill-fated IMSAI corporation (which folded in 1979, a victim of its marketing strategy, its erratic hardware, and competition from more advanced machines) to start his own company, MicroPro. Sometime after he published a filing program and a primitive word-processing program, he received several letters from people asking for a more powerful text editing program, like Michael Shrayer's Electric Pencil, which was the first commercial microcomputer word processing program.

Rubinstein, like any smart entrepreneur, responded to the needs of the market. He commissioned a top-notch programmer by the name of Rob Barnaby to create a word-processing program called WordStar. The program was released in mid-1979. It became an instant hit and was near the top of every best-seller chart until early 1985.

By 1983, MicroPro's revenues had exceeded $50 million, making it, by some people's estimates, the largest microcomputer software company in the world. But the question of whose estimates one wants to believe is a notorious trigger for heated debate among software people. Trying to arrange companies in rank order is a tricky exercise, especially in such a volatile industry. When this book was written MicroPro and Microsoft were running neck-and-neck for the number one spot, but MicroPro was facing internal problems and Microsoft was still coming on strong. By the time this book is published, the top slot on the charts may well have changed.

But regardless of this ongoing conflict over estimates, there is one point on which there has never been any dispute between the market gurus and prognosticators: In everybody's opinion, WordStar was an all-time runaway money-maker. The success of this program demonstrated that a single piece of intellectual property, created by a single individual, could bring in revenues of hundreds of millions of dollars in just a few years.

Word processing was a revolution unto itself. Indeed, because of word-processing software, a major change in society took place in a very short period of time. In 1979, only the largest business offices had computers- and those were always enclosed in special rooms, somewhere off in the data processing department, tended and administered by computer experts. But within a few years, many offices had more computers than typewriters, and you would be hard-pressed to find many businesses of reasonable size that didn't have at least one word processor. In the years following WordStar's introduction, that venerable institution, the typing pool, began to disappear; and in general, the way people in offices and universities deal with text was irrevocably altered.

By the early 1980s, the business world was adopting word processing on a large scale. By the mid-1980s, with the advent of low-cost word-processing software like Broderbund's Bank Street Writer, word processing has begun to saturate the home market. And by the late 1980s, the typewriter may be on its way to the museum along with other obsolete curiosities like slide rules.

The third category of microcomputer software that took off in the late 1970s was geared for entertainment and was not business-oriented at all. The arrival of the first simulation games provided unexpected and, to many, far more exciting reasons for using a computer. In computer terms, a simulation is a program that presents a realistic model of a process, an object, a series of events, a real world, or a hypothetical universe. Mathematical representations underlie all simulations, which is why computers are necessary to keep track of even a simple simulated universe. The model itself is presented on a screen or paper printout in the form of words and numbers or even in graphically displayed patterns.
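The definition above can be made concrete with a minimal simulation: a mathematical model stepped forward in time, with its state reported as text. The falling-ball model below, along with all of its numbers, is an invented example rather than anything from the book.

```python
# A minimal simulation: a mathematical model (a dropped ball under
# gravity) stepped forward in discrete time, its state shown as text.

def simulate_fall(height=20.0, dt=0.5, g=9.8):
    """Step a dropped ball forward in time and record its height."""
    velocity = 0.0
    t = 0.0
    states = []
    while height > 0:
        states.append((round(t, 1), round(height, 1)))
        velocity += g * dt        # gravity accelerates the ball
        height -= velocity * dt   # position updated from velocity
        t += dt
    return states

for t, h in simulate_fall():
    print(f"t={t}s  height={h}m")
```

Even this trivial universe requires the computer to track several interdependent quantities at every step, which is the book's point about why simulations and computers go together.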

Computer simulation techniques were originally developed for serious purposes like designing airplanes. That they could also be used as the basis for a particularly addicting variety of recreation- the fantasy simulation- was a fact initially recognized by a gleeful and talented bunch of young men who ran amok on MIT's computers more than twenty years ago. In fact, most software people agree that the term hacker originated with those early programmers, who seem to have discovered everything from the "serious" computerized gaming of chess programs to the purely recreational field of computer graphics games.

But these programmers, many of them MIT dropouts, had nevertheless been hired by MIT's computer development laboratories because of their prodigious programming ability. Not only was the software for the first interactive computer systems their creation, but artificial intelligence research, the construction of time-sharing software systems, and other landmarks in software development and computer science all came out of the hackers' headquarters in Cambridge's Technology Square. They were renowned for their accomplishments in advancing computer research and notorious for their irreverent attitudes and often unorthodox lifestyles.

And of course, while doing their more purposive work, they also delighted in finding ways to use computers to play games of various kinds. In 1961, for example, they got their hands on the first minicomputer, the PDP-1, which had a large, round video display screen, and they started playing around with different programs that would demonstrate the machine's graphic capabilities. One hacker, Stephen Russell, who was known to his cohorts as "Slug," came up with the idea of simulating spaceships on the screen. He had been reading science fiction novels about space battles, so he included a means of controlling the spaceship's flight patterns and a means of shooting down other ships. Meanwhile, another hacker by the name of Peter Samson created a realistic depiction of a star field as a background to the spaceships. The game Spacewar was born. Years later, Spacewar, as well as some other games that were originally designed by hackers at MIT and elsewhere, would form the basis of billion-dollar industries.

Although MIT was clearly the mother-temple of hacking, the Stanford Artificial Intelligence Laboratory (known as SAIL) became a West Coast shrine of the art. By the 1970s, a kind of party line for computers called the ARPAnet (developed by researchers for the Defense Department's Advanced Research Projects Agency) linked SAIL computers to those at MIT and hundreds of other institutions. This multi-computer long-distance network made it possible for programmers at various institutions to send messages and even programs to one another.

Games, of course, were one of the largest categories of community-shared programs on the ARPAnet. One day, a SAIL hacker and Stanford student by the name of Don Woods found a particularly intriguing game on the network- a non-graphic simulation of an adventure through a cavern-world that was replete with treasure, trolls, dragons, maidens, and flying horses. Woods contacted the game's creator, Will Crowther, because he wanted to add some refinements to the program. Crowther, a computer scientist in Palo Alto, had concocted the game as a means of amusing his children, and having once been a spelunker, he decided to create a fantasy in the form of an exploration of linked caves. He was working for a company that was connected to the ARPAnet, so he had simply posted the game on the network for other hackers to enjoy. Woods, however, came to the game with a sophisticated perspective, since he had been involved in role-playing board games like "Dungeons and Dragons." When Woods's refinements were added to Crowther's original version, the infamous game called Adventure was born.

I say that Adventure was infamous because research leaders and computer system managers very quickly discovered that many programmers were spending hours and days- even weeks and months!- on the game. It must be pointed out that as there were no graphics on the original Adventure, the game offered a different kind of thrill from the sensory addiction of the video games that would surface a few years later. Besides the fantasy aspect, it, and games like it, provided an intellectual challenge that caused otherwise rational programmers to spend days working their way through it.

The game is a puzzle in the form of a trip through a series of caves. Each cave poses a problem that must be solved before the player can progress to the next cave. As the player solves each puzzle and moves through the various caves, a narrative unfolds, describing the fantasy adventure. The player makes moves by typing in commands like "KILL TROLL" or "TAKE TREASURE," and sometimes the player must backtrack through several previous caves to pick up an object- such as a sword, a bag of food, or a jug of water- that is required to solve a problem in a later cave.
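The verb-noun command style described above can be sketched as a toy parser. The rooms, objects, and responses below are invented for illustration; the real game's vocabulary, world, and puzzles were far richer.

```python
# A toy two-word adventure parser: each command is a verb and a noun,
# and the game tracks the player's room and inventory between moves.

world = {
    "cave": {"objects": ["sword"], "exits": {"north": "troll bridge"}},
    "troll bridge": {"objects": [], "exits": {"south": "cave"}},
}

def play(commands, room="cave", inventory=None):
    inventory = inventory or []
    log = []
    for command in commands:
        verb, noun = command.lower().split(maxsplit=1)
        here = world[room]
        if verb == "take" and noun in here["objects"]:
            here["objects"].remove(noun)
            inventory.append(noun)
            log.append(f"You take the {noun}.")
        elif verb == "go" and noun in here["exits"]:
            room = here["exits"][noun]
            log.append(f"You are at the {room}.")
        elif verb == "kill" and noun == "troll":
            log.append("You slay the troll!" if "sword" in inventory
                       else "The troll blocks your way.")
        else:
            log.append("I don't understand.")
    return log

transcript = play(["TAKE SWORD", "GO NORTH", "KILL TROLL"])
for line in transcript:
    print(line)
```

As in the original, solving the later puzzle (the troll) depends on an object picked up earlier (the sword), which is what forced players to backtrack.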

Adventure spread rapidly through the ARPAnet community and was played on the big mainframes and minicomputers that were available to the computer research labs where the hard-core hackers congregated. Such relatively sophisticated games, however, were inaccessible to the Altair hobbyists and homebrewers who were dealing with computer memory sizes far too small to contain the programming code for a fantasy simulation. Even when the larger-capacity machines like the TRS-80 and Apple appeared, it was widely believed that Adventure ate up too much memory to be programmed in one of the microcomputer languages used by those computers.

But software history is largely the story of people who accomplished tasks that had previously been considered impossible. In 1978, a young man named Scott Adams decided to take up the challenge of writing a simulation game for the TRS-80 microcomputer. He succeeded in solving this "insoluble" problem in the remarkable time of two weeks. His first game was Adventureland, which was based on Crowther and Woods' Adventure and was totally nongraphic and text-oriented, like the original. It was an immediate success, and TRS-80 owners started buying it like crazy. Scott and his wife, Alexis, started a company, Adventure International, which grew into its own strange kind of game empire, and which seeded the software industry with Scott Adams protégés (Doug Carlston among them). In time, even Microsoft brought out a version of the original ARPAnet Adventure.

Just as VisiCalc introduced businesspeople to the benefits of computers, and WordStar showed wordslingers how computer programs could dramatically increase their productivity, Adventure-type games demonstrated to tens of thousands of non-businesspeople how much fun computers could be. And the idea, new at that time, that ordinary men, women, and children would find things to do with computers in their homes was buttressed by the success of Adventureland.

But the home computer market had not yet blossomed, for there was a definite limit to the number of people who liked to solve puzzles by typing in primitive sentences like "PICK UP SWORD." The color television generation is very sophisticated when it comes to the matter of visual media, and the monotonous look of green alphabetic characters on a screen was far less attractive than the kind of fast-moving, high-resolution, brightly colored displays that most people were accustomed to watching.

Affordable personal computers cannot be built without microprocessors, but microprocessor-based computers would never have caught on if, in addition to having good software, they weren't also connected to television-like display screens. If computer output were still restricted to numbers and alphabetic characters printed on sheets of paper, as had been the case for decades, computers would still be confined to laboratories and data processing centers. Moreover, if it weren't for computer graphics technology, the home computer market would never have opened up as widely as it did and would instead have remained largely a community of hobbyists.

The reason for the importance of the video display is very simple: The visual sense is the way human beings assimilate information most efficiently. People can simply understand information far better if it is presented in a visual form. And games in particular are far more exciting if they have a visual component, especially if color and motion also are involved.

A computer processes information and presents it to humans in an understandable form. Census statistics, financial data, numbers, or text is entered via keyboards, punched cards, magnetic tape, or other input devices. The central processor of the computer compiles the statistics, sorts the data, performs calculations, and processes the text. All of this is a marvelous improvement over adding machines, ledger books, and typewriters, but it wouldn't have any meaning unless the processed information could be presented to people in an easily perceived form.

The idea that computer output could be presented in the form of images on a video screen seems natural today, but it was considered a radical innovation when it first came about in the late 1950s and early 1960s. Computer scientists might still be reading and interpreting esoteric printouts if the United States Air Force had not developed a computer that graphically displayed information about the state of the country's air-defense readiness. And the use of computer graphics might still be confined to the military if it weren't for those irrepressible hackers at MIT who created the first computer-based video games, like Spacewar, in the early 1960s. But Spacewar itself would not have been possible without the revolutionary breakthrough provided by a program called Sketchpad, written in 1961 by yet another brilliant MIT student, Ivan Sutherland. Sketchpad allowed computer users to produce and manipulate graphic patterns on video screens and to store the results in computers.

By 1976, the homebrewers began to experiment with various graphics devices for the Altair. They were able to produce kaleidoscopic effects and to devise very simple games, but the computers themselves were nowhere near powerful enough to recreate a minicomputer program like Spacewar. Ironically, then, the next step in the evolution of the personal computer industry did not come from the general-purpose computer technology of the Altair or its immediate successors, but from far less versatile machines.

In order for an electronic information-processing device to be called a computer, it has to be programmable. But by the mid-1970s, it was possible to build machines that, using "dedicated," nonprogrammable microchips, could create sophisticated graphic effects. These machines were called "video games," and although they didn't have much to do with microcomputers, they hastened the day when the personal computer cult would break through to infect the general population.

Video games resulted from the entrepreneurial combination of two different ideas that had been around for a long time. Since 1961 and Spacewar, it was obvious to those few people who knew about interactive computing and display technology that sophisticated games could be created by using video graphics and microelectronic controllers. Meanwhile, arcade games in the form of pinball machines had been a steady business for decades. It took another Silicon Valley entrepreneur by the name of Nolan Bushnell, along with several other partners, to decide to put these two factors together.

Forming a company that he initially named Syzygy, Bushnell tried to market the game machine Computer Space, a modified version of Spacewar, to bars and pizza parlors as an electronic substitute for pinball. The idea didn't catch on- only 2,000 machines were sold. He then tried something simpler. In 1972, he unveiled Pong and marketed it to the same places. It was an absurdly simple game by Spacewar standards; using joysticks, the players moved light-paddles to bounce a ball of light back and forth across the screen.

Pong was a hit, to put it mildly. After his first test machines were installed, Bushnell received complaints from the proprietors of these establishments that his machines had broken down. When he followed up on the complaints, he discovered that the machines were malfunctioning only because their coin boxes had been jammed full before anybody had come back to empty them! In fact, when Bushnell approached a number of pinball-machine manufacturers in Chicago with the proposal that they might have a financial interest in his game, most turned him down. Bally Corporation, a giant pinball-machine manufacturer, was interested at first but then backed away from an agreement with him, largely because the corporation couldn't believe the financial statistics he was presenting on the basis of his experience with his test markets.

Indeed, it was Nolan Bushnell and his crude video games, launched a year before the Altair hobbyists and homebrewers began experimenting with personal computers, that triggered the huge influx of capital which would ultimately transform the hobbyist community into the sizeable home computer industry. In time, the success of the arcade games spawned home versions. The idea that small, inexpensive versions of arcade games could be connected to home television sets ultimately turned out to be just the beginning of the evolution of increasingly sophisticated microprocessor-based devices for home entertainment.

People started pouring quarters into Bushnell's arcade machines- to the tune of millions of dollars yearly. Bushnell followed up his first hit with another successful but slightly more advanced game, Tank Command, in which simulated tanks stalked through a video maze and hurled blobs of light at one another. Syzygy was then renamed Atari, after a move in the ancient Japanese game Go, and Bushnell hired creative young programmers and electronics wizards to think up new games. A young man by the name of Steve Jobs was one of Atari's early employees.

This new company combined electronic expertise that had previously been associated with defense contractors and consumer electronics manufacturers with the playfulness and love of games that characterized the hacker era. Two years after the company was born, it was doing $39 million worth of business. In 1976, Bushnell sold his increasingly profitable enterprise to Warner Communications for $26 million. For about six years, Warner's buyout of Bushnell was one of the smartest investments in history: By 1982, Atari's annual sales had swollen to more than $2 billion.

Atari in its Warner incarnation was to play an important role in the rise and fall of many other home computer companies. But in the 1970s, before the first home computer companies got off the ground, it was the sophisticated arcade games of Japanese manufacturers that turned the video game industry from a multimillion-dollar infant into a multibillion-dollar giant, unleashing an unexpected social phenomenon in the process. The key to the video game craze was in the "dedicated" hardware that enabled manufacturers to develop visual displays that were far better than those possible on the first programmable microcomputer systems.

Those games made possible an entirely new kind of sensory experience. In a real sense, they were selling an experience- a means of ventilating primitive fight-or-flight reflexes through the various "zap the aliens" scenarios, through a visual and auditory encounter of hypnotic intensity, and through cognitive and perceptual gymnastics that offered immediate and quantitative rewards. The flashy high-resolution graphics ("high res," as it is known in the industry), the direct interaction via joysticks or buttons or trackballs, the electronic sound effects, and the way the games were designed to keep proficiency, difficulty, and level of reward in a delicate balance all seemed to precipitate an unnatural hunger in young people in every nation of the world.

The staid old pinball manufacturers who had turned away from Bushnell's video game simply had not foreseen the arrival and the nature of this vast, new, very different market. Nor could they have dared to predict how willing the game's customers would be to pay for their experiences. The rate of return on pinball machines had been holding steady for decades. Depending on the popularity of the machine, the weekly return per unit was measured, at best, in hundreds of dollars. The very first video games, way back in the Pong and Tank days, took in thousands weekly. But whereas the pinball manufacturers in Chicago might not have noticed the existence of this torrentially lucrative new market, the game manufacturers in Tokyo certainly did.

Japanese companies had been manufacturing coin-operated games since the late 1960s, and in the late 1970s they began distributing sophisticated video game machines to the Japanese market. The games were such a huge success that there was for a time a critical shortage of circulating coins in Japan. The devices were virtually sucking the money out of circulation. As a result of this almost shocking success, Japanese manufacturers of coin-operated games began to consider distributing their products to international markets. The breakthrough of Japanese arcade games into the international market came in 1978, with Space Invaders, a fast-moving, colorful video game in which the player fires brightly colored "lasers" at swiftly descending "alien invaders" and tries to accumulate as many points as possible before the invaders destroy the laser bases. As the player gets a higher score, the invaders move more quickly and the electronic background music becomes more urgent. Games of this kind, known as "shoot-'em-ups," quickly gained revenues of billions of dollars per year- all of it in coins. The video arcade games had become an enormous industry.

But in the meantime, in 1977, the Warner version of Atari was putting itself in a position to mine an even larger video game bonanza by building cheap game machines that could be plugged into home television sets; these devices, also "dedicated" rather than programmable, enabled people to play the enormously popular arcade games like Pong and Tank, and eventually Space Invaders, in their homes. The first versions of these devices sold extraordinarily well, but they were soon abandoned by their owners because they were limited in their capabilities and offered only one or two games.

Then, in the same year, Atari introduced the VCS 2600, which offered replaceable cartridges called ROMs. By plugging in a new ROM cartridge, it became possible to play the latest game on the old machine, instead of discarding the entire machine for the latest model. This is known in the industry as "giving away the razors and selling the blades," except in Atari's case it was selling both the razors and the blades. Although they were pieces of hardware, the ROMs were actually a form of software; inside each cartridge was the program that enabled the machine to play a new game. Their eventual success with consumers marked the beginning of a broadly based home video game industry and the point when software became a significant consumer commodity.

The home video game industry soon rivaled even the intensely profitable arcade game industry; Atari's revenues rose to more than a billion dollars per year. It was the most phenomenal rate of growth of any new industry in history, and an experience far beyond any CEO's wildest dreams. And Atari's CEO at the time was Raymond Kassar, who had recently been hired by the notoriously charismatic Warner chairman, Steve Ross. Sophisticated, sybaritic, highly cultured, Kassar had spent his entire career at Burlington Mills, a textile company, before coming to Atari. He wasn't a Silicon Valley entrepreneur and didn't care much for engineers or programmers, but for a while he happened to be in exactly the right place at exactly the right time.

After the Japanese introduced Space Invaders, they brought out Galaxians, which had even more variations on the "shoot the invaders" theme, and then Pac-Man, which created the "munch-'em-up" genre. Incredibly, people spent more than a billion dollars' worth of quarters a year simply to play this one arcade game. Then Atari licensed from the Japanese the rights to create the VCS (home) version of Pac-Man, and a programmer was hired to do a quick-and-dirty version for a reported $1 million. Ironically, the game programmer who created the original Japanese version of Pac-Man didn't make a penny in royalties.

By 1982, Atari had more than 80 percent of the huge and still growing home game hardware and software market. The VCS might not have been a computer, but those cartridges certainly were software, and a programmer was needed to create the code that was embodied in the cartridges. That meant that there were juicy royalties to be made. More than that, some programmers began to feel that they ought to get some personal recognition for their accomplishments.

While Kassar was buying apartments in the most expensive co-ops in the world and flying around in a luxury jet, a few of the programmers who were creating Atari's wealth decided that it was time for them to get their names printed on the cartridge's package. Kassar tried to ignore them, but they were persistent. In May 1979, four of Atari's best game designers, dressed in jeans, finally met with the elegantly tailored Kassar in his office. In an interview in an issue of InfoWorld (Vol. 5, No. 48), one of the programmers, Larry Kaplan, summed up the CEO's reaction: "He called us towel designers. He said, 'I've dealt with your kind before. You're a dime a dozen. You're not unique. Anybody can do a cartridge.'"

The following October, three of the four game designers- Dave Crane, Al Miller, and Bob Whitehead- left Atari to form Activision. Kaplan joined them a short time later. According to Kaplan, "Activision was started to prove that Kassar was wrong." But since an awesome collection of talent and a burning desire to succeed are still not sufficient to build a world-class company, the programmers soon got together with Jim Levy, a CEO whose previous experience had included involvement with a music and software company as well as with Time, Inc., and Hershey Foods. Eventually a venture capital partner- Sutter Hill, one of the hottest in Silicon Valley- joined Activision's effort.

As the first company in the video game business to concentrate exclusively on software, Activision was breaching the wall between the still financially puny but ever more sophisticated home computer software market and the relatively one-dimensional but monstrously profitable video game software market.

According to Levy, Activision's original business plan was based on three fundamental ideas. The first could be called the "appliance hypothesis"- a prediction that during the 1980s the computer would become as important in American homes as the television, radio, stereo, and automobile. "The second fundamental idea," Levy said in a June 1984 interview in Atari's Antic magazine, "was that software was going to drive the market." And the third idea was based on the premise that a software company requires an approach and focus different from that which a hardware company requires. This corporation, after all, was going to prove that programmers weren't towel designers.

Activision was an instant hit, and when the company went public, it made the founders Silicon Valley millionaires. By 1982, after only two years in the business, they topped $150 million in sales. Activision's best-selling product, a game called Pitfall, sold more than three and a half million units worldwide. In subsequent years, the company marketed four more games that sold more than a million units each. The Activision founders far surpassed their original goal of proving something to Atari: In effect, they created the concept of the software star. The company not only put the programmers' names on the game packages, it also put their pictures and bios in promotional materials and sent the programmers themselves on speaking tours.

Activision wasn't Kassar's- and Atari's- only problem. In 1981, when another group of game designers left Atari to form Imagic, Kassar called them "high-strung prima donnas," and afterwards many of the programmers who remained at Atari soon sported T-shirts with the words: "I'm another high-strung prima donna from Atari." But with annual sales figures approaching the billion-dollar mark, Atari was in no mood to listen to doomsayers who decried the erosion of its programming talent.

Atari's VCS was soon followed by other systems, including Odyssey, Intellivision, and Colecovision. Other game companies proliferated, and some of the country's largest companies became involved- notably Quaker Oats, CBS, and Mattel. By the time the video game business reached its peak, in the early 1980s, another up-and-coming industry was making inroads- the home computer market. Both the hardware and software components of personal computers had grown more powerful and less expensive since the days of the Altair. When microcomputers first became powerful enough to accommodate sophisticated game programs, complete with graphics, Broderbund and I entered the story.
