Programmers Who Defined the Tech Industry: Where Are They Now?
Some early programmer names are familiar to even the most novice of software developers. You may never have seen a line of code written by Bill Gates, or written any application in BASIC (much less for the Altair). But you know Gates' name, and the names of a few others.
That's a darned shame, because the early microcomputer era (we didn't uniformly call them "personal computers" yet) had many brilliant software developers. Some of them went on to greater fame and fortune; others disappeared into the mists of history.
In 1986, Susan Lammers did a series of interviews with 19 prominent programmers in a Microsoft Press book, Programmers at Work. These interviews -- many of which the author transcribed on her own website a few years ago -- give a unique view into the shared perceptions of accomplished programmers... the people who invented the tools you use today.
Lammers' book is still enjoyable on its own merits, simply because it's a bunch of very smart developers talking about their craft at a time when there were relatively few wide shoulders to stand upon. In re-reading my copy, I was reminded of how raw the technology industry was, how many competing standards we had to wade through (it had been years since I saw a list of the late-1980s databases: R:Base, Paradox, Javelin, ANZA) -- and yet how soon we expected to achieve Artificial Intelligence.
I chose five of the book's programmers for this time machine, leaving the fates of the remainder primarily as an exercise for the intrepid reader.
Here you'll learn the then-current opinions from Dan Bricklin (VisiCalc), Jonathan Sachs (Lotus 1-2-3), Robert Carr (Ashton-Tate Framework), Bill Gates (BASIC on microcomputers), and Charles Simonyi (Multiplan).
The Spreadsheet Wizards
Dan Bricklin: VisiCalc
Then: The VisiCalc spreadsheet was the original "killer app," by which we used to mean: "This application is the reason you have to buy a computer." I knew a mechanical engineer who snuck his Apple IIe into the office so he could run VisiCalc on business problems, when the waiting list for the Control Data mainframe was months long. VisiCalc, created by Dan Bricklin and Bob Frankston, sold over half a million copies by 1983; the company that marketed it, Software Arts, imploded due to legal battles.
On innovation: "I went to see my finance professor [with the VisiCalc prototype], who was discouraging. He looked up from his printouts and he said: 'Well, there already are financial forecasting systems and people won't buy micros, for example, to do real estate.' . . . Of course, now Harvard requires you to buy a PC before you can go to their business school."
On the evolution of programming: "People are writing their own programs. Anybody who uses a spreadsheet is writing their own programs; it's just that the language is different now. . . . We're just making the users do more and more of the programming themselves, but they don't know it. Using different style sheets with Microsoft Word is doing programming; using spreadsheets is doing programming."
On the business of software development: "There is an inherent cottage industry component to programming. . . . Any large company could have a million programmers working on some idea to produce a better one, but they don't because it doesn't work that way. The economics aren't there. Sometimes the idea behind a program is one small creative effort. . . . In terms of marketing, you do need a large marketing organization. But there's a variety of ways to do that, just like with [publishing] books. . . . If you look at the biggest sellers in the software industry, in general they were written by very few people."
On the future of computing: "All sorts of products are developing now. People are saying, 'What about networking, why can't we connect this stuff together?' What about the publication systems, like the inexpensive ones available on the Mac? And the really great ones that are available on the bigger machines, like Apollo and Sun? A few years ago, no one thought that in-house publishing was going to be a major use of computers."
On the future of computer hardware: "The personal computer of the future should be more like a notebook. I carry my notebook around and why shouldn't it be a computer? Well, that's different than the PC as we know it. Computer technology is going to be used for all sorts of new areas like that. . . . One way to have a computer follow you around is to miniaturize it. Why go to all the trouble to put legs on it if we can miniaturize it to the point that we can carry it on our bodies. We're getting to the point soon where we can get a lot of computing power in a very small space."
Today: Bricklin runs a tiny company called Software Garden, whose products include Note Taker for the iPad and a video, A Developer's Introduction to Copyright and Open Source. He also does consulting and speaking.
Bricklin spoke with me about software methodologies, programming career choices, and other issues.
Lammers asked many of the programmers, "What kind of training or type of thought leads to the greatest productivity in the computer field?" but for some reason didn't ask this question of Bricklin. I remedied the oversight.
In many ways, he says, programmers' needs were different then: you needed experience in a variety of areas, which may not be as useful today. It certainly is good to have a varied background; as Bricklin explains, "For me it was very helpful to have a background in many different languages," since he could choose the appropriate language for the current application rather than "the one I know." Having a range, he says, keeps you from getting stuck in one system; and when new things come about, you aren't as lost.
But one thing is and was necessary: experience shipping a product. You should know, he says, "what it is to actually complete something and get it out the door. That's a real important thing to learn." Nothing beats the experience of shipping software, to take something from start to finish. You get feedback from users, and find out what you did right and wrong. It's even better, he says, to do this with other people, from whom you can learn "what it means to get all the pieces together of the project complete enough for you to get it out the door."
Bricklin programs much the same way he did in the 1980s. "One thing I've always done, for many years -- I know Bob Frankston did, too -- you have to figure out a path through the whole thing and implement that first," he says. And then you go back to add more. "Getting the path through is like building a scaffold. A lot more is put on [the application] before it's a real product," but you have the critical path in place. "It's always building off something that's working."
What about his premise, in 1986, that software development was inherently a "cottage industry"? That's still true, Bricklin says. "There still are small companies. And there are still small companies with leading products in several areas." The best chances for this may be the "App Store" marketplace and open source. Bricklin has some successful open source projects, and he is now in the app world. "Some of what I'm doing is like what I was doing 20 years ago," he adds.
Some products are more complex, and as a result, "The big companies do things that can only be built by a big company with many hands," he says. "That's always been the case and it looks like it is going to continue."
But individuals can still do things that are worthwhile and that they can make money on. "If we forget that, we're going to lose a lot of creativity," says Bricklin.
Jonathan Sachs: Lotus 1-2-3
Then: In 1981, Jonathan Sachs teamed up with Mitch Kapor to develop and promote Lotus 1-2-3, the software that brought the IBM PC into so many corporate offices.
On programming methodology: "The methodology we used to develop 1-2-3 had a lot to do with the success of the product. For instance, 1-2-3 began with a working program, and it continued to be a working program throughout its development. . . . This was the exact opposite of the standard method for developing a big program, where you spend a lot of time and work up a functional spec, do a modular decomposition, give each piece to a bunch of people, and integrate the pieces when they're all done. The problem with that method is you don't get a working program until the very end." This works fine with no more than three people; they used a team approach with Lotus Jazz.
On the future of computing: "The rate of innovation is rather slow. There are only a few really new ideas every decade. In fact, people complain about the good old days of paper tape and such things, but some of the old technology was really good. And I'm not sure much progress will be made over time. . . . We're seeing all these new processors, but a lot of the power is lost because everyone wants all the features, and that slows everything down."
Today: Sachs is "mostly retired" these days, though he has a finger in a company called Digital Light & Color, which, since 1992, has made software for photographers. He recently has been "playing around" with Android phone software but doesn't know yet what he'll do with it.
Sachs was kind enough to speak with me about his current and past perspectives on programming.
I showed him the quote above about developing "a working program," which sounds a lot like what we'd call Agile today. Does he still write software the same way?
"It works that way for me," says Sachs. People have their own ways of working, however, and everyone has their own natural style. "I ran into a guy at Lotus, later, who would spend a long time thinking about the program. He would type in the whole program in the final form and debug the whole thing," Sachs explained. "There are some virtues to that. You have anticipated difficulties, you don't get stuck on dead ends." But, he says, development takes a lot longer and there are things you don't realize until you start working on the program.
One thing that has changed from the "frontier days" is that today a developer typically works on only a tiny piece. "There was a day when you could know everything there was to know about a given computer," he says. In his previous positions pre-Lotus, "I got to write computer languages and databases and scientific software. By the time I got to write a spreadsheet I knew all the pieces."
Reading other people's code is still an important way to learn, because "When programmers read each other's code you can see if someone is really good or not," he says.
Sachs agrees with Malcolm Gladwell's Outliers theory that you have to spend 10,000 hours doing something to get really good at it. "That's really true of programming, and I've been doing it a long time," says Sachs. But this generation's programmers started much earlier, he points out, and whole generations of kids will get their 10,000 hours of experience much sooner, perhaps leading to proficiency in their careers earlier.
Software's Next Wave
Robert Carr: Framework
Then: In the mid-1980s, the computing world was yearning for an integrated software suite, or at least all the magazines told us so. The two major choices were Ashton-Tate's Framework and Lotus Symphony. Robert Carr co-founded Forefront and developed Framework, which had a spreadsheet, word processing, database, telecommunications, and outlining, all on a floppy disk. Ashton-Tate bought Forefront in 1985, and Carr became its chief scientist -- his role when he was interviewed for the Programmers at Work book. Borland acquired Ashton-Tate in 1991.
On programming in teams: "I've tried to surround myself with people who are better than I am. A lot of the people I've hired for Forefront are better programmers than I am, and I've learned a lot from them. [Xerox] ASD also showed me that great software development is largely composed of good initial design and, thereafter, a lot of very solid engineering."
On software design: "One piece of advice I had been given was to hold off programming for as long as possible. Once you get a corpus of code building up, it's hard to change direction. It sets like concrete. So I held off for as long as I could, but I couldn't hold the design in my head forever."
On managing developers: "My role is one of facilitator, drawing out the design ideas and helping the group towards a conclusion. Not my conclusion, but one evolved by the group. . . . Occasionally, there will be a situation where we just can't get a consensus, so we step back and examine the time constraints, money constraints, or space constraints, and then decide from there. The original Framework process was a very iterative and evolutionary one."
On user interfaces: "Users should be able to forget that there is a program between them and their information. In fact, as they get used to the software, their minds should be filled only with the task at hand; they shouldn't have to stop and think about what command they need."
On the future of computing: "I hope we can move toward component software. Then the user will be able to replace a piece of a program with a plug-in module from a software house that knows how to do floating-point arithmetic or word processing better. This will be a trend over the next ten years. But it's a tough goal because we're talking about the interfaces between separate modules of software, one of the least understood areas in software design."
Today: Carr has wandered in and out of the computer industry. He took breaks to participate in Ironman triathlons and, between Xerox, Context MBA, and starting Forefront, to explore Mexico.
However, he's never gotten very far from the innovations we later took for granted. In 1987, Carr co-founded the high-profile mobile communications startup GO Corporation, where he led all software development, including the ground-breaking PenPoint operating system, earning two patents on pen-based computing and object-oriented operating systems. He was vice president of the AutoCAD Market Group at Autodesk. And he spent a few years as managing director at Sofinnova Ventures, where he invested and co-managed $550 million in early stage high-tech venture capital funds (though 1998-99 didn't turn out to be a good time to do that).
These days, he's CEO of KeepandShare, which aims "to support your busy life by making group information sharing easy, secure and instantaneous." And he's back programming for the first time in 20 years.
Bill Gates: BASIC on the Altair
Then: In 1986, Bill Gates was already "considered one of the driving forces behind today's personal computing and office automation industry" (at least when being interviewed for Microsoft Press) and lamented that he didn't have time to write code personally anymore.
On programming: "We're no longer in the days where every program is super well crafted. But at the heart of the programs that make it to the top, you'll find that the key internal code was done by a few people who really knew what they were doing.
"It's not quite as important now to squeeze things down into a 4K memory area. You're seeing a lot more cases where people can afford to use C, instead of using assembly language. Unfortunately, many programs are so big that there is no one individual who really knows all the pieces, and so the amount of code sharing you get isn't as great. Also, the opportunity to go back and really rewrite something isn't quite as great, because there's always a new set of features that you're adding on to the same program."
On software performance: "It's true that we're going to allow programs to be a little fatter than they have been. But in terms of speed, it's just laziness not to allow something to be as fast as possible, because users, even though they might not be able to say so explicitly, notice programs that are really, really fast. In the most successful programs, the speed of execution is just wonderful."
On the future of programming: "People still get great satisfaction out of the fact that a compiler, like the C compiler, still can't write code as well as a human being. But we may mechanize some parts of the process quite a bit over the next three or four years. People will still design algorithms, but a lot of the implementation could be done by machines. I think that within the next five years we'll have tools that will be able to do as good a job as a human programmer."
On Microsoft's future: "Even though there'll be more and more machines, our present thinking is that we won't have to increase the size of our development groups, because we'll simply be making programs that sell in larger quantities. We can get a very large amount of software revenue and still keep the company not dramatically larger than what we have today. That means we can know everybody and talk and share tools and maintain a high level of quality."
On the future of computing: "One of the new areas we're focusing on at Microsoft is compact-disk applications. CD ROM is the technology we're going to use to get personal computers into the home. . . . CD ROM is totally different. We hope with CD ROM you'll be able to look at a map of the United States, point somewhere, click, zoom in and say, 'Hey, what hotels are around here?' And the program will tell you. And if you're in the encyclopedia and you point to one of Beethoven's symphonies, the computer will play the song. It's a new interface; it's got nothing to do with productivity tools like word processors or spreadsheets."
Charles Simonyi: Multiplan, Bravo, and Hungarian Notation
Then: Hungarian-born Charles Simonyi already had an impressive background before he joined Microsoft in the 1980s.
Like so many other programmers of the early microcomputer era, he was an alumnus of Xerox PARC, where he created the Bravo and Bravo X programs, the first WYSIWYG (what you see is what you get) text editors, for the Alto personal computer.
At Microsoft, Simonyi organized the company's Application Software Group, which produced Multiplan, Microsoft Word, and Microsoft Excel. He's also well known in programming communities for instigating Hungarian notation: a formulaic way to name the variables inside an application.
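To give a flavor of the convention: in Hungarian notation, each variable name begins with a lowercase prefix encoding its type or role, so a reader can check type agreement at a glance. Here is a minimal sketch in C (the language the convention grew up in); the routine itself and its name are invented for illustration, not taken from any Microsoft code.

```c
/* A sketch of Hungarian notation in the style Simonyi's group used.
   Common prefixes:
     sz  = zero-terminated string      cch = count of characters
     ich = index into characters       f   = boolean flag
     rg  = array ("range")             ch  = a single character      */

/* Copy szSrc into szDst, uppercased, writing at most cchMax bytes
   (including the terminator); returns the count of characters copied.
   By convention the function name itself starts with the tag of its
   return value: Cch... returns a character count. */
int CchCopyUpper(char *szDst, const char *szSrc, int cchMax)
{
    int ich;
    for (ich = 0; ich + 1 < cchMax && szSrc[ich] != '\0'; ich++) {
        char ch = szSrc[ich];
        int fLower = (ch >= 'a' && ch <= 'z');  /* f = boolean flag */
        szDst[ich] = fLower ? (char)(ch - 'a' + 'A') : ch;
    }
    szDst[ich] = '\0';
    return ich;
}
```

A caller might write `char rgchName[32]; cch = CchCopyUpper(rgchName, szInput, sizeof rgchName);` -- the prefixes make a mismatch (say, passing a `cch` where an `ich` belongs) visible in the source itself, which was Simonyi's argument for the scheme.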
On code lifetime in software development: "Really good programs will live forever and take forever to write, at least as long as the hardware exists, and maybe even longer. Certainly, Bravo lived for as long as the Alto existed. . . . There were about fourteen releases over about a five-year period. . . . The same thing is going to be true for Multiplan. When you consider that Multiplan lives in Microsoft Excel, then Multiplan is going to be a continuing story. And Microsoft Excel on the Macintosh is not going to be the last application in the chain either. It's going to continue on Windows."
On computing future: "Who knows? Maybe computer science will help decode DNA, and not just by supplying tools. Disassembling DNA could be a hacker's ultimate dream."
Today: Simonyi stayed at Microsoft until 2002, and ended up as Director of Application Development, Chief Architect, and Distinguished Engineer.
Today, Simonyi is chairman, CTO, and founder of Intentional Software Corporation, which, according to its website, "accelerates innovation by integrating business domain experts into the software production process." Simonyi has been a member of the National Academy of Engineering since 1997, a member of the American Academy of Arts and Sciences since 2008, and a Correspondent Member of the Hungarian Academy of Sciences. According to the corporate bio, he is an avid collector of modern art, enjoys classical music, and is an experienced pilot.
The Other 14 Programmers
What about the rest of the programmers Lammers interviewed? Some seem to have disappeared entirely; I don't know what happened to John Page, who wrote PFS:File. Others are obscure to the average programmer today, such as Jaron Lanier, who wrote Atari games and was an early proponent of virtual reality worlds, or LucasFilm SoundDroid's Michael Hawley. For similar reasons, I didn't try to find Peter Roizen (T/Maker), Butler Lampson (Alto PC), or Scott Kim (Inversion).
But here are short updates about the rest, based on my online research:
Toru Iwatani, author of Pac-Man, is now, according to Wikipedia, a full-time lecturer at Tokyo Polytechnic University.
Andy Hertzfeld (MacOS) worked at Apple until March 1984; he was interviewed in the book as author of a program called Switcher for the Macintosh and a low-cost, hi-res digitizer, ThunderScan. Since then, he co-founded three companies: Radius (1986), General Magic (1990), and Eazel (1999). In 2002, according to Wikipedia, he helped Mitch Kapor promote open source software with the Open Source Applications Foundation. He also started a site, folklore.org, to share anecdotes about the development of Apple's original Macintosh computer, and the people who created it. Hertzfeld joined Google in 2005.
Ray Ozzie was interviewed for the book because of his affiliation with Lotus Symphony (which beat out Ashton-Tate Framework in the marketplace, at the time).
You might know him better because of Lotus Notes, which would have been a quiet twinkle in his eye in 1986. Now, of course, he is stepping down from his role as chief software architect at Microsoft.
Also still working in the industry is John Warnock (Adobe PostScript), among the few who are still affiliated with the same company (though Adobe is no longer known primarily as a printer OEM).
Warnock was president of Adobe for the company's first two years and CEO for the next 16 years. Warnock retired as CEO in 2000 and as the company's CTO in 2001, according to Adobe's website. Today, he is co-chairman of the board with Charles Geschke, continuing to shape direction for the nearly $3 billion company.
Spreadsheet guru and Bricklin's partner Bob Frankston (VisiCalc) joined Lotus in 1985, where he created the Lotus Express product and a Fax facility for Lotus Notes.
He worked for Slate from 1990-1992 on mobile and pen-based systems, then at Microsoft (1993-1998) with particular attention to home networking. He's still thinking about networking.
We have lost at least two:
Apple's Jef Raskin, instrumental in the Macintosh project, died in 2005 from pancreatic cancer.
Digital Research's Gary Kildall, best known for the CP/M operating system, died in an accident in 1994.
After CP/M, Kildall worked on GEM, an early GUI environment that competed with Windows (remember the original Ventura Publisher. . .?). He sold DRI to Novell in 1991 for $120 million, and started another company, KnowledgeSet, which adapted optical disk technology for computer use.
My most frustrating search was for Wayne Ratliff, best known for dBase II. According to a 2007 interview, he had retired and was working on his boat, along with computer systems for competitive sailboat racing. I found no Ratliff spoor since 2007, however. Which, given the age of some of these guys, gives me a bad feeling.
Looking back on looking forward
In these quotes I concentrated on the programmers' ideas about programming, its intersection with the world of business, and their predictions of the future. They spoke about many other things: whether artificial intelligence was a reasonable goal, the first program they were paid for, the connection between music and programming. But the topics I chose attracted me because I wanted to see how the craft-or-science changed (or didn't), and whether these brilliant men, each of whom invented something meaningful, could also envision where our industry was headed.
In some ways, they did extremely well -- particularly when it came to hardware. As WordPerfect's Pete Petersen said in a keynote address to my Island/Reach Computer User Group in Maine, a year or two later, one should always bet on computers getting smaller, faster, quieter, cheaper, and more reliable. Notebook and mobile computing was, perhaps, an inevitability.
But they were very centered on the client PC. None of these programmers predicted the Internet, or even the long-term effect of computer networks. That wasn't surprising (except in the sense that we expect brilliant people to be smart about everything); in 1986, there was no World Wide Web, the Internet was primarily Usenet, and we relied on proprietary online services like CompuServe, available only on dial-up connections.
However, the attention only to client PCs had long-term implications. These developers were thinking about designing component software, which led to OLE (object linking and embedding) and OpenDoc. Then the Web made most of those issues moot, from early uses of graphics embedded in webpages to today's mashups. There's a conclusion to be drawn from this, though I'm not sure exactly what it is.
I saw one trend that perhaps is a bit frivolous, but might also be a view into a hacker's mind. Many of these programmers were attracted to flying and boating. These are expensive hobbies suited to guys who have plenty of money to spend, but I saw a strong correlation between these endeavors and the programmers who moved into management. Aha: sailboats and airplanes made sense. Both involve going fast in a powerful, complex, engineered device that takes expertise and dedication to master. Much like an early computer, in which you needed to know everything about the machine to be effective. If you can't hack code anymore, you certainly can appreciate the beauty of hardware.
Simonyi is the best example. In his interview (which he later mused about), Simonyi had just gotten into flying helicopters. That was only the beginning of his "flight" experience, however, as Simonyi joined the small set of "space tourists" when he participated in the Soyuz TMA-10 mission in 2007 and the Soyuz TMA-14 mission in 2009 to the International Space Station. (Another space tourist from the computer industry is Ubuntu's Mark Shuttleworth.)
Most of these programmers had (and have) a programming methodology that today would be called Agile. They mostly created a prototype that worked, and kept adding functionality until it was ready to ship. They worked iteratively in small teams. And, as Bricklin's current thoughts indicate, these developers were always cognizant that at some point you have to quit adding to the software and send it out the door. I found myself wondering how many readers imagine that "Agile" is something new.
At a personal level, there seem to be two paths for these accomplished developers. Either they grew along with the companies they started, moving into management and giving up programming. Or they went back to a small shop where they could do whatever they wanted, as both Bricklin and Sachs have done; some appear to have found corporate jobs where they could continue to research and innovate on their own terms, which is pretty cool.
All in all: This generation of computer industry pioneers -- who are figuring out how to exploit the Internet, make software mobile, and keep the user interface intuitive -- can be proud of the early microcomputer programmers.