An Interview with Laine Nooney, author of The Apple II Age
The founding of Apple has become a modern-day myth, recounted in biographies, biopics, and even a Library of Congress research guide. We’re familiar with the story of the two college dropouts who hacked away in a Los Altos garage, and through dedication and commitment to their vision, eventually changed the world. But to solely focus on the prominent heroes of computing obscures the complexity of the ecosystem that popularized and commercialized general-purpose home computers. Media historian Laine Nooney researches the funding structures and business models that gave rise to the computer revolution in the late 70s, arguing that “their most ardent prophets are not the sole keepers of their meaning.”
In The Apple II Age: How the Computer Became Personal, Nooney recounts the stories of several software companies that developed programs for the Apple II, how they were funded, and who used their software. Nooney argues that despite the entrenched hacker mythos, computing was never anti-commercial at its core, and that the industry has always been driven by venture capital and speculation. Rather than an inevitable triumph of technology, the adoption of computers took tremendous effort from these players, who had to transform consumers into users. The book is a readable account of industry formation, following the circuitous process of selling a product that doesn’t solve a specific problem. Laine Nooney is an Assistant Professor of Media Industries at New York University, and recently joined Scope of Work’s Members’ Reading Group for a conversation about The Apple II Age. What follows is an edited and condensed transcript of our discussion.
Hillary Predko: In your book, the Apple II itself is something of a MacGuffin, leading the reader through the history of the computer industry’s formation, particularly around software. That being said, you do argue that the Apple II’s hardware was important in supporting a market for software and helping transform microcomputers from a “technical oddity to a mass consumer good.” What hardware decisions supported this ecosystem?
Laine Nooney: A MacGuffin is a good term for it! This book is trying to do a sleight of hand, right? It's doing a bit of a magic trick by pointing to this historically significant object, but showing how the story of the personal computer is actually about this whole other set of things.
The viability of the Apple II really had to do with Wozniak’s point of view on what computing should be: It was serviceable and expandable, whereas computers today are designed to be basically replaceable. In 1977, the other two competitors were not particularly expandable; the Commodore PET was not expandable at all, and the TRS-80 was very clunky to expand. But the Apple II had this engineer’s elegance where you just take the top off and all the slots are in the back. It was easy to increase the amount of RAM, and I think the expansion slots along the back (which allowed you to effortlessly add peripherals) were game-changing. Neither of the other two systems that came out in 1977 had that. Also, there was a lot of planned obsolescence in early microcomputing. The PET gave way to the Commodore VIC-20, which gave way to the Commodore 64 in a span of seven years – and in that whole time, the Apple II was still Apple’s primary machine.
With the Apple II, there was a thoughtfulness and a tinkerers’ mindset that really trusted the users. I mean, the TRS-80, you couldn't even open it – it was a hacker's computer in software only. Obviously, Apple did not maintain that platform openness for very long. Once Jobs took control of the Mac team, the model of computing became very different. But I think the Apple II's longevity has a lot to do with that fundamental flexibility of the hardware.
HP: Throughout the book, you point to the creation of “the user” as pivotal to how home computing became normalized. However, most histories of this era focus on homebrew hobbyists. What do we miss when we only focus on a small group of technically savvy users?
LN: I was trying to set a different center of gravity for talking about what made computers mass cultural objects. There has been, particularly in the history of computing, a ton of attention on the hobbyist. Part of the reason is that hobbyist groups tend to leave behind clear archival records; all of their newsletters can get swept up into a museum archive, where they can be scanned and uploaded to the Internet Archive. There are identifiable people, and there’s a lot of technical intrigue. I also think that there is a predisposition in humanities academia to focus on themes of resistance or personal cultural production – there’s an interest in the periphery rather than the center.
This means there are so many histories of modders and hackers and coders, and yet we don't have a single cumulative history of Microsoft. My approach to the history of computing has been to ask, “What are we doing?” Not to say that it isn’t important to study the hobby groups, but I can read 50 papers about hobbyists and I can read no papers about the founding of Apple. We have our priorities a little fucked; we can't understand the periphery without an account of the center.
There has been this assumption that we know the story of the center, that we know Microsoft and we know Apple. My position has been: “Whoa, whoa, whoa! No, we don't. We don't actually have an account of any of that, other than popular journalism and self-aggrandizing garbage.” I think shifting the story about early personal computing from a story of the hobbyist to a story about use separate from programming is really important if you want to understand the personal computer as a mass-manufactured consumable good. It’s what actually makes the computer “personal.”