Cupertino (CA) – The Apple I was never a particularly well-known computer. It was talked about at computer conferences – especially those at which it failed to appear – as a single-board 6502-based system that wasn’t a kit, although the one thing it really lacked was a genuine case. Photographs from that time show some Apple Is with copper pipe-like adornments and hand-soldered lettering, and some simply lying exposed in plywood boxes. There was never anything particular about it that could catalyze an industry. It was a trial run.
The first real microcomputer that most Americans living in the late 1970s ever saw or touched was the Apple II. For many, it was the first programmable computer with which they ever came into contact. In retrospect, the Altair 8800 – a kit computer that premiered in the January 1975 issue of Popular Electronics – was technically the first genuine personal microcomputer, and the IMSAI 8080 followed it. Between those two and the Apple II, there were perhaps hundreds of kit designs and boards, including the Apple I, some of them self-contained. In a number of key respects, the Apple II was never really “first.”
From a 1979 Apple Computer sales brochure
Yet the moment the Apple II entered our lives – for some, the very instant it was unpacked – we could feel a change in the air. This computer looked like something. Indeed, its looks alone, even when it was turned off, were central to its success. It wasn’t a bunch of wires hanging out of a box, or a couple of arrays of red, flashing LEDs, hopelessly representing the unfulfilled desires of restless husbands seeking refuge in their workshops on the weekends, like almost everything that came before it. No, this was something real, tangible, measurable. Granted, no microcomputer manufacturer of the mid-1970s had a recognizable brand, except perhaps Commodore, and even then mostly in Europe, so the fact that no one knew who “Apple” was didn’t really matter much. No one knew Ohio Scientific, Exidy, Osborne, Sinclair, or the other early entrants of that era either. The Apple II seemed more real, somehow, because it felt real.
The biggest problem with helping to usher in the era of microcomputers during this time was finding the words to describe what these things were, and what they did. This was an unusually difficult task. I was a (very) young entrepreneur myself in 1978, the year I spent my first moment with an Apple II that a friend of mine – or more accurately, his dad – had just purchased. Like most video games we had ever played with – particularly one called “Video Breakout” from Atari whose case seemed strangely similar to the Apple II’s – it was connected to a television set, not a monitor. There was no disk drive. Instead, a hi-fi cassette player was rigged to serve as the device for loading programs. A second tape player, which wasn’t connected, was reserved for fast-forwarding tapes to the little gaps of silence between saved programs, so we wouldn’t have to load the first, second, and third program on the tape just to get to the fourth.
So someone staring at our growing gang of self-proclaimed engineers, as we wound and rewound stacks of tapes and perused textbooks on Applesoft BASIC and 6502 machine language, amid the residue of Doritos and Dr. Pepper, while we strove just to get the II’s “high-resolution graphics” mode to plot a more perfect circle, would garner no clue as to our motive. Prior to the Apple II, anyone studying a group of young hobbyists hunched over a microcomputer wouldn’t have found anything about the scene to capture his interest or compel him to hang around for long. Yet for some reason, the Apple II drew a crowd. The case – which looked like a real something – was the first clue.
Color was the second. Making color bars appear on a television set, and programming them to change shade, made people sit up and take notice. Making the II’s single speaker play its beeping noise at a variable frequency dazzled some in the way the Ohio Scientific board’s display of playing cards – with all four suits represented by their own characters, in black and white – failed to do. The Apple II was the first device to reach beyond its own local interest group, if you will, and appeal to people who would otherwise never have taken an interest. The importance of this appeal is historically and sociologically significant. Appearance, color, and sound attracted the attention of the only special interest group that truly mattered to the Apple II’s assured, long-term success: parents. Would-be hobbyists couldn’t always shell out the $1600 for a 16K device, but parents would, if they could see tangible signs that programming, or building onto, this thing would rapidly advance their children’s educational skills…and, along the way, build up their base of friends.
And so it was that my friend – the one with a rich dad – beat me to the punch in being the Apple II’s earliest adopter in my community. I ended up as an early adopter (in the US) of the Commodore PET 2001, which never quite garnered the respect of the Apple II, due largely to the fact that it 1) didn’t make comparable beeping noises, 2) lacked color, and 3) was ugly, like a statue of the emperor Napoleon built from a handful of Legos.
The magazine that bonded together the new community of computer users: Creative Computing, which introduced thousands to the Apple II.
With the vantage point of history to our benefit, we can almost define the Apple II in some key respects by what it was not. It was not a graphical tool; it had no “user interface” to speak of. Instead, it used a prompt, like a terminal being operated in some far-off, sealed-away headquarters underground somewhere. In a way, communicating with a computer via a prompt was more conversational back then, and as a result, frankly, more intimate. We got to know our computers better than we do now. The experts in various brands could recall the locations of certain routines in ROM, having memorized their machines’ memory maps.
Back in the late ’70s and early ’80s, the three key distinctions between classes of computer experts – engineers, administrators, and developers – had not yet come about. Instead, we were all one embryonic class, yoked together by a common quest to solve the greater problems that these new computers made obvious to us. There was no Internet – at least, not that we could see – and yet the world became a bigger place for us even without it. The world introduced itself to us through a new class of heroes: the authors of the first Sybex books, like Rodnay Zaks; and the writers of the first computer magazines, like David Ahl, John J. Anderson, David Lubar, Philip Lemmons, and Jerry Pournelle. They were our representatives on the broader stage of world events.
The showdown that set the stage
On the absolute razor’s edge of a revolution, this ad shows an Apple computer, for the first time, being used the way that came to define Apple as a technology. From the classifieds section of Creative Computing, December 1982.
The day it became clearest to me that the world would be a different place for the rest of my life came in September of 1978. By that time, I was an adept Radio Shack TRS-80 Level II BASIC programmer, writing little demos that Tandy’s stores were using throughout the Southwest US to promote their systems. During one of the many computer conferences that defined the way we introduced ourselves to the world during this time, the big question among the attendees that fall was, whose computer played the best game of chess? I believed I knew the answer to that: “Microchess” was a program written by Peter Jennings that ran on the TRS-80. Meanwhile, at a display sponsored by the first truly superb computer store I ever patronized, called “High Technology,” a fellow insisted that a newly imported set of routines for the Apple, called Sargon, could dissolve this TRS-80 chess program into a mass of low-resolution, white bricks.
Thus the High Technology folks and the Tandy people staged an impromptu chess match. Microchess played white, Sargon black. We agreed to set both skill levels to maximum. As a result, each move consumed as much as five minutes of “thinking” time. That day, I wore my Orbach’s-tailored blue blazer with a silk necktie, meticulously chosen to make me look much, much older than I was. So as I served as the unofficial diplomat, running back and forth along the 60 yards or so between the two booths to report chess moves, I noticed two crowds building up, one around the Apple II, the other around the TRS-80. Because each move took five minutes, there was plenty of time for the crowds to gather.
The first game, believe it or not, ended in a draw. Sargon and Microchess got stuck in a rut, repeating the same series of four moves over and over again. So we had to start over, and in the second match, Sargon surprisingly won handily, in about 45 minutes. High Technology handed out T-shirts with Apple logos to celebrate.
As the crowds finally began to disperse, those who remained included a few local reporters. One asked me, what is this? Is this a video game or a computer? It looks real, whatever it is. Indicating the beige wedge, I explained it was an Apple II, and conceded it was probably among the finest computing devices ever built. “So what does it do,” I was asked, “besides play chess?” I tried a phrase that I had devised over the countless hundreds of times I’d been asked that question, usually to explain another program it ran called VisiCalc: “If you have the skill and the patience,” I said, “to provide an Apple II with the information that pertains to your business, and some of the logic that defines how your business works, it can make sense of that information so that you see it and understand it in ways you haven’t before.”
“No, I still don’t get it,” came the reply. That’s the day, and perhaps the moment, that I realized that I would be devoting a good deal of my life to providing a clear response to that question. This is how the Apple II changed me, the people around me, and the world we were just then learning to share.