It always amazes me that young people today (those in the developed world, anyway) are growing up surrounded by computers and don’t even realize how recent this is. Yet I can sympathize; I felt the same way about the space program, until I realized that the first unmanned spacecraft had been launched only a few years before I was born, and the first American went into space (Alan Shepard, on a sub-orbital test drive) when I was a baby. As youngsters, we tend to assume that whatever existed in our earliest memories has always existed.
I majored in English, but dated a Computer Science major. One evening he knocked on the door of my dorm room and found me banging out a term paper on a manual typewriter. “You’re living in the Dark Ages!” he announced, and a few days later he again turned up at my door, this time with a little card that said I had an account at the computer center. He taught me word processing–very primitive–on a terminal hooked up to a mainframe. But I was probably the first English major to turn in a paper composed on a word processor. Being able to edit without retyping a page or using Wite-Out was huge.
“Vat Are You Cooking?”
One of my first jobs out of college was as a secretary at a real estate company. There were about 30 people in the office but only two computers, early IBM PC models (either the XT or AT, I can’t recall). One was on my desk and the other in an unoccupied cubicle where it was theoretically shared by everyone, yet it was rarely touched.
The owner of the company, who I later learned had an engineering degree, would occasionally walk over when I was using the computer and ask, in his slight Eastern European accent, “Vat are you cooking?” Not sure that I understood his sense of humor, I would say something vague like, “Oh, doing the budget for one of Howard’s buildings.” He would nod silently and walk away.
One day I happened to be printing mail merged letters to some of our clients and I showed him one of the pages as it slid out of the printer. He looked at the letter, then at the computer screen. I could almost see the light bulb turn on over his head. “Ahhh!” he said. He’d finally figured out what I was “cooking” and I don’t think he ever asked again.
The Third Time is the Charm
At the real estate company we used WordStar for stand-alone word processing (it was so sophisticated compared to that mainframe in college!) and an integrated suite called Lotus Symphony, which combined a 1-2-3-style spreadsheet with a simple word processor and a database.
At my next job I was again one of the few people to have a computer on my desk. It was a Wang, with a simple word processor, spreadsheet and database, which I used to maintain a mailing list database of 15,000 names. One day somebody came to my office and replaced the Wang with an IBM PC clone. (That night on the news I learned that Wang had declared bankruptcy; I’ve always felt a trifle guilty.) So I learned WordPerfect and Lotus 1-2-3 for MS-DOS 5.0. When someone asked if it was difficult making the switch, I said no, that once you know two word processing programs the third is easy to learn.
A few years later, the head of our department sent me to a two-day class on QuarkXPress so I could design flyers in the department instead of depending on our Graphics Department, for which we were a low priority. It was my first formal computer training, more than a decade after that first mainframe experience. I also got part-time use of a Macintosh computer for the desktop publishing, although it would occasionally be put to use for after-school programs.
Software: The Corollary to Moore’s Law
I’ve mentioned Moore’s Law before in this blog and I am almost certain to mention it again. In 1965 Gordon Moore, who went on to co-found Intel, observed that the number of transistors that could be packed into a given area on a microchip doubled at a steady rate (about every two years, in the later form of his prediction). This has implications not just for computer hardware but for software. As processing power and storage capacity grow, so does the complexity of computer programs: more features, fancier on-screen menus and toolbars, and, once window-style interfaces became popular, more programs running at the same time on a single computer. Everything gets bigger.
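As a back-of-the-envelope illustration of that doubling (this sketch isn’t from the original post; the 2,300-transistor Intel 4004 of 1971 is just a commonly cited starting point), the rule can be written in a few lines of Python:

```python
# A rough sketch of Moore's Law as a growth formula, assuming the
# commonly cited baseline of 2,300 transistors on the 1971 Intel 4004
# and a clean doubling every two years.
def transistors(year, base_year=1971, base_count=2300):
    """Estimate transistor count for a given year under the doubling rule."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Four decades of doubling: from thousands to billions.
for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: {transistors(year):,.0f} transistors")
```

Run over four decades, a count that starts at a few thousand lands in the billions, which is roughly why software always had room to keep getting bigger.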
The Re-Trained Boomer
Those of us born roughly between 1955 and 1965—around the tail end of the Baby Boom—have worked during an interesting period of transition in the workplace. Few of us were introduced to computers in high school. Most of us didn’t use them in college, either, unless we majored in computer science, mathematics, engineering or science. Liberal arts types usually avoided them and the few (like me) who dared enter the computer lab were ignored by the “hard core” computer science types.
Sometime in our careers, probably in the 1990s, somebody dumped a computer on our desks and assumed we knew how to use it. A few panicked; most of us dived in and learned. Now we are in our mid-40s to mid-50s. Everyone acknowledges the experience of older workers, but the stereotype is that we are stuck in our ways. Not so! If you are in that age group, you have likely learned at least four operating-system updates in your career–and that assumes you somehow skipped alternate new releases of Windows.
For example, I went from MS-DOS 5.0 to Windows 3.x, then Windows 98, 2000, XP and now Windows 7. Plus a little Mac along the way, and a netbook that runs on Linux. (I use the last almost exclusively for Web surfing and reading PDF documents, so I can’t really say I know Linux. But I’ve seen a little.) That’s eight operating systems, including five versions of Microsoft Windows. Don’t try to tell me I’m a dinosaur! I evolve.
Thanks to AvidCareerist (Donna Svei in real life) for prompting me to put together some random notes and finally write this post!
My challenge to readers: Can you top my eight operating systems? Have you used any that I haven’t? Mention them in the Comments and we’ll see how long a list we can compile!