Recently, I was looking for something, rummaging through some storage boxes in the basement. Before I found what I was looking for, I ran across an old work folder from the ’80s. One of the items in the folder caught my eye: an internal corporate newsletter dated 4th Quarter, 1989. In it was an opinion article written by me, which I had long since forgotten.
At the time, I was the manager of an internal computer consulting department for The Coca-Cola Company in Atlanta. In those days, computers were still being deployed to desktops. We had not yet reached the point where a computer was considered an essential part of every professional employee’s job. The department’s mission was to promote and assist in increasing employee and company productivity through increased use of computers. The newsletter itself was produced by my department and focused on computer topics for the edification of employees.
The article began with a perspective on the amazing growth of computer technology in the ‘80s. Then, it looked toward the new millennium – only 10 years away. What might it have in store for us? So, in the rest of the article I made some predictions regarding what we might expect in the way of computing technology advancements by the year 2000.
In re-reading the article, I found myself drifting into a retrospective state and began to reflect on the beginning of my interest in computers. That took me all the way back to Cow Year at West Point. One of my courses that year was titled “Computer Science Fundamentals”.

I think it was an elective course, but it may have been one of the core courses we all had to take; I can’t remember. It consisted mainly of learning FORTRAN programming, although the Academy referred to it as “CADETRAN”. Anyway, I really liked it and found that I had some talent for it. I made a mental note that the computer field might be something to pursue after an Army career.

My next encounter with computing technology was in 1970, at Fort Carson, CO, during my first Field Artillery assignment. Our 8-inch self-propelled howitzer unit was involved in field testing “FADAC” (Field Artillery Digital Automatic Computer). Classmate cannoneers reading this may also have encountered it. FADAC was a metal, olive drab box containing a specialized computer, with a built-in keyboard and small screen. Its purpose was to calculate artillery fire direction values such as azimuth, charge, and elevation, based on feedback from the Forward Observer. In our tests we ran FADAC in parallel with our manual calculations. I recall that in just about every case the manual calculations team had new fire direction values ready well ahead of the FADAC team. In those early days of computing capability and speed, I guess FADAC was not ready for prime time.
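For the technically curious, here is a minimal sketch of the kind of ballistic arithmetic FADAC automated. It uses only the textbook drag-free trajectory formula; the function name and the 500 m/s muzzle velocity are my own illustrative assumptions, and a real fire-direction solution also accounts for charge selection, projectile drag, meteorology, powder temperature, and terrain.

```python
import math

# Toy, drag-free illustration of the kind of computation FADAC automated.
# All names and values here are hypothetical examples, not actual gunnery data.

G = 9.81  # gravitational acceleration, m/s^2

def low_angle_elevation_mils(range_m: float, muzzle_velocity_mps: float) -> float:
    """Quadrant elevation (in mils) for a vacuum trajectory over flat ground.

    Uses the textbook range equation R = v^2 * sin(2*theta) / g,
    solved for the low-angle firing solution.
    """
    x = G * range_m / muzzle_velocity_mps**2
    if x > 1.0:
        raise ValueError("Target is beyond maximum range for this muzzle velocity")
    theta_rad = 0.5 * math.asin(x)
    return theta_rad * 6400 / (2 * math.pi)  # artillery convention: 6400 mils per circle

# Example: a 10 km fire mission at a hypothetical 500 m/s muzzle velocity
print(f"Elevation: {low_angle_elevation_mils(10_000, 500):.0f} mils")
```

Simple as that arithmetic looks, our manual teams did it with firing tables, slide rules, and practiced drill, which is why they kept beating the machine.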

In 1974, at the end of my service obligation, I decided to change direction from an Army career and join the civilian world. Pondering “what next?”, I thought back to that “CADETRAN” course and decided to begin work toward an MS in Computer Science at Virginia Tech. This was at a time when computing technology was still quite primitive. There weren’t any PCs, the Internet was years away, and no one had a cell phone. Most computers were behemoths that filled entire rooms. Students still had to use punch cards to read programs into the computer. Remember those?
Completing the degree took me into my corporate career in information technology. After a few years at Shell Oil, I moved on to Coca-Cola, where I spent the majority of my career. Ten years at Coca-Cola brings us up to 1989 and that newsletter article. So, what did I predict in 1989 for the state of computing technology in the year 2000?
- A graphical user interface (Windows, Mac) becomes the universal PC standard (at the time, numerous PCs were still using a DOS interface).
- Standalone PCs (which was mostly the case then) become fully networked.
- Fingerprint scanning becomes the PC identity and logon standard.
- Voice commands become the standard for PC control.
- Email use explodes and becomes ubiquitous (email was very limited in 1989 and mostly on internal company mainframes).
- Video files become watchable on a PC and can be sent to others via email.
- The Rolodex becomes obsolete, and paper filing is significantly reduced.

How’d I do? I blew numbers 3 and 4 but did reasonably well on the rest. However, even those two were still correct in one sense: they occurred a decade or so later than 2000 and on a different device – the cell phone. Both of those capabilities are now in use on cell phones but have never become commonplace on PCs.

There have been numerous other key computing technology achievements in the last 30 years that I would not even have conceived of in 1989.
A few examples:
- The Internet
- Cell phones as computers, cameras, and GPS units
- The degree of advancement in AI sophistication
- Google, Facebook, Instagram, Twitter, YouTube, Amazon, Map routing and a myriad of other useful applications
- Flat monitors
- Thumb drives
- Broadband everywhere
Now we’re in the 4th quarter of 2019, so what do I predict for 2030 and beyond? Nope, not going there! Instead, let me share three historical predictions by noted experts that were spectacularly wrong:
- “There is no reason for any individual to have a computer in his home” – Ken Olsen (Founder, Digital Equipment Corporation) – 1977
- “I predict the Internet will soon go spectacularly supernova and in 1996 catastrophically collapse” – Robert Metcalfe (Inventor of Ethernet) – 1995
- “There is no chance that the iPhone is going to get any significant market share. No chance.” – Steve Ballmer (then Microsoft CEO) – 2007
If the CEO of the world’s premier software company can botch a prediction that badly, I think I’ll just rest on the partial success of my 1989 predictions. Plus, I always keep in mind Yogi Berra’s wisdom: “It’s tough to make predictions, especially about the future.”