When we think of modern computing, we inevitably think of how fast technology moves these days. It seems we can barely buy a new laptop or smartphone and get it out of the box before it starts to feel obsolete. New features, speed increases, more storage—it all seems to happen so fast.
That’s why it can be surprising to stop and take a look at some of the tech we use every day and realize just how old some of it is. From ancient operating systems and programming languages to network protocols designed in another era, some parts of our tech-heavy world have been with us for decades now, and they show no signs of going away anytime soon. Here’s a look at ten examples of very old computing technology still in wide use today.
The OS from Half a Century Ago
“It’s a Unix system! I know this!” says young Lex as she saves the day in the 1993 film adaptation of Jurassic Park. This line became one of the earliest internet memes, and it has endured; there’s even a whole subreddit devoted to it. The line resonated so strongly because many computer professionals can relate: If you know the Unix operating system, you can sit down at any Unix-like system made in the last 50-plus years and instantly feel at home.
Unix originated at AT&T’s Bell Labs in 1969. Designed from the ground up to be a multitasking and multiuser system (i.e., with the ability to do multiple things for multiple logged-in users all at once), Unix has long been hailed for its innovative design and rock-solid stability. But perhaps the biggest reason users are so loyal is the “Unix philosophy,” a guiding set of design principles that encourages the use of small, useful applications that can easily pipe data to other applications.
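To get a feel for that philosophy, consider how a tiny, single-purpose tool slots into a pipeline. Here’s a minimal sketch in Python (purely an illustration, not part of any Unix system) of a grep-style filter: it reads lines from standard input, keeps only those containing a given word, and writes them to standard output. The file name filter.py is just a placeholder.

```python
import sys

# A tiny, single-purpose tool in the spirit of the Unix philosophy:
# read lines on standard input, keep only those containing the given
# word, and write them to standard output for the next tool to use.
def main() -> None:
    if len(sys.argv) != 2:
        sys.exit("usage: filter.py WORD")
    needle = sys.argv[1]
    for line in sys.stdin:
        if needle in line:
            sys.stdout.write(line)

if __name__ == "__main__":
    main()
```

Because it reads and writes plain text, it chains naturally with other small tools, for example: cat server.log | python filter.py error | sort. Each stage does one job and pipes its result to the next, which is the Unix philosophy in action.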
While AT&T sold Unix licenses for decades, the core concepts also inspired the development of many Unix-like systems. Today, developers can submit their operating systems for certification as a “UNIX-Certified Product” to the current owner of the Unix trademark, the Open Group.
In the world of free and open-source software, the most popular operating systems are Linux distributions, with Linux categorized as a Unix-like system. Linux powers much of the internet’s server infrastructure and has made major inroads as a desktop OS too. It’s pretty incredible how long Unix and Unix-like systems have been around, and how relevant they remain today.[1]
The Ancient Programming Language That Banks Still Run On
When it comes to programming these days, you’re likely to see a lot of references to languages like Go, Rust, and C#. But there’s a programming language that’s been in heavy use since its debut in 1959 and continues to be the backbone of global finance.
COBOL came about when a group of businesses and the United States government saw the need for a common language that could run on the competing mainframes of the day, with an easily understood English-like syntax. Once the language was complete late in 1959, it was immediately embraced by banks, brokerages, and government agencies like the IRS.
Despite the tech industry’s tendency to embrace the “latest and greatest,” COBOL has remained the de facto standard in the financial industry. At the same time, there has been a shortage of COBOL programmers for years now, as young coders tend to learn and specialize in newer languages. Plans by banks and government agencies to migrate away from COBOL keep being put on hold due to the cost and complexity of retiring legacy systems. That means our financial systems still run on a language that is now more than 60 years old.[2]
The Very Popular and Very Old Coding Tool
While the average computer user writes text in a word processor, programmers work in a plain text editor. Put simply, plain text lacks the niceties we see in word processors, like multiple fonts and text justification. Since computers read code written in plain text, coders need a good editor that lets them write and edit plain text efficiently.
Many of the popular plain text editors today are actually IDEs (integrated development environments), which help you keep track of all the files in your codebase and the revisions to that code. Microsoft’s Visual Studio Code is the most popular of these, routinely topping developer surveys. But the minimalist (yet powerful) editor Vim remains a popular choice among coders, which is pretty incredible considering its age.
Vim itself was released in 1991, but its lineage goes back way farther. The Unix app vi (short for “visual”) debuted in 1979 and was itself a newer version of an older tool. A dozen years after vi, Vim appeared, with its name originally meaning “vi imitation,” but now meaning “vi improved.”
A quick look at Vim may scare off newcomers, as it looks like nothing more than a screen of text with no menus or controls. But what makes it so popular with programmers is its modes: Insert mode is for typing text, while Normal mode is for running commands on that text. Normal mode is the secret weapon, enabling quick copying, pasting, and other text manipulation without your fingers ever leaving the keyboard. It’s this speed and power that has kept Vim popular, even though its lineage stretches all the way back to the creation of Unix in 1969.[3]
A Steve Jobs Failure, Reborn as a Success
In what has become the stuff of business legend, Steve Jobs was forced out of Apple in 1985 after a boardroom showdown with John Sculley. Jobs then took $12 million of his own money and founded a new computer company called NeXT. Unveiled in 1988 and shipping in 1989, the company’s first product, the NeXT Computer (better known as “the Cube”), was an immaculately designed yet very expensive workstation. It was priced beyond the reach of most of the universities and researchers Jobs thought would be his target market and far past the budget of the home computer user, and NeXT would go down in history as one of the industry’s most high-profile failures.
However, those who did buy a NeXT computer during the company’s brief existence had nothing but great things to say about its operating system, NeXTSTEP. Built on top of a Unix core, NeXTSTEP was powerful, flexible, and stable in a way that other operating systems of its day were not. When Apple found itself needing a revamped operating system for its Mac line, it purchased NeXT in 1997 for $429 million. For that price, Apple got the rights to NeXTSTEP and brought Jobs back into the company.
Apple’s rise after that to become one of the most successful companies in the world is well documented. But what is often overlooked is the role NeXTSTEP played in that success. It was first retooled as Mac OS X for desktops and laptops, but it is also the basis for iOS on the iPhone, iPadOS on the iPad, and even tvOS on Apple TV boxes. Although it goes by different names now, the 30-plus years of NeXTSTEP make it one of the oldest operating systems still in active development today.[4]
A Standard for Downloading and Sharing Files
If you’ve spent any time downloading files from the internet or sharing them with others, you’ve run across ZIP files. But what are they? In technical terms, ZIP is a compression format, which means it takes existing files and makes them smaller. Once the ZIP file reaches its destination, it can be decompressed to return the files to their original state. This not only saves space but also helps files transfer across networks more quickly, and built-in checksums make it possible to detect corruption along the way. There are many other compression formats, but ZIP has outlasted them all, which is pretty incredible considering that the format is in its fourth decade of use.
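To make that round trip concrete, here’s a minimal sketch using Python’s standard zipfile module; the file names notes.txt and notes.zip are placeholders for whatever you have on hand.

```python
import zipfile

# Compress: write notes.txt into a new archive using the common
# "deflate" method that ZIP popularized.
with zipfile.ZipFile("notes.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.write("notes.txt")

# Decompress: read the entry back out, byte-for-byte identical to the
# original. Each entry carries a CRC-32 checksum that is verified on
# read, which is how corrupted transfers get caught.
with zipfile.ZipFile("notes.zip") as zf:
    original = zf.read("notes.txt")
```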
Created by programmer Phil Katz at his company PKWARE in 1989, the ZIP file format predates the modern internet. Given the very high price per megabyte of hard drive space in the 1980s, ZIP was just one of many compression tools to come out at the time. But its ease of use and its later ubiquity across nearly all computing platforms have made ZIP the standard for file compression ever since.
Another reason for ZIP’s long lifespan is its usefulness in general file handling. Microsoft’s standard Office file formats (for example, DOCX for Word and XLSX for Excel) are actually ZIP files under the hood. This lets Microsoft bundle the many files that make up a document into what appears to be a single file, and it makes the formats easier for other office applications to support.[5]
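You can see this for yourself with a few lines of Python; report.docx below is a placeholder for any Word document you have lying around.

```python
import zipfile

# A .docx file is a ZIP archive in disguise: list what's packed inside.
with zipfile.ZipFile("report.docx") as zf:
    for name in zf.namelist():
        print(name)  # e.g. word/document.xml, docProps/core.xml, ...
```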
The Big Computers of Yesteryear
When thinking of the history of computers, it’s easy to assume that the giant mainframes that once filled whole rooms simply shrank into desktop computers and handheld devices. But the truth is, mainframes are still with us today, performing critical business functions for companies all over the world. A 2021 survey showed that 67 of the Fortune 100 still use mainframes.
Mainframes got their name from the cabinet on early large computers that held the CPU and main memory: the “main frame.” While today’s mainframes are still imposing cabinet-sized machines, their computing power has increased enormously over time. What hasn’t changed is what mainframes are known for: reliably processing many transactions per second.
And in today’s computing landscape, mainframes are learning new tricks. In addition to running legacy systems like COBOL applications, modern mainframes provide a backbone for cloud computing, hosting many virtual machines simultaneously. Far from being a relic of the past, mainframes are a key component of our tech-oriented world today.[6]
The Peripheral That Won’t Go Away
It’s hard to imagine computing without some sort of keyboard, whether for typing text or issuing instructions to the computer. And at this point, it’s safe to assume the computer mouse is here to stay too.
The first prototype mouse was created in 1964 by Douglas Engelbart, then the director of the Augmentation Research Center at Stanford Research Institute in Menlo Park, California. But the mouse’s entry into mainstream culture can really be traced to 1979, when a group of Apple engineers and executives led by Steve Jobs visited the Xerox Palo Alto Research Center (PARC).
It was on this trip that Jobs first saw computers with icons, windows, a mouse, and other technologies that had been developed at PARC. Convinced (quite correctly) that this was the future of personal computing, Jobs took this information back to Apple. By 1983, Apple had shipped its first computer with a mouse, the Lisa, followed by the first Macintosh in 1984.
Since then, the mouse has become an essential part of personal computing. Not bad for a piece of hardware that was designed in the 1960s and, save for some technical improvements and ergonomic changes, remains more or less the same here in the 21st century.[7]
Modern Networking Is Really Old
Another innovation developed at Xerox PARC, one Steve Jobs later admitted he completely overlooked during his 1979 visit, was a workgroup of personal computers networked together and able to share files and resources like printers. It’s something we take for granted today, especially considering the giant worldwide network that is “The Internet.” But it was all made possible thanks to Ethernet.
Bob Metcalfe invented Ethernet in 1973 at Xerox PARC. The company patented it in 1975, and it was later opened up as an industry standard. For wired network connections, Ethernet is now the undisputed standard, and not for lack of alternatives: Token Ring, FDDI, and Apple’s LocalTalk all challenged it at one point or another over the last five decades. Yet Ethernet has outlasted them all.
But what about Wi-Fi? It was conceived as a wireless variant of Ethernet: it is standardized as IEEE 802.11, part of the same family of networking standards as Ethernet, and was long described as “wireless Ethernet.” So while Ethernet has gotten faster and gone wireless over time, it’s essentially the same concept today as what Metcalfe came up with in the 1970s.[8]
The Internet Protocol Predates the Internet
You’ve probably seen your computer’s TCP/IP settings at some point, but what exactly is TCP/IP? Suffice it to say, it’s complicated and best left to qualified network engineers. But from a 20,000-foot view, TCP (the Transmission Control Protocol) governs how data travels reliably over the internet, while IP (the Internet Protocol) defines your address on the ‘net and how data is routed to you. What may be most interesting about the pair is that they were developed years before the dawn of the public internet in the 1990s.
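From a programmer’s point of view, the pair shows up any time you open a network connection. As a rough illustration (not a networking lesson), here’s a minimal Python sketch: IP handles getting packets to the machine behind the address, while TCP provides the reliable, ordered byte stream the two sides talk over. The host example.com and the bare-bones HTTP request are just for demonstration.

```python
import socket

# Ask the OS to open a TCP connection to example.com on port 80.
# Under the hood, IP addresses and routes the individual packets.
with socket.create_connection(("example.com", 80), timeout=5) as sock:
    # TCP presents those packets to us as one reliable byte stream.
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = b""
    while chunk := sock.recv(4096):  # empty result: the server closed the stream
        reply += chunk

print(reply.split(b"\r\n")[0].decode())  # e.g. "HTTP/1.1 200 OK"
```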
The precursor to the public internet was the ARPANET, a creation of the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. TCP/IP was developed and refined there over the 1970s and 1980s, with two men, Robert E. Kahn and Vinton Cerf, credited as the fathers of the protocols. It was very much a work in progress: the pair first proposed TCP in 1974 as a replacement for the ARPANET’s original host protocol, and they later realized the network could not keep growing unless addressing and routing were broken out into a separate protocol. That separate protocol became IP, and on January 1, 1983, the ARPANET switched over to TCP/IP as we know it today.
If it weren’t for the legwork done on the ARPANET, the internet would not have been ready for the public in the 1990s. The work of Kahn and Cerf may be positively ancient in technology terms, but thankfully it has proven robust enough to scale up to the massive global network we enjoy today.[9]
Email Is Nearly as Old as Networked Computing
Even though many of us dread looking at our email inboxes these days, since they’re likely full of spam, promotional offers, and more work to do, email is still an essential part of daily computing. If you have no love for email now, try to imagine how exciting it must have been in its early days, when sending a message between computers seemed like something from the future.
Not surprisingly, the ARPANET is where the story begins. On October 29, 1969, UCLA professor Leonard Kleinrock and his student programmer Charley Kline sat down to send a message over the ARPANET to another programmer, Bill Duvall, at Stanford Research Institute. The message was to be one word: “login.” And the system crashed right after the “o” was typed!
Thankfully, the message went through successfully about an hour later. Strictly speaking, that was the first message ever sent between networked computers rather than an email; the first true network email, complete with the now-familiar @ sign, followed on the ARPANET in 1971, courtesy of programmer Ray Tomlinson. Was email a good thing? That’s for each person to decide, but considering its longevity and the many billions of messages sent since, it’s definitely worth noting that one of the oldest computer technologies is still with us as a part of everyday life.[10]