It seems like nowadays we use computers for nearly everything: working, shopping, banking, communicating, learning new things, playing games, listening to music, watching videos, and more. It’s hard to even imagine life without our modern hardware and digital services!
Yes, the modern world is fantastic, but have you ever stopped to think about where all this amazing technology came from? Who invented it all? Well, behind every great IT product or service there is usually a person or a small team who turned a dream into reality.
We've all heard of Bill Gates, right? He co-founded Microsoft and is one of the richest men in history. Equally famous was Steve Jobs, the person who, along with Steve Wozniak, started Apple Computer way back in 1976. However, there are many other IT pioneers who aren't as well known but who deserve recognition for the work they did in advancing the world of computing.
Let’s start by going way back to 1822, in the middle of the Industrial Revolution. The very first person to ever design a working computer was the legendary Englishman Charles Babbage. His dream machine was humbly called the Difference Engine. It was never actually built during his lifetime, but his plans were finally turned into a working machine in 1991. You can see it for yourself if you visit the Science Museum in South Kensington, London.
Babbage’s Difference Engine was indeed very inspirational, but it was still way beyond what most people at the time had the capacity to understand. In fact, it took Ada Lovelace, the daughter of the poet Lord Byron and a gifted mathematician in her own right, to see the full potential of the project. In 1843 she wrote a program for Babbage’s proposed follow-up machine, the Analytical Engine, about a century before the first real computers were built! That’s incredible! So it’s no surprise that Ms. Lovelace is widely regarded as the “world’s first programmer” as well as the first person to envision computers as more than mere number-crunching devices.
Speaking of crunching numbers, there is a popular meme out there that today’s computers only think in “ones and zeros”. Perhaps some of you have even heard these called Boolean values. But where did that name come from? It comes from none other than Englishman George Boole, who in 1847 described a system of logic built entirely on two values: true and false. Combined with the base-2 (binary) number system, it may sound primitive or even impractical to work with only two values, but this “digital” logic is still the basis of all modern computer processes.
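If you are curious what “thinking in ones and zeros” looks like in practice, here is a tiny sketch in Python (chosen purely for illustration, since the language appears later in this reading) that prints a few ordinary numbers in base 2 and then evaluates some of Boole’s two-valued logic:

```python
# Counting with only two digits: bin() writes an ordinary number in base 2.
for number in range(6):
    print(number, "in binary is", bin(number))   # e.g. "5 in binary is 0b101"

# Boole's two-valued logic: every expression here is either True or False.
print(True and False)   # False
print(True or False)    # True
print(not True)         # False
```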
Skipping almost 100 years into the future, we come to the first documented vision of something similar to the World Wide Web. This was when American engineer Vannevar Bush proposed his 'memex' in 1945, laying the foundations for modern 'hypertext'.
Another notable figure in mid-20th-century computing was Alan Mathison Turing, a British genius known as the "father of computer science". He proposed the famous Turing Test, which is still very influential in Artificial Intelligence as a way of telling a computer system apart from a human being. He was also instrumental in breaking Nazi communication codes, work that helped the Allied forces win World War II.
Edgar Frank Codd, another influential Briton, is known for inventing the "relational" model for databases, which is still in widespread use today. Relational databases are a great example of elegant design — both extremely powerful and simple to learn.
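To give a feel for how simple the relational model can be in practice, here is a minimal, purely illustrative sketch using Python's built-in sqlite3 module; the table and its contents are made up just for this example.

```python
# A tiny, self-contained taste of the relational model: data lives in
# tables of rows and columns, and is retrieved with declarative queries.
import sqlite3

conn = sqlite3.connect(":memory:")  # a throwaway in-memory database
conn.execute("CREATE TABLE pioneers (name TEXT, invention TEXT)")
conn.execute("INSERT INTO pioneers VALUES (?, ?)", ("Edgar Codd", "relational model"))
conn.execute("INSERT INTO pioneers VALUES (?, ?)", ("Grace Hopper", "FLOW-MATIC"))

for name, invention in conn.execute("SELECT name, invention FROM pioneers"):
    print(name, "->", invention)

conn.close()
```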
Simple may be good for databases, but as other types of software applications became more complex, programmers needed more expressive ways to instruct computers what to do. One of the first pioneers in this regard was Grace Hopper, an American woman credited with creating one of the first “high-level” programming languages, FLOW-MATIC, in 1955. FORTRAN, another early high-level language, was developed by a team led by American John Warner Backus and released in 1957.
The 1960’s will forever be remembered as the decade that America “put a man on the moon”. But at the end of the day, it was a woman who helped make this possible: in 1963, American engineer Margaret Hamilton joined the MIT team writing the Apollo flight software and went on to lead its development. Hamilton’s software, comprising over 600,000 lines of code, has been credited with saving the moon mission from at least one potentially fatal disaster.
In the early 1970’s the microprocessor was invented and soon went into mass production. Now computers weren’t just for Fortune 500 companies anymore. This innovation came from a company called Intel, founded in 1968 by Americans Robert Noyce and Gordon Moore. Mr. Moore became especially famous for “Moore's Law”, his prediction that the power of computer chips would keep increasing rapidly over time.
Around the same time as the microprocessor, Americans Dennis Ritchie and Ken Thompson were working together at Bell Labs, where they invented the C programming language and the original UNIX operating system. These two monumental projects have served as either the foundation or inspiration for nearly every single software project since. Today, it’s actually pretty hard to name a current IT system which uses neither a C-inspired programming language nor a UNIX-derived operating system.
But not everyone needs to know C or UNIX to succeed in an IT career. All most people need is an easy-to-use, structured, and more “general” programming language that runs on any OS. In the late 1980’s the Dutchman Guido van Rossum invented Python. You may have heard of it. Python is easy to learn, pleasant to read, and powerful enough to be used in nearly any situation. And it even has a cool name, inspired by the Monty Python comedy troupe rather than the snake! If you only learn one programming language in life, it should be this one.
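As a small, entirely made-up taste of that readability, here are a few lines of Python that read almost like plain English:

```python
# A made-up example: find which languages in a list start with "P".
languages = ["Python", "Ruby", "JavaScript", "PHP"]

for language in sorted(languages):
    if language.startswith("P"):
        print(language, "starts with the letter P")
```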
Moving on to the early 1990’s we have two more incredible innovators. First up is Briton Tim Berners-Lee, who started to see the success of his World Wide Web project while working at CERN, the particle physics laboratory in Switzerland. Then we have Finland’s Linus Torvalds, who pulled off what many software engineers before him had tried and failed to do: create a free, UNIX-like operating system that ran on commodity PC hardware. As much as anyone in the history of IT, these two guys are superheroes. Arguably, their principled “open source” software saved the IT world from potential monopolists such as AOL and Microsoft.
Building on the success of these two seminal projects, in 1994 a Danish man named Rasmus Lerdorf invented the PHP programming language. He was seeking an easy way to add more interactivity to web servers. Soon after, American Brendan Eich invented JavaScript to create interactivity on the web browser side. Since that time these two languages have evolved to power the vast majority of websites.
Now PHP and JavaScript are great languages for web applications, but you can’t exactly run a government or even a large corporation on them. Those tasks are better left to multi-threaded enterprise languages that run inside a virtual machine. So in 1995, Canadian James Gosling created Java, and the rest is history. Java and its copycat projects (such as C# and many more) are used practically everywhere that systems are considered both large and mission critical.
So far we’ve mentioned a lot of North Americans and Europeans, but of course there have been plenty of influential people from all over the world. Meet the creator of a PERSONAL FAVORITE programming language: Ruby! While not as popular as some of the other projects mentioned above, Ruby is considered by many to be especially expressive and beautiful to write. It is the work of Japanese computer scientist Yukihiro Matsumoto, whose philosophy is that “happy programmers are productive programmers”. Definitely check out Ruby and the web framework Ruby on Rails if you get the chance. You won’t regret it.
Well, that’s all folks! Remember, this is still a very short and incomplete list. Apologies to the people who were not included. But now you can see that through creativity and hard work, a relatively small group of IT innovators have contributed to and shaped our modern world.
So every time you boot up a computer, play a video game, or scour the latest notifications on your smartphone, try to remember the individuals who made these wonders possible.
This reading was originally written and submitted by Dan Rieb. It was revised in mid-2022 by Larry Zoumas. Special thanks to Audrey Graffagnino who inspired the recent rewrite.
- Pick any person from this reading and write a short essay on their life. How do they inspire you?
- Will you ever be considered a famous or great IT person? Write why or why not.
- Pick any IT person NOT mentioned in this reading and write a short essay on their life. How do they inspire you?