We spend so much of our lives these days on computers that it becomes easy to forget what remarkable pieces of technology they are. In an age when almost every part of life is increasingly run or aided by computers, it is surprising how few of us know the fascinating history behind them. After all, many of us spend our days working on personal devices, yet computers are also at the core of everything from car engines and cash registers to toilets and alarm clocks.
Spanning the length of the 20th century, with roots as far back as the industrial revolution and beyond, the story of the computer is also the story of human practices over time. To examine how our lives came to be so entirely computer-controlled is to simultaneously examine how humans have developed into the diverse and hyperconnected species we are today.
The word computer originally referred to a person who does calculations. According to the Oxford English Dictionary, the first mention of the word in print was in 1613. Early computers were simple machines that counted or performed other calculations. An abacus, for example, is a type of computer, meaning that many of us actually played with computers as infants long before we ever had access to a desktop, laptop, or tablet!
However, there are many other examples of early computing devices besides the abacus: tally sticks and clay counting tokens dating from prehistoric times have been found, which were probably used to count things like heads of livestock or amounts of grain. The Antikythera mechanism, recovered in 1900 from the wreck of a ship off the Greek island of Antikythera that probably sank around 65 BC, contains a series of small gears that researchers believe were used to calculate the positions of the planets.
Based on these early computing objects, it would seem that the need to calculate such information has always been crucial to our ability to run our civilizations. It makes sense that as our social systems became more complex, so did our computing devices. The first proper computers were analog computers that made physical models of specific computing problems to solve them, like the tide-predicting machine invented by Lord Kelvin in 1872. The first digital computers made their appearance in the early 20th century, a development that would change the way we understand the world forever.
The Modern Computer
The first devices that we would recognize as computers today were electromechanical, using electricity to move mechanical calculating parts. The most famous early example is the Z3, completed by Konrad Zuse in 1941 and generally considered the first working programmable, fully automatic computer; it was still built from telephone relays, with fully electronic machines following a few years later. With this first fully automatic computing machine came the first practical examples of programming, the process by which a human gives the computer a set of instructions to carry out.
While the Z3 and other early machines read their programs from external media such as punched film, so a human had to intervene every time a new program was needed, Alan Turing had already described in theory a machine that could hold its own programs in memory. The first computer to realize this stored-program idea was the Manchester Baby, built at the University of Manchester in 1948. Though it supported only seven instructions, the Baby paved the way for computers that could remember and hold countless sets of instructions, supporting complex software for different uses.
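The core of the stored-program idea is that instructions and data live in the same memory, so a program can be loaded, changed, or even modified like any other data. The toy sketch below illustrates this in Python; it is a hypothetical mini-machine loosely inspired by the Baby's tiny, subtraction-only instruction set, not an accurate emulation of the real hardware.

```python
# Toy sketch of a stored-program machine (hypothetical, loosely inspired
# by the Manchester Baby; not an accurate emulation of the real machine).
def run(memory):
    """Instructions and data share one memory list: instructions are
    (opcode, address) pairs, data cells are plain integers."""
    acc = 0   # single accumulator register
    pc = 0    # program counter
    while True:
        op, addr = memory[pc]
        pc += 1
        if op == "LDN":      # load the negated value of a memory cell
            acc = -memory[addr]
        elif op == "SUB":    # subtract a memory cell from the accumulator
            acc -= memory[addr]
        elif op == "STO":    # store the accumulator into a memory cell
            memory[addr] = acc
        elif op == "STP":    # stop
            return memory

# The program occupies cells 0-3; its data live in cells 4-6 of the SAME list.
# Like the Baby, this machine has no ADD: it adds by negating and subtracting.
program = [
    ("LDN", 4),   # acc = -7
    ("SUB", 5),   # acc = -7 - 5 = -12
    ("STO", 6),   # memory[6] = -12, i.e. -(7 + 5)
    ("STP", 0),
    7, 5, 0,      # data: a = 7, b = 5, result cell
]
result = run(program)
print(result[6])  # prints -12
```

Because the program is just data in memory, swapping in a new program means overwriting a few cells rather than rewiring or re-punching anything, which is exactly what made stored-program machines so much more flexible than their fixed-program predecessors.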
Hardware and Software
Early attempts to build computers were mostly concerned with hardware: the tangible, physical parts that made the machine run. But after the stored-program computer arrived in the form of the Manchester Baby, hardware technology began developing rapidly. As the hardware became increasingly capable (and increasingly small), designers gained far more freedom to decide how, exactly, the computer did its computing work, and much of their focus shifted to operating systems and software.
Different software emerged according to the tastes and beliefs of those individuals: programming languages reflected how different people thought computers should run. The most familiar of these individuals are probably Bill Gates and Steve Jobs, who co-founded Microsoft and Apple, respectively. Both men benefitted hugely from access to computers at a time when such access was rare: Gates and Jobs worked on machines at their schools, becoming familiar with the hardware of these early devices and mastering the early languages of programming. Each went on to use that knowledge to design wildly popular software that shaped the computing experience for ordinary users, and the companies they built, now ubiquitous as the brands Microsoft and Apple, went on to change the way computers fit into daily life.
Programming and the Future
Apple and Microsoft remain the dominant software brands on the market, but those who wish to understand software or hardware no longer need the incredible luck of Bill Gates and Steve Jobs to learn. Many indie programmers and developers have emerged who are challenging the design of the computer as we know it with open-source software and operating systems. After all, anyone with an interest in computers can find programming courses online to gain a basic understanding of software design in a few simple lessons.
But there are also more in-depth courses that can train individuals to be full-time computer programmers. Since computers are so ubiquitous, there is an ever-increasing need for people who can design and program them. Though these machines remain complex, completing a master's in software engineering online has never been easier. So long as a person has an interest and access to the internet, they can soon acquire the skills needed to influence the development of the computer in the future.
What will the future of computers be? Since the 2010s, mobile computers (tablets, smartphones, etc.) have been increasingly popular, and many believe these represent the future of computer development. Certainly, the design of ever-smaller computers will help facilitate an even wider range of ‘smart’ technologies. There is no doubt that the computer will continue to play an even greater role in our development as a species as we continue to progress and connect.