10 Cool Facts About The Invention of the Computer

The computer defines our modern way of life. For most of us, work involves sitting behind or interacting with a computer of some sort; even children use them to do research and type up school projects.

We give little thought to the amazing computers that fuel our modern way of life (except to complain when they are too slow), but they are unbelievably fascinating pieces of equipment. Here is a list of 10 cool facts that you never knew (but really should know) about the invention of the computer.

10. The word ‘Computer’ originally referred to a person, not to a machine

Computer was a term originally used to refer to sexy robots, oops. I mean people.

These days when we refer to a ‘computer’ everyone knows that we are talking about a machine. When the term was originally coined, however, it meant something rather different. In 1631 the verb ‘to compute’ was recorded as being used for the very first time, meaning to make a calculation. As is so often the case, the verb led to a descriptive noun, and a few years later, in 1646, we see the first recorded use of the noun ‘computer’, referring to someone who was able to make mathematical calculations, someone able ‘to compute’. It was only in the 20th century that the word gradually came to be associated with an electronic computing machine. Over time these machines were modified to do far more than mathematical computations, giving rise to the modern-day computer.

9. Mechanical computing devices have been in use for thousands of years

Computers, or computer-like devices, have existed for eons.

Prior to the invention of the computer as we know it, our ancestors relied on a range of alternative methods of calculation.

The ancient Babylonians used a form of abacus from around 2400 BC, and it became (and remains) a popular calculating device in the Far East. The Chinese developed a number of techniques for using the abacus which allowed them to perform complex calculations including multiplication, division, square roots and even cube roots.

Abaci are, however, relatively simple counting devices. The oldest analog computer yet found, recovered from the Antikythera shipwreck, is believed to date back to around 100 BC. This Antikythera Mechanism, a complex mechanical astrolabe that tracked the movements of the solar system, is thought to have been used to calculate astronomical positions. It was the most complex machine of its time, so much so that nothing approaching its level of complexity is known from the subsequent 1,000 years.

By the 1600s people were starting to rely on the slide rule to assist them in making difficult calculations involving logarithms, reciprocals and trigonometry. These devices were so reliable that they remained in common use as late as the 1960s and 70s (some fields, such as aviation, still use a modified slide rule today).

All these devices were used to help their operators make mathematical calculations, but modern computers do so much more. Indeed, the most common daily use of a computer is as a typewriter/word processor and a tool for communicating on social media. This function too has its ancient antecedents. In 1770, more than 200 years before Twitter took the world by storm, a watchmaker from Switzerland, Pierre Jaquet-Droz, built a doll that could be programmed to write short messages (up to 40 letters long) using a quill and ink.

Most people, prior to the invention of the computer and calculator, had to rely on a combination of a slide rule and printed mathematical tables, which were often full of mistakes. Calculations took a long time and had to be double- and triple-checked.

8. Charles Babbage invented the first modern computer in 1821

Small part of Babbage’s mechanical calculating engine, his Difference Engine, an invention to which he dedicated his life.

By the 1800s the increasing complexity of the calculations required in daily life was leading people to become frustrated with the limitations of tables and other available computing devices. In around 1820 an Englishman called Charles Babbage was tasked with making some improvements to the tables used in the Nautical Almanac. Frustrated with the length of time the process took, with its requisite delays for double-checking, Babbage started to wonder whether the calculations could instead be performed by a steam-driven machine. By 1822 he had devised a mechanical calculating machine which he called a ‘difference engine’. The government funded him to develop the engine, but the project was suspended in the 1830s; by that time all of the relevant parts of the calculating mechanism had been made but never assembled. Had it been, it would have weighed more than 2 tons!

In the process of researching his difference engine Babbage realized that the principles could be applied to much more than simple navigational computations, and went on to design his analytical engine. This machine was designed to use the punch cards then commonly in use on weaving looms (and similar to those used in the early computers of the 1900s). It had a number of features, including a logic unit, conditional branching and integrated memory, that make it the first ever design for a modern computer. The machine even had its own printer. Nothing like it would be developed for another century; it was ahead of its time and, indeed, its development was hampered by the fact that all the parts had to be made by hand.

7. The concept of what was to become the modern computer was defined in 1936

Alan Turing, genius.


In 1936 Alan Turing developed the idea of the modern computer by introducing the concept of the ‘universal machine’: a single, fully programmable machine that could compute anything that is computable.
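To give a feel for Turing’s idea, here is a minimal sketch (not from this article; the rule format and the little ‘add one’ program are our own illustration) of a machine whose behaviour is fixed entirely by the table of rules it is handed, so swapping the table makes the same machine compute something different:

```python
# A minimal sketch of Turing's idea (our own illustration): one machine whose
# behaviour is set entirely by a table of rules it is given, so swapping the
# table changes what it computes. The rule format here is a simplification.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run a one-tape machine. rules maps (state, symbol) ->
    (new_state, symbol_to_write, move), with move -1 (left) or +1 (right).
    The machine stops in the state "halt" or when no rule applies."""
    tape, head = list(tape), 0
    for _ in range(max_steps):
        if state == "halt" or (state, tape[head]) not in rules:
            break
        state, write, move = rules[(state, tape[head])]
        tape[head] = write
        head += move
        if head < 0:                 # grow the tape with blanks as needed
            tape.insert(0, blank)
            head = 0
        elif head >= len(tape):
            tape.append(blank)
    return "".join(tape).strip(blank)

# A tiny example "program": add 1 to a binary number written on the tape.
increment = {
    ("start", "0"): ("start", "0", +1),  # scan right over the digits
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),  # past the last digit: turn around, add 1
    ("carry", "1"): ("carry", "0", -1),  # 1 + 1 = 0, the carry keeps moving left
    ("carry", "0"): ("halt",  "1", -1),  # 0 + 1 = 1, done
    ("carry", "_"): ("halt",  "1", -1),  # carried past the leftmost digit
}

print(run_turing_machine(increment, "1011"))  # prints 1100 (11 + 1 = 12)
```

Hand the same loop a different rule table and it carries out a different computation, which is the essence of the ‘universal machine’.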

The US Navy was interested in using computers to calculate the trajectory of torpedoes fired from submarines in order to make them more accurate. They developed the Torpedo Data Computer, an integrated fire-control system that continuously tracked the target and aimed the torpedoes, and it was more sophisticated than any other firing system of its time. It relied on a system of electrical switches to drive mechanical relays.

In Germany, computer pioneer Konrad Zuse developed computers that relied on the binary system of calculation (Babbage’s machines had operated on a decimal system). His computers were the first to be fully programmable and had separate storage. His first machines were destroyed in Allied bombing raids on Berlin, but Zuse was undeterred. In 1945 he developed Plankalkül, the first programming language, and in 1946 he founded the world’s first computer company, raising funding through an IBM option on his patents.

6. World War II was a major catalyst for computer development

Electronic Numerical Integrator and Computer

One of the biggest challenges of the war was to crack the German ‘Enigma’ codes that directed the U-boat operations against the Allies. The British operation at the secure Bletchley Park site famously cracked the codes. Once this had been done, thought turned to the possibility of cracking the even more sophisticated ‘Tunny’ communications that carried high-level army intelligence. Tommy Flowers, who had been instrumental in converting telephone exchanges to electronic networks, was brought in to construct a machine that could run the decryption.

Flowers designed Colossus, which was the first electronic digital programmable computer in the world. It was moved to Bletchley Park in 1943 and immediately set to work on breaking the encryptions.

On the other side of the Atlantic, the first programmable computer built in the US was ENIAC, the Electronic Numerical Integrator and Computer. Programming was done by setting its cables and switches in a defined manner; once a program was complete the cables had to be reset. The machine was used to calculate ballistic trajectories, weighed in excess of 30 tons and used over 18,000 vacuum tubes.

Both machines were kept very secret during the war. At the cessation of hostilities, however, the existence of ENIAC was made public while Colossus was kept secret (presumably to protect British encrypted messages). Flowers was unable to garner any public acclaim for his much more sophisticated machine, which had been in operation two years before the US version. All but two of the machines were destroyed, with the remaining Colossi being used for training.

Colossus and ENIAC remain important in the history of computing, not just because of their contribution to the Allied victory against tyranny but also because they established, once and for all, that large-scale computers were not only possible but practical.

5. Microprocessors were key to the development of the modern computer

Intel 4004 Microprocessor

The early machines such as Colossus and ENIAC were limited by the fact that they had to be completely rewired whenever a new program was run. By the late 40s and early 50s developers were working on a new concept, the stored-program computer, that would eliminate the need to rewire each time. The first computer with RAM storage, the ‘Baby’, started work as a test project in 1948, and by 1951 the first commercially available general-purpose computer, the Ferranti Mark 1, was at work.

These computers were still very large due to the need to use vacuum tubes. From 1955 on these tubes were replaced with transistors, leading to a reduction in computer size and heat output. These were, in their turn, replaced with integrated semiconductor circuits, the direct antecedents of the microprocessor which made modern computers possible. The first commercially available microprocessor was the Intel 4004, launched in 1971; it used the new silicon-based technology to miniaturize processing to a far greater extent than previously thought possible. These chips were sold as building blocks which could be used to customize computers to the client’s specification; the 4004 was mostly used in calculators and could access 4 KB of memory. This seems minuscule by today’s standards but it was revolutionary for the time.

4. The first hard drives were bigger than fridges!

The first computers would give this stainless behemoth a run for its money in the size department.

Storage has always been an issue for those using computers – in fact, for many of us, storage is what drives our need to upgrade, whether it is our PC, our laptops, tablets or telephones. The original computers stored their data on large drums; these were rapidly replaced with disk storage, which arrived on the scene in the 1950s. The first disk drives were the size of two large fridges and held 5 MB of data on 50 separate 24-inch disks.

In 1980 there was a step change, and the precursor of the hard drive as we know it today was developed. Seagate (still a leader in the field of computer storage) developed a 5 MB hard disk drive which could be used with microcomputers. Over time these devices grew smaller and smaller, and by 1998 it was possible to store 340 MB on a 1-inch disk. From this date the pace of change became exponential: by 2004 a 0.85-inch drive could store 2 GB of data, and by 2015 manufacturers were talking about terabytes per square inch!

It is not just hard disks that have undergone a change; removable storage has evolved too. In the 1970s the floppy disk arrived, the first ones huge at 8 inches (and really floppy). They were the primary means of distributing programs for use on computers and for storing documents. In subsequent years floppy disks gave way to the smaller 3.5-inch rigid disks (still called floppies) that could fit, conveniently, in a shirt pocket, and then later to CD-ROMs.

These days programs are typically downloaded over the internet, and documents are stored either in the cloud or on USB drives. For those who still want access to the contents of their floppies, however, plug-in floppy drives with modern USB connections are available.

3. The famous Ctrl/Alt/Delete shortcut is a mistake!

Control alt delete was a mistake. But it’s so handy.

Everyone knows that the first thing to do when a computer crashes is to press Control, Alt and Delete simultaneously. So popular has this maneuver become that it even has its own name, the ‘three-fingered salute’.

This handy shortcut was, however, never meant to be released to the public at large. It was designed as a ‘soft reboot’ shortcut by IBM developer David Bradley, who had become fed up with the length of time it took to reboot a computer, with full memory retests, every time there was a small glitch. The function was originally designed to use the Control, Alt and Escape keys, but because these could all be pressed with the left hand it was too easy to restart the computer by accident. The combination was therefore switched to the now familiar Control/Alt/Delete, which needs two hands to implement.

This function was meant purely for internal use; it was never intended to be published. It was included, by accident, in an IBM technical manual. The shortcut found a popular use, however, when Microsoft Windows became the operating system of choice. The early versions were buggy and would crash easily, and gradually people became aware that the Control/Alt/Delete shortcut could be used to save time and frustration when rebooting the system. The shortcut works on almost all operating systems (Mac being the exception).

2. Viruses have been around for almost as long as computers

Computer viruses have been around for a long, long time.

The sad fact is that where there are computers there will be computer viruses.  These programs are designed to travel between computers (via the internet or infected storage media) and force the system to grind to a halt by using up all available space.  Some are even able to delete material already stored on the computer.

The concept of the computer virus was discussed as early as the 1940s, and by the early 1970s a program called ‘Creeper’ was spreading across ARPANET (see below). As computers became more affordable and proliferated in private homes, viruses became more damaging; one of the first to come to public attention was ‘Elk Cloner’, which was spread through floppy disks. Designed by a 15-year-old, this virus was more of a prank than a harmful piece of code, but it was a sign of things to come. By the 1990s, with the majority of computers running Windows, viruses were able to exploit weaknesses in the system, leading to the need for companies to develop effective safeguards.

Unfortunately viruses will always find a way, and as protections and antivirus scans on disks and CDs became routine, viruses moved to email as the best way to infect new computers. These days we know not to click on links that look suspect, but it was not always so. The first major email virus, known as Melissa, hit computers in 1999, spreading through infected Word attachments. The virus emailed itself to the first 50 contacts in an infected computer’s email address book. Each new computer would infect 50 more, meaning that the virus could grow very quickly in a short space of time (see the rough sketch below). Some companies had to shut down their email systems to prevent it spreading.
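As an idealized back-of-the-envelope illustration (our own sketch, not figures from any report on the outbreak: it assumes every one of the 50 recipients is a new, unprotected machine that forwards the message again), the multiply-by-50 pattern runs away very quickly:

```python
# Hypothetical sketch: if every newly infected machine really did mail 50
# previously uninfected contacts, the infected count multiplies by 50 per hop.
infected = 1
for hop in range(1, 5):
    infected *= 50
    print(f"after hop {hop}: {infected:,} machines")
# after hop 1: 50
# after hop 2: 2,500
# after hop 3: 125,000
# after hop 4: 6,250,000
```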

Viruses are still out there but we now have better antivirus protection and are savvier about what links we open so they tend not to cause the problems they once did.

1. The Internet was conceived as early as the 1950s

In the 1950s people wanted Instagram. That can’t be right.

It is hard, these days, to imagine life before the internet. Once we had to look up facts in a physical encyclopedia (or, for those of us who are younger, on a CD-ROM encyclopedia on the computer). These days information on just about anything is available at the tap of a finger. The internet as we know it may seem to be a relatively recent phenomenon, but it has, in actuality, been around almost as long as computers themselves.

In the early days of computing it was discovered that computers could be networked to coordinate information. This was used to develop the US-based SAGE air defense system in the 1950s, which became operational in 1963. By 1964 commercial equivalents such as the SABRE air travel reservation system were in operation.

In the late 1960s the limitations of a single computer led technicians to network four separate computers together, based at UCLA, Stanford, Santa Barbara and Utah. The project was called ARPANET and allowed email and file transfers to take place. Most of the early uses for this network were defense based, but more and more computers joined, and the military developed their own network, MILNET, in 1983. More and more universities started to develop their own networks (called Local Area Networks, or LANs), which began connecting to other LANs on information-sharing networks. By 1986 the internet as we know it was on its way, and ARPANET finally shut down in 1990, just after the creation of the World Wide Web (www). The internet grew quicker than almost any other technological advance in the history of mankind, going from just 150 computers at the beginning of the 1980s to over 200,000 at the end of the decade, and to 8.7 billion (more than the human population of Earth) in 2012.


The computer has changed beyond recognition in less than a lifetime. In the 60s and 70s punch-card computers took up whole rooms, and employees at businesses lucky enough to have them had to book time to do their calculations. Even the pocket calculator, which seems to us today to be unbelievably basic and which is now nothing more than an app on a cheap phone, was beyond the reach of the average worker. Most people who did calculations had to use slide rules, and word processing did not exist; handwritten or typewritten documents were the norm.

Fast forward to today and it seems inconceivable that we once lived like that. Almost unbelievably, everyone with a smartphone has access to processors more powerful than the computers that sent man to the moon, and we use these mini supercomputers not for remarkably complex feats of calculation but to browse the internet, connect on Facebook and play Candy Crush! Back in the 60s people would watch science fiction shows such as Star Trek and marvel at the ‘tricorders’ and communicators. Now our technology looks sleeker and is more efficient. We are truly living in a wondrous age.