Do we really need to tell you about the impact the computer has had on modern society?
No other invention changed our world like the computer. Today, we’re going to tell you where the computer came from and explain where it could be going in the future.
Early Foundations for the Computer
The computer is a relatively new invention. Even the most basic computers didn’t appear until the 20th century.
However, the idea for a computer has its roots way back in ancient times. For thousands of years, humans have sought to use mechanical processes to quantify, understand, and compute the world around them.
One timeline of computer history starts as far back as 50,000 BCE, for example, when “the first evidence of counting” appeared.
Other notable early milestones on the road to the modern computer include the following:
- 3400 BCE: Egyptians develop a base-10 counting system, which made it easier to work with large numbers.
- 2600 BCE: Ancient Chinese introduce the abacus.
- 1350 BCE: The Ancient Chinese introduce the first decimal numeral system.
- 300 BCE: The Salamis Tablet (a Greek counting board), the Roman calculi, and the Roman hand abacus come into use, all of which functioned in a similar way to the modern abacus.
- 260 BCE: The Mayans develop a base-20 system of mathematics, which notably introduces the concept of zero.
- c. 1500: Leonardo da Vinci sketches a design for a mechanical calculator, though the device was never built in his lifetime.
- 1605: Francis Bacon devises a cipher that encodes messages as sequences of A’s and B’s, now known as the Baconian cipher (see the short sketch after this list).
- 1613: The word “computer” appears for the first time, describing a person who was skilled at performing mathematical calculations, or “computations”. That meaning would remain essentially unchanged until the end of the 1800s.
- 1614: John Napier introduces the idea of logarithms.
- 1617: John Napier introduces a rudimentary calculating device called Napier’s Bones, a set of rods made from bone, horn, and ivory. The rods let the user multiply by adding numbers and divide by subtracting them.
- 1623: The first known mechanical calculating machine is invented by Germany’s Wilhelm Schickard. Schickard’s work was largely influenced by Napier’s Bones, which we just mentioned.
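Bacon’s cipher is an early ancestor of the binary encodings modern computers use: every letter becomes a fixed-length pattern of just two symbols. Here is a minimal sketch of the idea in Python, using the simplified 26-letter variant rather than Bacon’s original 24-letter table (the function names are ours, purely for illustration):

```python
# Simplified 26-letter variant of Bacon's cipher: each letter maps to a
# 5-symbol string of A's and B's, effectively a 5-bit binary code.
# (Bacon's original table used 24 letters, with I/J and U/V sharing codes.)

def to_bacon(text: str) -> str:
    """Encode the letters of `text` as space-separated 5-letter A/B groups."""
    groups = []
    for ch in text.upper():
        if "A" <= ch <= "Z":
            index = ord(ch) - ord("A")                 # 0..25
            bits = format(index, "05b")                # e.g. 'H' (7) -> '00111'
            groups.append(bits.replace("0", "A").replace("1", "B"))
    return " ".join(groups)

def from_bacon(code: str) -> str:
    """Decode space-separated A/B groups back into letters."""
    letters = []
    for group in code.split():
        index = int(group.replace("A", "0").replace("B", "1"), 2)
        letters.append(chr(ord("A") + index))
    return "".join(letters)

print(to_bacon("Hi"))               # AABBB ABAAA
print(from_bacon("AABBB ABAAA"))    # HI
```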
1600s and the Invention of the Slide Rule
During the 1600s, slide rules came into use for the first time. The slide rule was a mechanical analog computer used mainly for multiplication and division; it could also handle roots, logarithms, and trigonometric functions. Because its scales are logarithmic, sliding one scale along another adds logarithms, which multiplies the underlying numbers. Unlike modern calculators, the slide rule could not be used for ordinary addition or subtraction.
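To see why sliding two logarithmic scales past each other multiplies numbers, here is a minimal sketch in Python of the arithmetic a slide rule performs mechanically (the function names are ours, purely for illustration):

```python
# The principle behind a slide rule: adding logarithms multiplies numbers,
# and subtracting logarithms divides them (log(ab) = log a + log b).
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply by adding logarithms, as two sliding log scales do."""
    return 10 ** (math.log10(a) + math.log10(b))

def slide_rule_divide(a: float, b: float) -> float:
    """Divide by subtracting logarithms."""
    return 10 ** (math.log10(a) - math.log10(b))

print(slide_rule_multiply(3, 7))   # ~21.0 (a real slide rule gives ~3 significant digits)
print(slide_rule_divide(84, 4))    # ~21.0
```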
Slide rules were heavily influenced by the work of John Napier, who we mentioned above. However, credit for the invention of the first slide rule goes to the Reverend William Oughtred. Oughtred (and several others during this time period) would develop multiple types of slide rules throughout the 17th century.
The slide rule remained enormously popular until the invention of the pocket calculator. Right up until the 1970s, when scientific calculators were introduced, slide rules were a booming business across America.
The 1800s and Punch Card Systems
Throughout the 1800s, three major inventors theorized that you could combine punch cards with mechanical processes to create a computer.
These early inventors included French inventor Joseph Marie Jacquard. In 1801, Jacquard invented a loom that used punched wooden cards to automatically weave fabric designs. Over a century later, the world’s first computers would use similar punch cards.
In 1822, Charles Babbage conceived of a steam-driven calculating machine that could compute tables of numbers. Babbage was on the right track. However, despite receiving financing from the British government, his machine was never completed.
Then, in 1890, inventor Herman Hollerith created a punch card system to tabulate that year’s census. The system reportedly completed the task in just three years, saving the government some $5 million. You might recognize Hollerith’s name: he later went on to establish the company that eventually became IBM.
1936 and Alan Turing’s Computer
Alan Turing is often credited with laying the foundation for the modern computer. In 1936, Turing presented the notion of a “universal machine”, now known as the Turing machine. A few years later, Turing helped put related ideas into practice: as dramatized in The Imitation Game, the codebreaking machines he helped design were famously used to crack the Enigma encryption system during World War II.
A Turing machine manipulates symbols on a strip of tape according to a table of rules. Its most powerful property is its generality: given a description of any computer algorithm, the machine can simulate that algorithm’s logic.
Alan Turing called his machine an “a-machine”, or automatic machine.
Today, the Turing machine is viewed as the original model of the general-purpose computer, and Turing himself is widely regarded as one of the founders of artificial intelligence.
One of the major differences between a Turing machine and a modern computer is memory: a Turing machine reads and writes a sequential tape rather than using random access memory (RAM).
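To make the “tape plus a table of rules” idea concrete, here is a minimal Turing machine simulator sketched in Python. The rule table below is a toy example of ours (it simply flips 0s and 1s until it reaches a blank), not anything Turing himself specified:

```python
# A minimal Turing machine: a tape of symbols, a read/write head, and a table
# of rules mapping (state, symbol) -> (symbol to write, move, next state).

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Toy rule table: flip each bit and move right; halt when a blank is read.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110", rules))  # -> 01001_
```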
J.V. Atanasoff Creates the First Computer with Memory
While Turing gets most of the credit for conceiving the first thing that looked like a modern computer, J.V. Atanasoff is credited with building the first computer able to store information in its main memory.
Atanasoff was a professor of physics and mathematics at Iowa State University. In 1937, he attempted to build the world’s first computer without any gears, cams, belts, or shafts.
A few years later, in 1941, Atanasoff and his graduate student Clifford Berry succeeded, designing a computer capable of solving up to 29 simultaneous linear equations. For the first time in history, a computer was able to store information in its main memory.
The Grandfather of Digital Computers, the ENIAC
Beginning in 1943, John Mauchly and J. Presper Eckert of the University of Pennsylvania built the Electronic Numerical Integrator and Computer (ENIAC).
This pair is credited with inventing the “grandfather of digital computers”. The machine was enormous: it contained roughly 18,000 vacuum tubes and filled a 20-by-40-foot room.
After receiving funding from the Census Bureau to build a follow-up machine, the pair left the University of Pennsylvania. That follow-up machine, the UNIVAC, would go on to become the first commercial computer for business and government applications.
Computers Become Smaller in 1947
Part of the reason early computers were so big was their reliance on vacuum tubes. As mentioned above, the ENIAC alone contained roughly 18,000 of them.
The pair behind the ENIAC must have felt a little silly when, in 1947, three researchers at Bell Laboratories, John Bardeen, Walter Brattain, and William Shockley, invented the transistor. The transistor made it possible to build an electronic switch from solid materials, with no vacuum required. Computers were still enormous by today’s standards, but the transistor paved the way for ever-smaller machines.
An Early Computer Language, COBOL, is Created in the Late 1950s
Grace Hopper is credited as the driving force behind COBOL, one of the earliest high-level programming languages. COBOL stood for Common Business Oriented Language.
Prior to COBOL, Hopper had invented a predecessor language called FLOW-MATIC. COBOL first appeared in 1959. By 1999, one firm estimated that roughly 200 billion lines of COBOL were in existence, running 80% of all business programs.
1958 and the Invention of the Computer Chip
The world’s first computer chip arrived in 1958, when Jack Kilby of Texas Instruments demonstrated the first working “integrated circuit”; Robert Noyce independently developed his own version shortly afterward. We now know that device as the computer chip. In 2000, more than four decades later, Kilby received the Nobel Prize in Physics for his work.
1964: The World’s First GUI Makes Computers Accessible to the Public
In 1964, researcher Douglas Engelbart showed off a prototype of the modern computer. This computer is credited as the first one with a “modern” graphical user interface (GUI) as well as mouse support. For the first time, researchers believed computers could be made more accessible to the general public.
Of course, computers were still a long way from being available to the public. But for the first time, people realized that computers could be more than just scientific and mathematical tools: they could be used by average people.
1969 to 1973: Paving the Way for Modern Computers
Between 1969 and 1973, a number of critical inventions hit the market that would pave the way for the modern computer systems we know today.
In 1969, for example, a team of developers at Bell Labs created an operating system called UNIX. Designed to be portable rather than tied to a single machine, UNIX could run across different hardware platforms, and it became the operating system of choice for larger companies with mainframes. It never gained a major following among home PC users, however.
Then, in 1970, Intel (which had just recently launched as a company) unveiled the Intel 1103, the first dynamic random access memory (DRAM) chip.
One year later, IBM researchers unveiled the portable storage device known as the floppy disk, which allowed data to be shared between computers.
In 1973, Robert Metcalfe, a researcher at Xerox PARC, advanced computer-to-computer communication by developing Ethernet, a networking technology that allowed computers to be connected to one another and to other hardware.
1974: Personal Computers Hit the Market for the Very First Time
Before 1974, computers were those cool devices that you may have read about in newspapers – but could never actually own in your own home.
That all changed in 1974, when personal computers began hitting the market. Between 1974 and 1977, a handful of machines arrived and changed the computer hardware industry forever. Those computer systems included:
- Kit computers such as the Scelbi, the Mark-8, and the Altair
- IBM 5100
- RadioShack TRS-80 (also known as the Trash 80)
- Commodore PET
The best known of these was the Altair 8800, released in 1975 and built around the Intel 8080 processor. One advertisement in Popular Electronics described it as the “world’s first minicomputer kit to rival commercial models.”
The TRS-80, on the other hand, sold like crazy at RadioShack locations across the country. This computer allowed average people with limited knowledge of coding to write programs and make a computer do what they wanted.
As you’ll learn below, the release of these computer models inspired Paul Allen, Bill Gates, Steve Jobs, and Steve Wozniak to do some pretty special things in the world of computing over the coming decades.
Microsoft is Founded in 1975
After the release of the Altair 8800, two computer geeks named Paul Allen and Bill Gates offered to develop software for the Altair. The two considered themselves experts in the BASIC programming language.
After that early coding success, the two childhood buddies founded their own software company, Microsoft, on April 4, 1975.
Apple is Founded in 1976
Around the same time Microsoft was kicking into action, Apple was just a glimmer in the eyes of Steve Jobs and Steve Wozniak. The company was founded on April 1, 1976, and that same year the pair introduced the Apple I, the first computer to use a single circuit board.
By 1977, Jobs and Wozniak had incorporated Apple and released the Apple II. They showed off the Apple II at the first West Coast Computer Faire. Notable improvements with the Apple II included color graphics and an audio cassette drive for easy storage.
The World’s First Computer Applications
In the late 1970s, all of these new programming platforms led to a surge in amateur (and professional) developers creating their own applications.
In 1978, the accounting world was changed forever with the invention of VisiCalc, which was the world’s first computerized spreadsheet program.
Then, in 1979, writing on computers became a lot easier thanks to the world’s first word processing software, WordStar, which was created by MicroPro International.
IBM’s First Personal Computer
IBM teamed up with Microsoft to release its first personal computer in 1981. That computer used the MS-DOS operating system and had an Intel chip inside.
The “Acorn”, as the device was codenamed, was also advanced hardware for its time: it had two floppy disk drives and an optional color monitor. The machines were sold at Sears & Roebuck stores across the country, which was important because it was the first time a computer was sold through third-party retailers instead of directly by the computer company.
IBM’s first personal computer is also notable because it popularized the term “personal computer”. That’s why IBM gets credit for creating the term “PC”.
1983: Apple Releases Its Own PC, Lisa
Today, we associate the term “PC” with Windows computers. However, two years after IBM released its first PC, Apple released a personal computer of its own in 1983, called the Lisa.
The Lisa was notable for being the first personal computer with a GUI, including relatively modern features like drop-down menus and icons.
Despite Apple’s best efforts, the Lisa was a flop. Nevertheless, Apple would learn its lessons from the Lisa and eventually release the Mac.
That same year also saw the Gavilan SC, often described as the world’s first laptop. Built by Gavilan Computer Corp., it was marketed as a portable computer and featured the same flip-style form factor we see in today’s laptops.
1985: Microsoft Announces Windows
Microsoft saw what Apple did with its GUI, so it decided to do something better: Microsoft announced Windows.
The first version of Windows, Windows 1.0, was released on November 20, 1985. Contrary to Microsoft’s expectations, Windows 1.0 was not well-received or popular.
Nevertheless, the name “Windows” was an important development. Microsoft had originally planned to call its operating system the extremely uncatchy and unmarketable “Interface Manager”. And obviously, judging by the success of later versions of Windows, Microsoft was unfazed by the failure of Windows 1.0.
1985 to 1996: The Foundations of the Internet
Between 1985 and 1996, the internet went from a niche computing service to a global system filled with massive potential.
It started in 1985, when the world’s first dot-com domain name was registered. That first domain name probably wasn’t as exciting as you think: it was Symbolics.com, registered by Symbolics, a computer manufacturer from Massachusetts.
In 1990, a CERN researcher named Tim Berners-Lee created HTML, which became the foundation of the World Wide Web.
By 1996, Sergey Brin and Larry Page had developed the Google search engine at Stanford University.
1994: The Year PC Gaming Exploded
Computer games were common throughout the 1970s and 1980s in arcades and on gaming consoles. Starting around 1994, however, the PC began to be viewed as a legitimate gaming machine.
PC developers created games like Command & Conquer, Descent, and Little Big Adventure, building on the success of earlier PC hits like Doom. In the coming years, we’d see Quake, Age of Empires, Half-Life, and many more.
Apple Unveils Mac OS X in 2001
People seem to forget that Apple and Microsoft were once embroiled in a huge legal battle: Apple had alleged that Microsoft copied the “look and feel” of its operating system when it developed Windows.
In 1997, Microsoft invested $150 million in Apple, which was a struggling company at the time, and the deal put an end to the long-running dispute between the two companies.
Then, in 2001, Apple introduced its own operating system, Mac OS X. Later that same year, Microsoft rolled out its own groundbreaking operating system, Windows XP.
Today, tablet computers and mobile devices are changing the future of computing technology. We’ve come a long way in 70 years – how much further can computers go in another 70?