CS 101 › Lesson 2 of 8

History of Computing

Lesson 2 · OKSTEM College · Associate of Science in Computer Science

Computing did not begin with Silicon Valley. Its roots stretch back centuries through mathematics, mechanical engineering, and wartime necessity.

1840s · Babbage's Difference Engine · Mechanical Era

What it was: A massive mechanical calculator designed by British mathematician Charles Babbage to automatically compute mathematical tables — logarithms, trigonometry tables — that were needed for navigation and engineering but were full of human transcription errors.

Why it matters: Babbage never finished building it (it required ~25,000 precision parts with 1840s manufacturing), but the design was sound. When the London Science Museum built it from his original blueprints in 1991, it worked perfectly. He later designed the Analytical Engine — a more general machine with a separate "store" (memory) and "mill" (processor), the conceptual blueprint for all future computers.

Legacy: Proved that a machine could do the work of human calculators and introduced the concepts of a programmable machine and conditional branching — a century before electronic computers made them real.

Charles Babbage — inventor, mathematician
1843 · Ada Lovelace's Algorithm · First Programmer

What she did: Ada Lovelace (daughter of poet Lord Byron) was asked to translate a French article by Italian engineer Luigi Menabrea about Babbage's Analytical Engine. She added her own notes — three times longer than the original article — including a detailed step-by-step procedure for computing Bernoulli numbers on the Analytical Engine.

Why it matters: That procedure is recognized as the first published algorithm intended for a machine to execute. She also wrote presciently that the Engine could manipulate symbols according to rules, not just numbers — meaning it could compose music or process language, not only calculate. She described the concept of a loop (repeating a set of operations) over a century before the first real computer was built.
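The spirit of her procedure — a loop that reuses earlier results to produce each new Bernoulli number — can be sketched in modern code. This is not her exact method (she worked in the Engine's own operations); it uses the standard recurrence for Bernoulli numbers instead:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Compute Bernoulli numbers B_0..B_n (with B_1 = -1/2) via the
    recurrence  sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1."""
    B = [Fraction(1)]                      # B_0 = 1
    for m in range(1, n + 1):              # the "loop" Lovelace described
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-acc, m + 1))    # solve the recurrence for B_m
    return B

# Mathematically: B_0=1, B_1=-1/2, B_2=1/6, B_3=0, B_4=-1/30, B_6=1/42
print(bernoulli(6))
```

Exact rational arithmetic (`Fraction`) avoids the rounding errors that plagued the hand-computed tables Babbage wanted to eliminate.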

Legacy: The U.S. Department of Defense named its Ada programming language after her. She is widely recognized as the world's first computer programmer.

Ada Lovelace — mathematician, writer
1936 · Turing Machine (Theory) · Theoretical CS

What it was: In his paper "On Computable Numbers," Alan Turing described an imaginary machine: an infinite tape of symbols, a read/write head, and a finite set of rules. He used this thought experiment to ask: what can be computed at all?

Why it matters: Turing proved that some problems are undecidable — no algorithm can ever solve them, no matter how powerful the hardware. The most famous is the halting problem: there is no general algorithm that can determine whether an arbitrary program will finish running or loop forever. This is not a limitation of today's computers — it's a fundamental mathematical truth about computation itself.
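Turing's argument can be sketched in modern code. This is only a sketch: `halts` here is the hypothetical oracle his proof rules out, not something that can actually be written.

```python
def halts(program, inp):
    """Hypothetical oracle: would return True iff program(inp) eventually
    halts. Turing proved no such algorithm can exist."""
    raise NotImplementedError("no such algorithm exists")

def paradox(program):
    # Do the opposite of whatever the oracle predicts about a
    # program analyzing itself.
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    else:
        return               # predicted to loop, so halt immediately

# If halts() existed, paradox(paradox) would halt if and only if it
# does not halt -- a contradiction. Therefore halts() cannot exist.
```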

The Universal Turing Machine: He also described a machine that could simulate any other Turing Machine — essentially inventing the concept of a general-purpose, programmable computer before any physical one existed.
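A Turing Machine is simple enough to simulate in a few lines. The simulator below is a minimal sketch — the rule format and the example "bit-flipping" machine are illustrative choices, not Turing's notation — but it captures the universality idea: the machine description is just data handed to a general-purpose interpreter.

```python
def run_turing_machine(rules, tape, state="start", pos=0, max_steps=10_000):
    """Simulate a single-tape Turing machine.
    rules: {(state, symbol): (new_state, write_symbol, move)}, move in {-1, 0, +1}
    tape:  {position: symbol}; unwritten cells read as the blank '_'."""
    tape = dict(tape)
    for _ in range(max_steps):            # guard against infinite runs
        if state == "halt":
            break
        symbol = tape.get(pos, "_")
        state, write, move = rules[(state, symbol)]
        tape[pos] = write
        pos += move
    return tape, state

# Example machine: flip every bit until the first blank, then halt.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt",  "_",  0),
}
tape, state = run_turing_machine(flip, {0: "1", 1: "0", 2: "1"})
print("".join(tape.get(i, "_") for i in range(3)))  # -> 010
```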

Legacy: Every computer ever built is a physical approximation of a Universal Turing Machine. The Church-Turing thesis states that anything computable can be computed by a Turing Machine — the theoretical foundation of all of computer science.

Alan Turing — mathematician, logician, codebreaker
1945 · ENIAC · First Electronic Computer

What it was: The Electronic Numerical Integrator and Computer, built at the University of Pennsylvania for the U.S. Army. It weighed 30 tons, occupied 1,800 square feet, contained 18,000 vacuum tubes, and consumed 150 kilowatts of power — reportedly enough to dim the lights in a Philadelphia neighborhood when it was switched on.

Why it matters: It was the first fully electronic, general-purpose, programmable computer. Previous machines (like the Colossus used for WWII codebreaking) were special-purpose. ENIAC could be reprogrammed — though "programming" meant physically rewiring cables and flipping 6,000 switches, which took days.

Performance: It could perform 5,000 additions per second — roughly the speed of a pocket calculator today. But compared to human "computers" (people who did math by hand), it was thousands of times faster.

What came next: John von Neumann proposed storing the program in memory alongside data — the stored-program architecture that every computer since has used. This insight, now called the von Neumann architecture, is still the standard model.
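The stored-program idea can be shown in a toy machine: instructions and data sit side by side in one memory, and the processor repeatedly fetches, decodes, and executes. The instruction set below is invented for illustration, not any real machine's.

```python
def run(memory):
    """A toy von Neumann machine: one memory holds both the program
    (as (opcode, operand) pairs) and the data it operates on."""
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, arg = memory[pc]          # fetch
        pc += 1
        if op == "LOAD":              # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return acc

# Cells 0-3 hold the program; cells 4-6 hold the data. Reprogramming
# means changing memory contents -- no rewiring required.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
    2, 3, 0,
]
print(run(memory))  # -> 5
```

Contrast this with ENIAC: here, loading a different program is just writing different values into memory.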

J. Presper Eckert — co-designer
John Mauchly — co-designer
John von Neumann — stored-program architecture
1947 · Transistor Invented · Semiconductor Era

What it was: Scientists at Bell Labs invented the transistor — a tiny semiconductor device that acts as an electronic switch or amplifier. A transistor can be "on" (conducting current) or "off" (blocking current), directly representing the binary 1 and 0 of digital logic.

Why it matters: Vacuum tubes (what ENIAC used) were large, fragile, expensive, generated enormous heat, and burned out constantly. The transistor was smaller, cooler, faster, and far more reliable. It made miniaturization possible.

The physics: A modern transistor (a MOSFET — the 1947 original was a point-contact design) is made from semiconductor material, usually silicon. Applying voltage to the "gate" controls whether current flows between "source" and "drain." This is the fundamental switch — billions of these switches, toggling billions of times per second, are what make your code run.
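How switches become computation can be modelled in a few lines. The sketch below is a deliberately simplified switch-level model (real CMOS gates also have a complementary pull-up network): transistors are on/off functions, a NAND gate is built from them, and every other gate is built from NAND — which is why the transistor is enough to implement all digital logic.

```python
def nmos(gate_voltage):
    """Model an NMOS transistor as a switch: conducts when the gate is high."""
    return gate_voltage == 1

def nand(a, b):
    """Switch-level NAND (simplified): the output is pulled low only
    when both series transistors conduct."""
    pulled_low = nmos(a) and nmos(b)
    return 0 if pulled_low else 1

# NAND is universal: every other gate can be composed from it.
def NOT(a):    return nand(a, a)
def AND(a, b): return NOT(nand(a, b))
def OR(a, b):  return nand(NOT(a), NOT(b))

for a in (0, 1):
    for b in (0, 1):
        print(f"NAND({a},{b}) = {nand(a, b)}")
```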

Legacy: The transistor is arguably the most important invention of the 20th century. The inventors — Shockley, Bardeen, and Brattain — won the 1956 Nobel Prize in Physics.

William Shockley — co-inventor
John Bardeen — co-inventor
Walter Brattain — co-inventor
1958 · Integrated Circuit · Chip Era Begins

What it was: Jack Kilby at Texas Instruments (and independently Robert Noyce at Fairchild Semiconductor) etched multiple transistors and their connecting wires onto a single piece of semiconductor — the first integrated circuit (IC), or "chip."

Why it matters: Before the IC, building a computer meant soldering thousands of individual components together by hand. The IC allowed all of that circuitry to be mass-produced photographically on a fingernail-sized piece of silicon. Suddenly computers could be smaller, cheaper, faster, and more reliable all at once.

The scale then vs. now: Kilby's first IC had a single transistor. Today a chip the size of your thumbnail contains over 100 billion transistors — each one smaller than a virus.

Jack Kilby — TI, Nobel Prize 2000
Robert Noyce — Fairchild / Intel co-founder
1971 · Intel 4004, the First Microprocessor · CPU on a Chip

What it was: Intel released the 4004 — a complete CPU on a single chip, containing 2,300 transistors. It was originally designed for a Japanese calculator company (Busicom) but Intel retained rights to sell it for general use.

Why it matters: Before the microprocessor, a CPU filled an entire cabinet of circuit boards. The 4004 put all of that on a chip the size of a fingernail. This made personal computers possible — a computer no longer needed a whole room or even a whole desk.

Specs in perspective: The 4004 ran at 740 kHz (740,000 cycles/second) and processed 4 bits at a time. A modern phone runs at ~3 GHz (3,000,000,000 cycles/second) — about 4,000 times the clock rate — and processes 64 bits at a time across multiple cores, making it millions of times faster in overall throughput. Yet both are built on the same fundamental principles.

Federico Faggin — lead designer
Ted Hoff — architecture concept
Stan Mazor — instruction set
1983 · TCP/IP Standardized · Internet Foundation

What it was: On January 1, 1983 — called "Flag Day" — all computers on ARPANET (the U.S. government research network) switched to TCP/IP as their communication protocol. This was the moment the modern Internet was born.

Why it matters: Before TCP/IP, different networks spoke incompatible "languages." TCP/IP created a universal language: TCP (Transmission Control Protocol) ensures reliable, ordered data delivery; IP (Internet Protocol) routes packets across any number of networks to reach their destination. Any device, anywhere, running TCP/IP could now communicate with any other.
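That division of labor is visible even in a minimal example. The sketch below opens a TCP connection over loopback using Python's standard `socket` module: the application just calls `sendall` and `recv`, while the operating system's TCP/IP stack handles reliable, ordered delivery and addressing.

```python
import socket
import threading

def serve(server):
    """Accept one connection and echo back whatever arrives."""
    conn, _ = server.accept()
    with conn:
        conn.sendall(b"echo: " + conn.recv(1024))

# Port 0 asks the OS to pick any free port on the loopback interface.
server = socket.create_server(("127.0.0.1", 0))
port = server.getsockname()[1]
threading.Thread(target=serve, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello")          # TCP guarantees this arrives intact
    reply = client.recv(1024)
print(reply)
```

The same code works across the planet by swapping the loopback address for a remote one — that uniformity is exactly what the 1983 standardization bought.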

The key insight: Rather than designing one massive network, Vint Cerf and Bob Kahn designed a protocol that could connect many different networks together — a "network of networks" (hence inter-net).

Legacy: Every web page you visit, every email you send, every video you stream runs over TCP/IP. You'll study it in depth in CS 241: Computer Networks.

Vint Cerf — "Father of the Internet"
Bob Kahn — co-designer of TCP/IP
1991 · World Wide Web · Web Era

What it was: Tim Berners-Lee, a physicist at CERN (the European particle physics lab), invented the World Wide Web — a system of hyperlinked documents accessible over the Internet using a browser. The first website went live on a NeXT computer in Berners-Lee's office.

Why it matters: The Internet already existed in 1991, but it was mostly used by researchers to transfer files and send email. The Web added three key ideas: HTML (a language for formatting documents), HTTP (a protocol for requesting them), and URLs (addresses for finding them). Suddenly anyone could publish information that anyone else could read with a click.

What he gave away: Berners-Lee made the Web royalty-free. He deliberately did not patent it or charge for its use. His decision to release it as an open standard is arguably one of the most consequential acts of generosity in technological history.

Tim Berners-Lee — inventor of the Web
Robert Cailliau — co-developer
2007 · iPhone & the Smartphone Era · Mobile Computing

What it was: On January 9, 2007, Steve Jobs introduced the iPhone — a touchscreen computer in your pocket that combined a phone, music player, and internet browser. Within two years, the App Store launched and third-party developers began building software for it.

Why it matters: The iPhone didn't invent any one thing — touchscreens, mobile internet, and GPS all existed. What it did was combine them into a seamless experience and put them in the hands of hundreds of millions of non-technical people. Smartphones became the primary computing device for most of humanity.

The scale today: There are more active mobile connections on Earth than people. More code runs on mobile devices than on desktops. The skills you're building in this program are in highest demand for mobile and web applications.

Steve Jobs — Apple CEO, product vision
Scott Forstall — iOS software lead
2017+ · Deep Learning & Large Language Models · AI Era

What it is: Deep learning is a type of machine learning where layered neural networks learn patterns directly from data — images, text, audio — without being explicitly programmed with rules. Large Language Models (LLMs) like GPT-4 and Claude are trained on vast amounts of text to predict and generate human-like language.

Why 2017? The 2017 paper "Attention Is All You Need" (Google Brain) introduced the Transformer architecture, which made it practical to train extremely large models on enormous datasets using GPUs. This is the architecture behind every modern LLM.
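The Transformer's core operation, scaled dot-product attention, is small enough to sketch in plain Python. This toy version handles one query over a handful of key/value pairs (real implementations are batched matrix operations on GPUs, with learned projections this sketch omits):

```python
from math import exp, sqrt

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector:
    weight each value by how well its key matches the query."""
    d = len(query)
    # Similarity score of the query against each key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / sqrt(d)
              for key in keys]
    # Softmax: convert scores into positive weights that sum to 1
    m = max(scores)
    exps = [exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Output: weighted average of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

q = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]        # the first key matches q best...
values = [[10.0, 0.0], [0.0, 10.0]]
print(attention(q, keys, values))      # ...so output leans toward values[0]
```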

Why it matters: AI systems can now write code, pass bar exams, generate images from descriptions, translate languages, and assist with medical diagnoses — tasks previously thought to require human intelligence. This is creating massive demand for people who understand how these systems work, their limitations, and how to use them responsibly.

The open questions: We still don't fully understand why these models work so well, what they're actually "thinking," or where their fundamental limits are. This makes it one of the most exciting (and important) fields in CS today.

Geoffrey Hinton — "Godfather of Deep Learning"
Yann LeCun — convolutional networks
Yoshua Bengio — deep learning pioneer

Moore's Law

In 1965 Gordon Moore observed that the number of transistors on a chip was doubling roughly every year — a pace he revised in 1975 to a doubling every two years — while the cost per transistor kept falling. This exponential trend held for roughly 50 years, driving modern computing.

Today: A modern Apple M-series chip contains over 100 billion transistors. ENIAC had 18,000 vacuum tubes and filled an entire room.
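Moore's Law is just repeated doubling, so the projection is one line of arithmetic. Starting from the Intel 4004's 2,300 transistors in 1971 and doubling every two years (a simplified model — real chips did not track the curve exactly):

```python
def projected_transistors(year, base_year=1971, base_count=2_300, period=2):
    """Project a transistor count assuming one doubling every `period`
    years, starting from the Intel 4004 (2,300 transistors, 1971)."""
    doublings = (year - base_year) / period
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2023):
    print(year, f"{projected_transistors(year):,.0f}")
```

By 2023 the model predicts roughly 150 billion transistors — strikingly close to the 100+ billion on a modern Apple M-series chip.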

Turing's Legacy

Alan Turing's 1936 paper defined the theoretical limits of computation — what can and cannot be computed, regardless of hardware. His concepts of the Universal Turing Machine and the halting problem underpin all of theoretical computer science.

Lab — Moore's Law Visualizer

Knowledge Check

Who wrote the first published algorithm for a computing machine?

Moore's Law states that transistor count roughly doubles every

The Turing Machine is significant because

TCP/IP standardization in 1983 was the foundation of

ENIAC is historically significant as
