Many of us today depend heavily on the Internet. Whether it's managing our finances, answering correspondence, or just Googling a question, the applications of the web are as endless as they are vital to modern living.
While many of us have been online since the 1990s, the history of the web stretches back much further. The first primitive computing devices were conceived as long ago as the 17th Century, with the earliest concepts for programmable computers emerging in the mid-19th Century.
Here's a brief history of the web, and the scientific minds who contributed to the digital culture we know today.
The 17th Century: Logarithm and the Slide Rule Appear
The history of the web begins in earnest in the early 1600s. The 17th Century saw the introduction of what would prove to be the foundations of modern computing. In 1614 John Napier proposed a new mathematical method, one that provided an analytical scope beyond that of any pre-existing functions. This method was the logarithm, which counts the repeated multiplications of a fixed base and so allows a laborious multiplication to be replaced by a simple addition of logarithms.
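The principle can be sketched in a few lines: because log(a × b) = log(a) + log(b), a hard multiplication reduces to an easy addition followed by a table lookup. The numbers below are arbitrary examples, not from Napier's tables:

```python
import math

# A logarithm counts how many times a base must be multiplied by itself
# to reach a number. Because log(a * b) = log(a) + log(b), multiplying
# two numbers reduces to adding their logarithms - the principle behind
# Napier's tables and, later, the slide rule.
a, b = 487.0, 1.93

# Add the logarithms, then convert back (the "table lookup" step).
product_via_logs = 10 ** (math.log10(a) + math.log10(b))

# The result matches ordinary multiplication (up to floating-point error).
assert math.isclose(product_via_logs, a * b)
```

The slide rule mechanizes exactly this trick: its scales are marked logarithmically, so sliding one scale along another adds logarithms physically.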
Napier's work on logarithms first appeared in Mirifici Logarithmorum Canonis Descriptio, which became an influential text in the fields of mathematics and engineering, as well as physics and navigation.
Based on Napier's studies, the forerunner of the Slide Rule was first developed by Edmund Gunter. Gunter's Rule was an early analog computing tool that used the principles of logarithms to multiply and divide. Reverend William Oughtred further expanded upon Gunter's design, combining two Gunter Rules to create what is now commonly regarded as the first recognizable Slide Rule.
Oughtred's Slide Rule designs were published by his student, William Forster, in 1632. From there, many other mathematicians and engineers expanded upon Oughtred's designs, creating Slide Rules capable of handling trigonometry, roots, and exponents. This early method of computation laid the groundwork for much of today's technology.
The 19th Century: Lovelace, Babbage, and the Analytical Engine Pioneer Computer Programming
The history of the web can't be told without also considering the history of computer programming. While the 17th Century gave us the tools to perform rapid calculations, it wasn't until the 19th Century that proposals for a programmable computer emerged.
In 1822, Charles Babbage completed a small prototype "Difference Engine", a hand-cranked machine that could compute mathematical tables. Though interest in his invention was great, contemporary metalworking techniques could not adequately produce the necessary parts for the engine, and the project was ultimately abandoned.
Undeterred, Babbage joined forces with mathematician Ada Lovelace for his next endeavor - the Analytical Engine. The proposed structure of the Analytical Engine became the precursor for computers in the electronic age: an integrated memory, control flow, and an arithmetic unit.
Lovelace is often credited with writing the first computer program, thanks to her algorithm for the computation of Bernoulli numbers by the Analytical Engine. Sadly, much like the Difference Engine, the device was never completed and Lovelace's program was not tested within her lifetime. Despite this, both Babbage and Lovelace are remembered as pioneers of computing.
1943 - 1969: Electric Age Computers and the First Networks Take Over
The mid-20th Century brought a host of milestones in the history of the web. In 1943, Tommy Flowers unveiled Colossus - the world's first programmable electronic computer. Colossus featured a series of vacuum tubes that could perform counting operations and was used by the Allies in the Second World War to decrypt intercepted messages from German High Command.
In 1949, EDSAC (Electronic Delay Storage Automatic Calculator) performed its first calculation, and is today considered one of the first stored-program computers. In 1952, it ran the world's first graphical computer game, "OXO", developed by Cambridge doctoral student Alexander "Sandy" Douglas.
1949 also saw the birth of the first modems. These early modems transmitted radar signals, modulating outgoing digital data into analog tones and demodulating incoming tones back into data. By 1958, modems adapted for use with computers were connected to commercial telephone lines.
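The modulate/demodulate idea those early modems pioneered can be illustrated with a toy frequency-shift-keying sketch: each bit becomes a tone of a distinct pitch, and the receiver recovers the bit by checking which tone is present. The sample rate, frequencies, and bit rate below are illustrative values, not the parameters of any historical modem:

```python
import math

SAMPLE_RATE = 8000             # samples per second (illustrative)
SAMPLES_PER_BIT = 80           # i.e. 100 bits per second
FREQ = {0: 1000.0, 1: 2000.0}  # tone frequency (Hz) for each bit value

def modulate(bits):
    """Turn a bit sequence into audio samples: one sine tone per bit."""
    samples = []
    for bit in bits:
        f = FREQ[bit]
        for n in range(SAMPLES_PER_BIT):
            samples.append(math.sin(2 * math.pi * f * n / SAMPLE_RATE))
    return samples

def demodulate(samples):
    """Recover bits by correlating each chunk against both reference tones."""
    bits = []
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        chunk = samples[i:i + SAMPLES_PER_BIT]
        scores = {}
        for bit, f in FREQ.items():
            # In-phase and quadrature correlation, so phase doesn't matter.
            s = sum(c * math.sin(2 * math.pi * f * n / SAMPLE_RATE)
                    for n, c in enumerate(chunk))
            q = sum(c * math.cos(2 * math.pi * f * n / SAMPLE_RATE)
                    for n, c in enumerate(chunk))
            scores[bit] = s * s + q * q
        bits.append(max(scores, key=scores.get))
    return bits

data = [1, 0, 1, 1, 0]
recovered = demodulate(modulate(data))
assert recovered == data
```

Real modems of the era used analog circuitry rather than software, but the round trip - data to tones, tones back to data - is the same in spirit.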
The first descriptions of a computer network resembling the Internet of today appeared in J.C.R. Licklider's "Galactic Network" concept in 1962. Licklider's work paved the way for 1969's ARPANET, which connected computers at UCLA, the Stanford Research Institute, UCSB, and the University of Utah. This technological experiment is considered by many to be the first iteration of the Internet as we know it.
The 1970s: The First Email and the Birth of Ethernet Change Digital Communication
A stand-out moment in web history is the first successful electronic mail. In March 1972, Ray Tomlinson developed the first email software, a simple send-and-read program to aid communication between ARPANET developers.
By July, Larry Roberts had expanded the scope of the program's abilities, adding options to file, forward, and respond to messages. These were the humble beginnings of what would come to be one of the most popular communication methods of the late 20th and early 21st Centuries.
Ethernet itself emerged from the work of Robert Metcalfe and his colleagues at Xerox PARC in 1973, with early versions running at a mere 2.94 Mbps. By 1979, a standard Ethernet rate of 10 Mbps was agreed upon by Xerox, Intel, and the Digital Equipment Corporation. Though by today's standards this might not seem like much, this standardization represented a landmark in digital communication.
The decision to standardize represented the move towards creating an accessible, commercial Internet that would eventually be made available to the public.
The 1980s: The First Personal Computers and the World Wide Web Lead the Way
The 1980s mark a point in the web's history when the Internet began to move out of research facilities and into people's homes. Though personal computers were available as kits as early as the late 1970s, it wasn't until the 1980s that personal computers resembling the ones we use today hit the market.
It's estimated that in 1980 there were a total of one million personal computers in the US, a figure that would grow rapidly thanks to new products like the Commodore VIC-20.
In 1985, the first version of Microsoft Windows became available to the public, revolutionizing the software industry and changing the way people interacted with computers forever.
In 1989, while working at CERN, Sir Tim Berners-Lee proposed what would become the World Wide Web. Over the following years he developed its fundamental technologies - HTML, URI, and HTTP - along with the first web browser (itself named WorldWideWeb) and the first web server.
Thanks to his decision to make his code available royalty-free in perpetuity, his work provided the basis for what would come to be known as Web 1.0, the first wave of the Internet as we know it.
1990 - 2004: Web 1.0 and the Rise of Social Media Usher in a New Age
At the close of 1990, the first web page was posted on the open Internet - an event in web history that marks the beginning of Web 1.0. Early websites were static affairs that users could not interact with, connected to one another by hyperlinks.
Though the term "Web 2.0" was first used by Darcy DiNucci in 1999, this second wave of the Internet is commonly regarded as not having truly begun until 2004. In her article "Fragmented Future", DiNucci predicted the increased interactivity of the Internet of today, as well as the development of online-enabled handheld devices.
In February of 2004, Mark Zuckerberg launched Facebook from his Harvard dorm room. Within a month, almost half of Harvard's undergraduate population was registered on the site.
Facebook, like many other social networks of its time, was indicative of the interactive trend in Web 2.0 which allowed users to comment, like, and tag each other in posts. While these additions might seem superficial, they would become hugely important sources of data for industries around the world.
The Internet of Today and How It's Changing
Our online culture today marks the culmination of over 400 years of web and computer history. Thanks to web technologies, we're on the cusp of what could become a new industrial revolution. Remote work, facilitated by faster internet speeds, is quickly changing the face of the labor market. Surgeons like Mehran Anvari can even perform operations remotely, working hundreds of miles away from their patients.
While the Internet has become a vital tool in today's world, it's one we take for granted at our peril.
Recent proposals such as the eradication of Net Neutrality critically threaten our online access and freedom. Given the long history of the web, and the legacy of the countless great minds who contributed to the technology at our disposal today, it would be remiss to allow the progression of the Internet to be hindered.