[Poster's note by 胡卜凱: Today marks the 54th anniversary of the first computer-to-computer connection. I am reposting this article to commemorate it.]


The History of the Internet – How the Web Started and Evolved

Kelly Damon, 10/26/23

When you think about it, the internet is a ridiculous concept. Little machines transmit electromagnetic waves to and fro so we can communicate long distance, stream entertainment, or send cat memes. Imagine trying to explain it to someone from medieval times!

The internet isn’t something we pay heed to, as long as it does what we need it to do. We don’t typically stop to marvel at it, because it’s become a staple — like denim, or cars, or video games.

How, then, did it become such an integral part of society?

Come with me on a journey through time to discover the unlikely origins of the world wide web. Learn what led to it, how it began, and how it evolved into something few of us can live without.

First, Let’s Talk About Computers

Computers are central to almost every aspect of digital technology, so it’s difficult to imagine a time when they didn’t exist. It almost goes without saying, but they’re a relatively young invention, and didn’t exist for most of human history. 

On the Cosmic Calendar — a timeline of the universe depicted as a calendar year — modern computers would have come to be in the last second, of the last minute, of December 31st.

In the grander scheme of things, the internet was invented yesterday.

Ancient History

This isn’t to say computers fell out of the sky after a single moment of inspiration. They were hypothesized long before they existed, thanks to mathematics. Although the origin of math is unknown, it’s widely believed primitive astronomers were the first to use it, to track lunar cycles and other celestial events.

From there, counting systems were an inevitability, and as humanity evolved and progressed, tools to help us along were designed — namely the abacus, created around 2700 BCE in Mesopotamia. This gave rise to a number of similar mechanical counting devices over the course of ancient history, and in turn led to the first iterations of computing, like the Antikythera Mechanism.

Discovered in a shipwreck off the coast of Greece, it’s thought to be the oldest known example of an analog computer. Archeologists believe ancient Greek scientists used it to predict astronomical happenings, and track the planets and the four-year Olympic cycle. The relic used dials and gears, and seems to be the first of its kind.

The word “computer” originally referred to people whose job it was to carry out complex calculations. According to the Oxford English Dictionary, the first recorded use of the word was in 1613, in a book called The Yong Mans Gleanings by Richard Brathwait, referencing these human mathematicians. Most of them were women, because they could be paid less than men.

Lord Byron’s Link to Modern Computers

The first viable mechanical computers were designed by Charles Babbage in the 19th century, but they never saw the light of day. In 1822, he invented the difference engine — a mechanical calculator. At first, the British Government was interested in funding his project, but later, upon realizing it was more trouble than it was worth, they abandoned it.

A decade later, he improved his design and conceptualized the Analytical Engine. Once again the British Government helped Babbage bring it to fruition, but left him high and dry when it became too expensive to complete. 

The Analytical Engine is widely considered the first programmable computer, as it was Turing-complete by today’s standards. Some regard Charles Babbage as the Father of Computers. Others argue he had a good idea, but it wouldn’t have worked without another key player.

Ada Lovelace, daughter of Lord Byron, is best known for her contributions to the Analytical Engine. Originally tasked simply with translating a French-language paper on the Engine into English, she realized the machine had capabilities far beyond simple calculations. She designed the first computer algorithm around it, and is now said to be the world’s first computer programmer.

Who do you think deserves credit for the computer — the man who thought of it, or the woman who made it work?

Humble Beginnings

Over the next century, computers saw unprecedented development, including the creation of the first modern analog computers, which were limited because they couldn’t be programmed and lacked versatility. Still, they were sophisticated enough to solve equations and keep time.

The modern computer is credited to Alan Turing, who in 1936 showed that a single machine could, in principle, compute anything computable, given the right program. From this, the concept of the Universal Turing Machine (as it’s now known) was born, which remains the theoretical standard for computers to this day.

From Turing’s work, electro-mechanical computers took form, which in turn catalyzed the dawn of general-purpose electronic digital computers. The first of these was ENIAC (Electronic Numerical Integrator and Computer). It was designed in secret for the US Army at the University of Pennsylvania, and was publicly unveiled on February 15th, 1946. For the next few years, computers would be confined to military and academic uses.

To think a computer this size didn’t even run well.

The Original Internet

One problem with the first computers was there were too few of them, and too many people reliant on them. By 1959, researchers at different institutions began proposing ways to share a single computer among many users. Within four years, personnel at MIT had developed the first working time-sharing system, the Compatible Time-Sharing System (CTSS).

This was a huge step forward in information technology, as it allowed users to connect to the same mainframe from different machines. Still, computing hadn’t been perfected yet, as the industry had another gap waiting to be filled.

It’s All Thanks to the US Army

Now that users could multitask on one mainframe, the US Department of Defense aimed a little higher. What if they could get computers to “talk” to one another? See, during the Cold War, military personnel needed a way to communicate classified information without placing a target on one specific control center.

The US military then funded the development of the Advanced Research Projects Agency Network (ARPANET) as a means to connect all Pentagon-funded research facilities to each other.

Computers spoke to each other for the first time in 1965, when Lawrence Roberts successfully connected two computers through a telephone line and transferred data between them.

Then, on October 29th, 1969, Leonard Kleinrock’s team at UCLA sent the first online message to a computer at the Stanford Research Institute (SRI). According to legend, it was supposed to say “login,” but the receiving system crashed, and so the first message sent over a network was “Lo.” This historic moment is known as the true birth of the internet, and is the reason we celebrate Internet Day on its anniversary each year.

After much trial and error, ARPANET was successfully launched in 1969. It marked the first successful computer network in history, and set the stage for the internet as we know it.


How Global Networking Came to Be

1970 saw a new era too — the Information Age. Sound familiar? It should, because you’re still in it. This was when computing saw a huge boom and began to rapidly progress. 

By 1971, email had been introduced as part of ARPANET’s communications, but a more exciting event was the expansion of ARPANET to other countries. This called for various global networks to connect to each other.

To realize this, two computer scientists named Bob Kahn and Vint Cerf developed the Transmission Control Protocol/Internet Protocol (TCP/IP) as a method for unified communication. ARPANET adopted it on January 1st, 1983 — what’s believed to be the internet’s true birthday.

Under TCP, data is broken into a sequence of smaller packets, each carried across the network inside an IP datagram, and reassembled once it reaches its destination. IP addresses are how computers identify each other, so they know where to send data. Kahn and Cerf aptly called their “network of networks” the Internetwork, or internet for short.

The implementation of TCP/IP drastically changed how data was shared across networks, and most networks still use it today.
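The packet idea itself is easy to sketch. Here is a toy illustration in Python of splitting a byte stream into sequence-numbered chunks and rebuilding it at the other end regardless of arrival order; it captures the spirit of TCP’s segmentation, not the real protocol (the function names and the 4-byte chunk size are invented for the example):

```python
import random

def packetize(data: bytes, size: int = 4):
    """Split a byte stream into (sequence_number, chunk) packets,
    loosely mimicking how TCP segments outgoing data."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Reorder packets by sequence number and rebuild the stream,
    as a receiver does no matter what order packets arrive in."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"LO AND BEHOLD"
packets = packetize(message)
random.shuffle(packets)          # packets may arrive out of order
assert reassemble(packets) == message
```

Real TCP adds much more on top of this — acknowledgments, retransmission, and flow control — but sequence numbers are what let scrambled packets become a coherent message again.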

The Rise of the PC

The American Computer Museum, the Computer Museum of America, and the Computer History Museum agree the Kenbak-1 was the first personal computer, or PC. It was invented by John Blankenbaker in 1970, and was meant to teach people the basics of computing. It didn’t sell well at all, likely because civilians had no need for it.

The first PC looks like my home’s electrical mains.

In 1975, the Altair 8800 became the first PC to make its way into homes, largely thanks to its appearance on the cover of Popular Electronics, which attracted the attention of hobbyists. Even so, most people had little interest in computers, and although the Altair 8800 sold well among enthusiasts, it never reached a mass audience.

Computers took over American homes in 1977, when Steve Wozniak and Steve Jobs (yes, that Steve Jobs) saw a gap in the market and designed the first PC with universal appeal, the Apple II. It was sold as a ready-to-use machine rather than a hobbyist kit, had color graphics, could play sound, and had a variety of uses, including gaming.

This was, perhaps, the most important step in how we use computers today. Even though the government still gatekept the internet at the time, the release of the Apple II catalyzed the commercialization of personal computers. It was inevitable that the internet would soon follow.

And Then, There Were ISPs

Throughout the 1980s, a few early providers supplied basic online services, like email, to customers. It was all they could do, since the internet was limited, and civilians didn’t have full access to it.

Then, on March 12th, 1989, Tim Berners-Lee, a British computer scientist stationed at CERN, proposed the invention that would transform the internet into what we know today: the world wide web.

This led to the invention of web browsers, too. Although WorldWideWeb and Erwise came first, Mosaic was the first widely adopted graphical browser, displaying text and images in the same window.

Coincidentally, the first modern internet service providers showed up in the US and Australia by November 1989 — just in time for the 90s.

How the 90s Flipped the Internet on Its Head

CERN also created the first website: an incredibly simple site dedicated to the World Wide Web project. As if their list of credits wasn’t long enough, they then did something that changed the world forever: in April 1993, they released the World Wide Web software into the public domain.

After decades of a closed network designed for armies and academics, this decision meant two very important things.

1. No one can own the internet
2. Everyone can contribute to it

The rest, as they say, is history. The web went from this:
(Picture)
Follow the white rabbit

To this:
(Picture)

The Space Jam website is still up and untouched.

Now, there are more than a billion websites online, not counting individual user pages, profiles, and channels.

The 1990s also saw the first online games, and by association, the first MMORPGs. An early landmark was Neverwinter Nights, often cited as the first graphical MMORPG, which launched on AOL in 1991.

The Giant Arrives

Have you ever wondered how people used the internet before search engines? Back when the internet was still in its early days, users had access to directories — but they were unintuitive, slow, and a hassle to use. 

After the world wide web surfaced, and more websites popped up, many attempts at more sophisticated directories failed. They were too simple, and most only listed paid submissions.

This all changed with WebCrawler, the first public search engine able to find any word on any page and yield results based on user searches. It was widely used, but lost favor to Yahoo! Directory in 1995.

Yahoo! Directory changed the game because it was the first to index its search results rather than display full text from websites. This simple convenience skyrocketed its popularity and set the standard for all search engines, but it also created competition.
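The indexing idea can be sketched with a toy inverted index — a minimal illustration of the general technique, not Yahoo!’s actual system (the sample pages and the `search` helper are invented for the example). Instead of scanning every page’s full text for each query, the engine precomputes a map from each word to the pages containing it:

```python
from collections import defaultdict

# A toy corpus of page IDs and their text.
pages = {
    "page1": "the history of the internet",
    "page2": "internet search engines",
    "page3": "history of search",
}

# Build the inverted index: word -> set of pages containing it.
index = defaultdict(set)
for page_id, text in pages.items():
    for word in text.split():
        index[word].add(page_id)

def search(query):
    """Return pages containing every word in the query."""
    words = query.split()
    if not words:
        return []
    return sorted(set.intersection(*(index[w] for w in words)))

print(search("internet"))        # ['page1', 'page2']
print(search("history search"))  # ['page3']
```

The payoff is that lookups touch only the index entries for the query words, which is what makes searching a web of millions of pages feasible at all.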

Ultimately, Yahoo!’s greatest innovation would go on to become its worst enemy — for one simple fact. It led to the creation of one of the largest and most dominant entities in the world: Google.

This Is Google’s World, and We Just Live In It

So far we’ve covered the development of computers, the first networks, the true origins of the internet, and CERN’s contributions to the world. None of it is as mind-blowing as Google.  

See, Google was invented by two Stanford students, Larry Page and Sergey Brin, as a research project in 1996. While other search engines at the time, Yahoo! included, ranked results mostly by keyword hits, Google’s PageRank algorithm ranked pages by how many other pages linked to them, weighted by the importance of those linking pages.
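The core of PageRank can be sketched in a few lines. This is a minimal power-iteration toy (the three-page link graph and the `pagerank` helper are invented for the example; real PageRank handles dangling pages, web scale, and much more): each page’s score is, roughly, the probability that a “random surfer” lands on it, following links with probability `damping` and jumping to a random page otherwise.

```python
# Toy link graph: each page lists the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power iteration: repeatedly redistribute each page's score
    to the pages it links to until the scores settle."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# C gathers links from both A and B, so it ranks highest.
assert max(ranks, key=ranks.get) == "C"
```

Note how a page’s importance is not something it can claim for itself; it is conferred by the pages linking to it, which is exactly why keyword-stuffing tricks stopped working.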

This formula was so successful that the word “google” is now synonymous with web search. It also meant websites could no longer rely on old-school tricks (like naming their company “A”) to appear first. Now websites had to rank, and the only way they could was to emphasize content.

Google was so good at finding what people were looking for, the world decided nothing mattered beyond the first results page. This holds so true that today, fewer than 1% of people bother visiting page 2 at all.

Google is also the most visited website and holds over 90% of the search engine market share. To add insult to its competitors’ injuries, it also owns the second most visited website, YouTube.

Everyone’s Internet

Google’s PageRank did more than change how websites work. It changed how users interact with the web. 

Websites don’t stand a chance if they’re not search-engine optimized. Advertisers now spy on your online habits so they know where to force feed you ads. Algorithms learn your behavior to give you a more streamlined experience, and fake news, clickbait, and spam are out-of-control problems we’ll likely never see an end to.

It would be easy to blame Google for everything, but we’re missing one more important player: social media. Once the 90s set the internet in stone — and any ordinary Joe or Jane could have a piece of it — it was only a matter of time before everyone joined the fray.

Social media has innocent origins. From Classmates.com to MySpace and the early versions of Facebook and YouTube, social networking was once nothing more than a space for people to keep in touch. Then, in 2007, Facebook started running ads, and it all went downhill from there.

Now, the internet is what we’re used to: a place where people are both the performers and the audience. Smartphones have made it possible for us to be connected 24/7. So long as we obey Google, anyone at all can grow a platform, have influence, or be seen, heard, and supported.

Where Do We Go from Here?

Of course, some innovations also rewrote how the internet works — like the move from television to streaming, or video games swinging violently to online multiplayer — but each of these circles back around to demand. We live most of our lives online now, so naturally entertainment (and business) followed us there.

If the history of the internet teaches us anything, it’s that the web is unpredictable and we can’t guess what the future might hold. Ancient astronomers couldn’t fathom the Analytical Engine, and Charles Babbage had no concept of smartphones or live streaming.

In the 90s, Alexa was a thing of sci-fi, and in the 2010s, no one knew ChatGPT was coming. No one can say what’s next, but theories abound, from chips inserted into our brains to AI taking over once and for all.

One thing’s for sure: the internet isn’t going anywhere. What a time to be alive.

Author 

When Kelly was 10 years old, she took the family computer apart to try and figure out how it worked. It's no wonder that she now writes tech blogs for CyberGhost.

