Blum 2012

From Whiki

Blum, Andrew. Tubes: A Journey to the Center of the Internet. Ecco, 2012.


Highlight(yellow) - Page 8 · Location 131

To stitch together two halves of a broken world—to put the physical and the virtual back in the same place—I’ve stopped looking at web “sites” and “addresses” and instead sought out real sites and addresses, and the humming machines they house.

Highlight(yellow) - Page 9 · Location 144

For all the breathless talk of the supreme placelessness of our new digital age, when you pull back the curtain, the networks of the Internet are as fixed in real, physical places as any railroad or telephone system ever was.

1 / The Map

Highlight(yellow) - Page 14 · Location 193

a big annual report known as Global Internet Geography, or GIG, sold to the telecommunications industry for $5,495 a pop.

Highlight(yellow) - Page 16 · Location 218

Each line represented a single cable, mere inches in diameter but thousands of miles in length. If you lifted one up from the ocean floor and sliced it crosswise, you’d find a hard plastic jacket surrounding an inner core of steel-encased strands of glass, each the width of a human hair and glowing faintly with red light.

Highlight(yellow) - Page 20 · Location 265

the networks that compose the Internet could be imagined as existing in three overlapping realms: logically, meaning the magical and (for most of us) opaque way the electronic signals travel; physically, meaning the machines and wires those signals run through; and geographically, meaning the places those signals reach.

2 / A Network of Networks

Highlight(yellow) - Page 41 · Location 541

Kleinrock is the father of the Internet—or rather, a father, as success has many. In 1961, while a graduate student at MIT, he published the first paper on “packet switching,” the idea that data could be transmitted efficiently in small chunks rather than a continuous stream—one of the key notions behind the Internet.
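An aside, not from the book: the core idea of packet switching, breaking data into small numbered chunks that can travel independently and be reassembled at the destination, can be sketched in a few lines of Python. The function names and packet size here are illustrative.

```python
# Illustrative sketch of packet switching: split a message into
# fixed-size numbered "packets", let them arrive out of order,
# and reassemble them by sequence number.
def packetize(message: bytes, size: int = 8):
    """Split data into (sequence_number, chunk) pairs."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Sort packets by sequence number and rejoin the payload."""
    return b"".join(chunk for _, chunk in sorted(packets))

msg = b"data moves in small chunks, not a continuous stream"
packets = packetize(msg)
packets.reverse()  # simulate out-of-order arrival
assert reassemble(packets) == msg
```

The sequence numbers are what let the chunks travel by different routes and still be put back together, which is the property Kleinrock's paper was getting at.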

Highlight(yellow) - Page 42 · Location 552

scientist named Larry Roberts—Kleinrock’s MIT office mate—was recruited to ARPA specifically to develop an experimental nationwide computer network. The next July, he sent out a detailed request for proposals to 140 different technology companies to build what he at first called the “ARPA net.” It would begin at four universities, all in the west: UCLA, Stanford Research Institute, the University of Utah, and the University of California–Santa Barbara.

Highlight(yellow) - Page 53 · Location 691

That remained the case until New Year’s Day 1983 when, in a transition years in the planning, all the host computers on the ARPANET adopted the electronic rules that remain the basic building block of the Internet. In technical terms, they switched their communications protocol, or language, from NCP, or “Network Control Protocol,” to TCP/IP, or “Transmission Control Protocol/Internet Protocol.”

Highlight(yellow) - Page 53 · Location 696

once the dust had settled several months later, the result was the computing equivalent of a single international language. TCP/IP went from a dominant dialect to an official lingua franca.

Highlight(yellow) - Page 54 · Location 702

The New Year’s 1983 standardization of TCP/IP permanently fixed the Internet’s distributed structure, ensuring to this day its lack of central control. Each network acts independently, or “autonomously,” because TCP/IP gives it the vocabulary to interact.

Highlight(yellow) - Page 55 · Location 712

The Internet’s geography and shape weren’t drawn up in some central AT&T engineering office—as the telephone system was—but rather arose out of the independent actions of first hundreds, and later thousands, of networks.

Highlight(yellow) - Page 64 · Location 839

In a clear departure from its original roots, the Internet was no longer structured as a mesh, but rather was entirely dependent on a handful of centers.

Highlight(yellow) - Page 64 · Location 840

As the urban theorist Anthony Townsend has pointed out, “The reengineering of the Internet’s topology that was implemented in 1995 was the culmination of a long-term trend away from the idealized distributed network … envisioned in the 1960s.” As the number of networks increased, their autonomy was best served by centralized meeting points.

3 / Only Connect

Highlight(yellow) - Page 75 · Location 950

Reid had the simple idea that networks should connect directly, literally plugging one router into another, rather than all plugging into a single shared machine as at MAE-East and the other network access points.

Highlight(yellow) - Page 76 · Location 959

Digital put in a few million dollars of internal funding and a spare bit of office space: the basement of 529 Bryant Street, constructed in the 1920s as a telephone switching office.

Highlight(yellow) - Page 80 · Location 1015

Those connections are always physical and social, made of wires and relationships. They depend on the human network of network engineers.

Highlight(yellow) - Page 81 · Location 1032

In the PAIX’s early days, “cable management” was a crucial technical challenge. The Internet was tangled.

Highlight(yellow) - Page 83 · Location 1053

In another cage was the onetime home of Danni’s Hard Drive, a prominent early pornography site—online home of Danni Ashe, who Guinness World Records once named the “Most Downloaded Woman” (a category they no longer track). One night in the late ’90s, Danni herself was purportedly discovered here in the basement, naked with her eponymous hard drive, in the midst of taking the “photo of the week.” The old-timers nodded at the memory, but later I’d hear the same legend repeated at other big Internet buildings, and when I eventually tracked down Ashe and her network engineer at the time, Anne Petrie, they placed the event not in Palo Alto but at MAE-West, the Silicon Valley cousin of MAE-East.

4 / The Whole Internet

Highlight(yellow) - Page 125 · Location 1590

Facebook and Google. In recent years, both have put enormous resources into building out their global networks, in general not by laying new fiber-optic cables (although Google did partner on the construction of a new cable under the Pacific) but by leasing significant amounts of bandwidth within existing cables or buying individual fibers outright. In that sense, a network like Google’s or Facebook’s will be logically independent on a global scale: they each have their own private pathways, traveling within the existing physical pipes. The crucial advantage of this is that they can store their data anywhere they choose—primarily in Oregon and North Carolina, in Facebook’s case—and use their own networks to move it around freely on these private pathways parallel to the public Internet.

5 / Cities of Light

Highlight(yellow) - Page 158 · Location 1995

the basic building blocks of the Internet. They scaled: the twenty-dollar box I bought at Radio Shack was a kind of router, and so was Leonard Kleinrock’s original IMP. They were and are the Internet’s first physical pieces.

Highlight(yellow) - Page 165 · Location 2085

Hugh O’Kane Electric Company was founded in 1946 to maintain printing presses for publishers, but it had since evolved to become New York’s dominant independent fiber-optic contractor.

Highlight(yellow) - Page 165 · Location 2092

Since 1891 ECS—now a wholly owned subsidiary of Verizon—had owned the franchise to build and maintain an underground system of conduits, which it offered for lease at published rates that haven’t changed in a quarter century: a four-inch-diameter conduit will cost you $0.0924 per foot per month, while a two-inch one can be had for only $0.0578 a foot.
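To put those per-foot rates in perspective, a quick calculation (the rates are from the passage; the one-mile run is my own illustrative example):

```python
# Monthly lease cost at the ECS published rates quoted above.
# Rates are from the book; the one-mile example is illustrative.
RATE_4_INCH = 0.0924  # dollars per foot per month, 4-inch conduit
RATE_2_INCH = 0.0578  # dollars per foot per month, 2-inch conduit

def monthly_cost(feet: float, rate: float) -> float:
    """Lease cost in dollars per month for a conduit run."""
    return feet * rate

mile = 5280  # feet
print(round(monthly_cost(mile, RATE_4_INCH), 2))  # 487.87
print(round(monthly_cost(mile, RATE_2_INCH), 2))  # 305.18
```

So a mile of four-inch conduit under Manhattan runs under $500 a month at the published rate, which helps explain why so many competing fibers share the same century-old ducts.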

Highlight(yellow) - Page 172 · Location 2181

“When business is in full swing today 1,500 operators who have been working the sounding keys at 195 Broadway will be enjoying the conveniences of the most modern telegraph plant in the world,” crowed the Times. By 1919, the building was among the nation’s largest long-distance telephone central offices, with 1,470 switchboard test positions, 2,200 long-distance lines, and a transatlantic radio-telephone switchboard—all of which still wasn’t sufficient to serve the country’s telecommunications needs. Today, the building is one of the key pieces of New York’s Internet—even if AT&T and Western Union’s cohabitation didn’t last.

Highlight(yellow) - Page 173 · Location 2188

Undeterred by the stock market crash, the telecommunications rivals built matching art deco palaces, each with gymnasium, library, training school, even dormitories. The key to their separation lay beneath Church Street: an extensive run of clay conduits, filled with heavy-gauge copper wires that carried messages between the two systems—a sort of proto-Internet that would one day serve the real Internet.

Highlight(yellow) - Page 173 · Location 2200

the paradox of the Internet again: the elimination of distance only happens if the networks are in the same place.

6 / The Longest Tubes

Highlight(yellow) - Page 194 · Location 2441

On a daily basis it may feel as if the Internet has changed our sense of the world; but undersea cables showed how that new geography was traced entirely upon the outlines of the old.

Highlight(yellow) - Page 213 · Location 2680

“dense wavelength division multiplexing.” It allowed many wavelengths, or colors, of light to pass simultaneously through a single fiber. Each strand of fiber can be “filled up” with dozens of waves—each of which carries ten, twenty, or even forty gigabits per second of data. One of Paling’s jobs was to tune the lasers to fit in more wavelengths, like a harmonizing chord, getting each one right so they all work well together.
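A back-of-envelope note on the DWDM figures in that passage: with dozens of wavelengths per fiber, each carrying up to forty gigabits per second, a single hair-thin strand reaches terabit territory. The 80-wavelength count below is my own illustrative choice for "dozens"; the per-wavelength rate is the upper figure Blum quotes.

```python
# Rough DWDM capacity of one fiber strand, using the figures in the
# passage. "80 wavelengths" is an illustrative stand-in for "dozens".
wavelengths = 80           # distinct colors of light per fiber
gbps_per_wavelength = 40   # upper data rate mentioned in the passage

fiber_capacity_gbps = wavelengths * gbps_per_wavelength
print(fiber_capacity_gbps)                  # 3200 Gbps
print(fiber_capacity_gbps / 1000, "Tbps")   # 3.2 Tbps per strand
```

And an undersea cable bundles many such strands, which is how "mere inches in diameter" carries a continent's traffic.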

7 / Where Data Sleeps

Highlight(yellow) - Page 240 · Location 3015

“content delivery networks,” which keep copies of frequently accessed data, like popular YouTube clips or TV shows, in many small servers closer to people’s homes, just as a local store keeps popular items in stock.
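The local-store analogy maps directly onto a cache-then-origin lookup. A minimal sketch of the idea, with made-up names (`ORIGIN`, `edge_cache`, `fetch`), not any real CDN's API:

```python
# Minimal sketch of the CDN idea described above: an "edge" cache near
# users serves popular items locally and falls back to a distant origin.
ORIGIN = {
    "/video/popular-clip.mp4": b"...full clip bytes...",
    "/show/episode1.mp4": b"...episode bytes...",
}

edge_cache: dict[str, bytes] = {}  # the "local store" near users

def fetch(path: str) -> tuple[bytes, str]:
    """Return (content, source): "edge" if cached nearby, else "origin"."""
    if path in edge_cache:
        return edge_cache[path], "edge"   # short, fast local trip
    content = ORIGIN[path]                # long haul to the origin
    edge_cache[path] = content            # keep a copy for the next viewer
    return content, "origin"

_, first = fetch("/video/popular-clip.mp4")   # first viewer: "origin"
_, second = fetch("/video/popular-clip.mp4")  # next viewer: "edge"
```

Only the first request for a popular clip travels the whole way; everyone after gets the nearby copy, just as the quote's local store keeps popular items in stock.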