Fibre optic wires, servers, and more than 550,000 miles of underwater cables: Here's what the internet actually looks like

Divers remove corroded zinc anodes from an undersea cable near Hawaii. (Flickr/Official U.S. Navy Page)

Every second, millions of emails, clicks, and searches happen via the world wide web with such fluidity that the internet seems almost omnipresent. As such, people often mistakenly assume that internet traffic travels through the air – our mobile devices, after all, aren’t wired to anything.

But satellites carry less than 1% of internet traffic, and in some ways the truth is far more impressive than signals beamed from towers.

The internet – arguably the most important resource in the modern world – is very tangible and fairly vulnerable. It exists in large part under our feet, by way of an intricate system of rope-thin underwater and underground cables connected to giant data storage units powerful enough to recall any piece of information at a moment’s notice.

Here’s what the infrastructure of the internet actually looks like today:

In the most basic sense, the internet’s job is to carry information from point A to point B.

Those points are IP addresses – the unique numeric labels that identify devices and networks around the world – and one is assigned to each of your devices whenever you’re connected to the internet. Curious what yours is? Type “my IP address” into Google and the search engine will bring it up.
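As an illustration (not from the article), Python’s standard `ipaddress` module can parse an address and report what kind it is; the addresses below are arbitrary examples:

```python
import ipaddress

# Arbitrary example address: a typical home-network (LAN) address.
addr = ipaddress.ip_address("192.168.1.10")

print(addr.version)     # 4
print(addr.is_private)  # True: only meaningful on the local network
print(addr.is_global)   # False: not reachable from the wider internet

# IPv6 addresses work the same way.
v6 = ipaddress.ip_address("2001:db8::1")
print(v6.version)       # 6
```

Private addresses like the one above are why the address Google shows you (your router’s public address) usually differs from the one your laptop reports.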

As it travels, any information transferred over the web passes through internet servers, which live in data centres around the world. In 2008, an estimated 9.5 trillion gigabytes – about 9.5 zettabytes – passed in and out of the world’s servers, but more on those later.

The Data Center Map website uses Google Maps to pinpoint all of the data servers around the world. (Data Center Map, powered by Google Maps)

Moving information to and from servers often involves crossing oceans. We rely almost entirely on cables for internet traffic because they’re faster and cheaper than satellites, but laying them across bodies of water is a tedious process that’s taken more than 150 years and requires a lot of maintenance.

AT&T manhole cover, San Luis Obispo CA. (David Greer)

To get the internet to what it is today, humans have slowly laid over 300 underwater cables that run a total of 550,000 miles.

About 97% of all intercontinental data is transferred through these cables, according to the Asia-Pacific Economic Cooperation forum.

If the world’s underwater cables were laid end-to-end, they could stretch from the Earth to the moon and back, and then wrap around the Earth’s widest point almost three times.
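That comparison checks out on the back of an envelope. The distances below are round public figures I’ve assumed, not numbers from the article:

```python
# Rough sanity check of the cable-length comparison above.
# Distances are approximate public figures, not from the article.
total_cable_miles = 550_000
earth_to_moon_miles = 238_900       # average Earth-Moon distance
earth_circumference_miles = 24_901  # equatorial circumference

# Subtract the round trip to the moon, then see how many
# trips around the equator the leftover cable covers.
after_round_trip = total_cable_miles - 2 * earth_to_moon_miles
wraps = after_round_trip / earth_circumference_miles

print(round(wraps, 1))  # roughly 2.9 equatorial wraps left over
```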

The longest cable is about 24,000 miles long. It extends from Germany to Korea and even further south to Australia, hitting 39 different landing points along the way.

SeaMeWe-3 underwater internet cable, ready for service in September 1999. (Submarine Cable Map)

The first transatlantic cable was laid in 1858, and ran from Ireland to Newfoundland.

There are a few different types of cables used underwater, ranging in thickness from a garden hose to about three inches in diameter. The lightest are laid primarily on the deep ocean floor.

At the heart of the cables are the fibre optic wires that transmit information, protected by water-resistant petroleum jelly and layers of stranded metal.

Laying each cable down requires several months, millions of dollars, and a very large ship with miles of cable coiled up onboard.

Some cables are laid as deep as 25,000 feet below the ocean’s surface, where they’re subject to damage from natural disasters, corrosion, fishing gear, and even shark bites.

Divers remove corroded zinc anodes from an undersea cable near Hawaii. (Flickr/Official U.S. Navy Page)

Break repairs are handled by special ships that use small hooks to pull the cable up, or cut it in two and bring both halves up for mending. At least 50 cable breaks a year happen in the Atlantic alone, according to MIT Technology Review.


The cables come back to shore at cable landing points and travel underground to data centres. Maintenance and planning for underground cables is easier than for underwater cables in some ways – they don’t have to contend with shark bites – but still challenging in others.

Hibernia Atlantic transoceanic cable landing, Lynn MA. (David Greer)

In the US, there are 542 cables (depicted by the yellow lines) connecting at 273 different points (depicted by the blue squares).

The first publicly available map of the US’s cable network wasn’t published until 2015. It took Paul Barford and his team of researchers almost four years to pull it together.

The ecosystem of cables depends largely on the country’s infrastructure. In the US, for example, most of the long-haul cables are located along major roads and railways.

For cables under dry land, construction is a big concern. To prevent the cables from being dug up, they’re laid alongside gas pipes or inside old pipelines, with aboveground markers along the way.

Underground fibre optic cable marker, Yorkville CA. (David Greer)

Similar to underwater cables, cables in dry ground are subject to damage from natural disasters, like earthquakes.

Underground fibre optic cable marker, New Jersey. (David Greer)

The cables eventually reach the aforementioned data centres and connect to the servers inside.

Facebook data center, Des Moines IA. (David Greer)

These are typically unmarked buildings located in both rural areas far outside of city limits…

Google data center, The Dalles OR. (David Greer)

…and in buildings within highly populated cities, hidden in plain sight.

One Wilshire data center & Telecom Center LA, 624 South Grand Ave. & 530 W. 6th St., Los Angeles CA. (David Greer)

In fact, one of the world’s most concentrated internet connectivity hubs is located in lower Manhattan at 60 Hudson Street.

A company called Telx operates out of the 9th floor, where local, national, and global channels come together to transmit data.

And there are two other major hubs in New York, located at 111 Eighth Avenue — the old Port Authority building that Google recently purchased for $US1.9 billion — and 32 Avenue of the Americas.

AT&T Long Distance Building, 32 Avenue of the Americas, New York NY. (David Greer)

Each data centre consumes massive amounts of energy. Apple recently built two 100-acre solar energy installations to help power its North Carolina data centre, which requires 20 megawatts of power at full capacity. That’s enough to power a little over 3,000 homes.
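The homes figure can be sanity-checked. The numbers come from the paragraph above; the interpretation – that the comparison assumes a certain demand per home – is my own:

```python
# What "20 MW powers a little over 3,000 homes" implies per home.
facility_kw = 20 * 1_000   # 20 megawatts expressed in kilowatts
homes = 3_000

implied_kw_per_home = facility_kw / homes
print(round(implied_kw_per_home, 1))  # ~6.7 kW assumed per home
```

That’s closer to a household’s peak demand than its average draw, which is how such comparisons are usually framed.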

Apple’s 14MW solar array, Maiden NC. (David Greer)

Pretty high-maintenance, but necessary.

They’re filled with “deafeningly noisy rooms cocooning racks of servers and routers,” where you’re “buffeted by hot and cold air that blusters through everything,” according to designer and artist Timo Arnall, who documented a large European data centre run by Telefónica.

Data centre run by Telefónica in Alcalá, Spain. (Timo Arnall/Vimeo)

On land, ceilings have to be 12 to 14 feet high to accommodate the heat rising off the servers. The Philadelphia Internet Exchange, for example, has 12-foot ceilings.

Philadelphia Internet Exchange, 401 N. Broad St., Philadelphia PA. (David Greer)

They generate so much heat, in fact, that providers often try to place them in cooler countries to save on energy bills. Microsoft has been trying to figure out an even better solution and just placed one under the sea off the coast of Orkney, Scotland, after years of experimenting with underwater data centres.


The data center, referred to internally as “Project Natick,” will be 117 feet below sea level for five years, which is four years and seven months longer than the last one Microsoft tested. At 40 feet in length, it’s much smaller than most data centres and holds 12 racks and 864 servers. The whole thing is being powered by a cable running from Orkney – a major hub for renewable energy.

If data centres don’t sound like a place you’d want to spend your time, keep in mind that you probably couldn’t even if you wanted to: data centres are very difficult to get into. Bigger data centres like Telefónica’s have “security far higher than any airport,” said Arnall, who had to get special permission.

Lobby at 32 Avenue of the Americas, New York NY. (David Greer)

From the outside, these unassuming buildings serve as the most glaring proof we have that the internet is more physical than we think.

Telx ATL1 and ColoAtl, 56 & 55 Marietta St., Atlanta GA. (David Greer)

A constant aboveground reminder of everything it takes to keep the world wide web afloat.

NAP of the Americas, Miami FL. (David Greer)
