
Details on Google's massive cloud infrastructure revealed


The company, however, has always been reticent about sharing the details of its architecture, a not-unreasonable precaution given that its entire business rests on these resources. According to Vahdat, Google began pioneering the use of software-defined networking, which decouples the control plane (which decides where traffic is sent) from the data plane (which forwards it), ten years ago. The company uses arrays of small, cheap switches to provide some of the same capabilities as much more expensive hardware, then manages workload distribution in software.
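The decoupling described above can be illustrated with a toy sketch (this is not Google's actual system): a centralized controller computes forwarding tables over the whole topology, while the switches themselves only perform lookups.

```python
from collections import deque

# Toy SDN illustration: the control plane (this function, run centrally)
# computes forwarding tables; the data plane (the switches) just looks
# up a destination and forwards to the precomputed next hop.

def compute_forwarding_tables(links):
    """For each switch, map every destination to the next hop on a
    shortest path, computed centrally by the controller."""
    # Build an adjacency list from undirected links.
    adj = {}
    for a, b in links:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)

    tables = {}
    for src in adj:
        # BFS from src records each reachable node's predecessor.
        prev = {src: None}
        queue = deque([src])
        while queue:
            node = queue.popleft()
            for nbr in adj[node]:
                if nbr not in prev:
                    prev[nbr] = node
                    queue.append(nbr)
        # Walk predecessors back to find the first hop toward each dest.
        table = {}
        for dst in prev:
            if dst == src:
                continue
            hop = dst
            while prev[hop] != src:
                hop = prev[hop]
            table[dst] = hop
        tables[src] = table
    return tables

# A small star topology of cheap leaf switches behind one spine
# (the names are purely illustrative).
links = [("s1", "spine"), ("s2", "spine"), ("s3", "spine")]
tables = compute_forwarding_tables(links)
print(tables["s1"]["s3"])  # prints "spine": s1 reaches s3 via the spine
```

The point of the split is that replacing the routing logic means updating one program on the controller, not reconfiguring every switch individually.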

Right now, Google’s planetary network relies on a platform, codenamed Jupiter, that can provide more than 1 Pb/s of bisection bandwidth. That’s an important distinction, as the term refers to the bandwidth between two equal halves of the network, not the total bandwidth of every part. As Google notes, that’s sufficient for 100,000 servers to exchange data at 10 Gb/s each, or to read the scanned contents of the Library of Congress in about 1/10 of a second.
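The 100,000-server claim follows directly from the unit conversion (1 Pb/s = 1,000,000 Gb/s):

```python
# Sanity-check the bisection-bandwidth arithmetic quoted in the article:
# 100,000 servers each exchanging data at 10 Gb/s across the bisection.
servers = 100_000
per_server_gbps = 10

total_gbps = servers * per_server_gbps      # 1,000,000 Gb/s
total_pbps = total_gbps / 1_000_000         # 1 Pb/s = 1,000,000 Gb/s
print(total_pbps)  # prints 1.0 -- exactly the 1 Pb/s figure
```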


Many of the “open” shifts in datacenter management and provision have been described as cost-saving measures, often with baleful glances cast at the likes of Cisco or Intel. That’s undoubtedly part of the underlying trend, but Google, Apple, or Amazon can easily afford to pay top-tier pricing for premium hardware. Vahdat’s remarks reflect a different reality, one where software provisioning and flexible design were required to compensate for the difficulty of scaling conventional network infrastructure up to match the needs of data centers.

While they serve different purposes, much of the research into efficient datacenter communication, SDN, and faster, lower-power interconnects has ramifications for scientific computing and the HPC ecosystem. Moving bits for minimal power at maximum speed is as important to scientific clusters as it is to datacenters. Bringing down interconnect power is a high priority for DARPA, as well as firms like Intel and IBM.
