Next-Gen Data Center Interconnect Tech Enables Explosion of Online Video and Cloud Services

It takes a lot of sophisticated technology to pull off something like Yahoo and CBS’s first-ever global webcast of an NFL game in October, which reportedly matched the quality of satellite TV broadcasts. One of the fundamental pieces of infrastructure high-quality internet video couldn’t exist without is high-bandwidth, low-latency data center interconnect technology: boxes that push massive amounts of data at ultra-high speeds over long-distance optical networks from one data center to another.

The explosion of internet video is changing nearly everything about how the internet is built, from its geographic layout down to the specific interconnect technologies inside data centers, including the top-of-rack network switches that move packets between servers and those optical interconnect boxes, also referred to as DCI. One big change on the DCI front is the transition to 100 Gigabit Ethernet, the standard defined five years ago for pushing unprecedented amounts of data over networks.
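To get a rough sense of why that jump matters for the switch-to-DCI leg, the short sketch below counts how many uplinks a top-of-rack switch would need to carry its servers’ traffic at 10 GbE versus 100 GbE. The port counts, per-server load, and utilization ceiling are illustrative assumptions, not figures from the article or any vendor.

```python
import math

# Back-of-envelope: uplinks needed from a top-of-rack (ToR) switch toward the
# DCI layer. All inputs are illustrative assumptions.

def uplinks_needed(servers: int, gbps_per_server: float, uplink_gbps: float,
                   utilization: float = 0.7) -> int:
    """Uplinks required to carry the aggregate server traffic while keeping
    each uplink below the given utilization ceiling."""
    aggregate = servers * gbps_per_server      # total offered load, Gbps
    per_uplink = uplink_gbps * utilization     # usable capacity per uplink
    return math.ceil(aggregate / per_uplink)

# A hypothetical rack: 40 servers, each pushing ~5 Gbps toward remote sites.
print(uplinks_needed(40, 5, uplink_gbps=10))    # -> 29 x 10 GbE uplinks
print(uplinks_needed(40, 5, uplink_gbps=100))   # -> 3 x 100 GbE uplinks
```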

While online video is the biggest driver for 100 GbE data center interconnection, other applications, such as cloud services and enterprise disaster recovery or business continuity, are also contributing to the shift to higher-bandwidth DCI.

“Video is probably the number-one driver, because that’s driving significant demand on the public internet,” Ihab Tarazi, CTO of Equinix, the world’s largest data center colocation and interconnection service provider, said. Equinix data centers around the world are where much of the interconnection between network carriers, digital content companies, cloud service providers, enterprises, and internet service providers happens.

DCI vendor Infinera and data center networking switch supplier Arista Networks, both of whose products Equinix deploys widely, recently published test results for a joint DCI solution that showcase what modern technology for shuttling data over long distances can do.

The test, overseen and validated by The Lippis Report, an independent data center networking technology testing organization, confirmed 100 GbE throughput with latency under 20 microseconds and zero loss for “any mix of traffic” end to end. In that setup, data travels from a server in one data center through an Arista switch to an Infinera DCI box, over up to 150 kilometers of fiber to another Infinera DCI box in a remote data center, and through a second Arista switch to the destination server. The solution was tested at 10 Gigabits per second and at 100 Gbps, according to the test report. The optical DCI transport platform was Infinera’s Cloud Xpress, tested with Arista’s 7280 switches.
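For a sense of scale on the latency side, the time a single frame spends being serialized onto the wire drops tenfold when moving from 10 GbE to 100 GbE. The sketch below computes per-frame serialization time for a standard 1,500-byte Ethernet payload; the framing overhead is a simplification, and switch queuing and fiber propagation are deliberately left out, so it is not a model of the Lippis test itself.

```python
# Serialization delay: time to clock one frame onto the link at a given rate.
# Frame size is simplified (1,500-byte payload plus typical Ethernet
# header/FCS/preamble/inter-packet gap).

FRAME_BYTES = 1500 + 38

def serialization_us(link_gbps: float, frame_bytes: int = FRAME_BYTES) -> float:
    """Microseconds needed to transmit one frame at the given line rate."""
    bits = frame_bytes * 8
    return bits / (link_gbps * 1e3)   # Gbps -> bits per microsecond

print(f"10 GbE:  {serialization_us(10):.3f} us per frame")   # ~1.230 us
print(f"100 GbE: {serialization_us(100):.3f} us per frame")  # ~0.123 us
```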

Some target enterprise customers have also decided to move more of their infrastructure to the cloud and can benefit from using 100-Gig DCI technology to interconnect their on-premises data centers with their cloud service providers’ infrastructure.
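To see why that bandwidth matters for disaster recovery and business continuity, consider how long it takes to copy a data set between sites at different link speeds. The numbers below are illustrative only; the 100 TB data set and the assumption of a fully utilized link are mine, not the article’s.

```python
# Bulk transfer time between two data centers at a given link speed.
# Data set size and full link utilization are illustrative assumptions.

def transfer_hours(dataset_tb: float, link_gbps: float,
                   utilization: float = 1.0) -> float:
    """Hours to move dataset_tb terabytes over a link of link_gbps Gbps."""
    bits = dataset_tb * 1e12 * 8                       # decimal TB -> bits
    seconds = bits / (link_gbps * 1e9 * utilization)
    return seconds / 3600

# Replicating a hypothetical 100 TB data set to a cloud provider's site:
print(f"{transfer_hours(100, 10):.1f} h at 10 Gbps")    # ~22.2 h
print(f"{transfer_hours(100, 100):.1f} h at 100 Gbps")  # ~2.2 h
```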

Uptake for Infinera’s Cloud Xpress has been the strongest in the web-scale data center space, Jay Gill, principal product marketing manager at Infinera, said. “They’re the ones that really drove our Cloud Xpress product requirements,” he said.

Multi-tenant data center providers like Equinix are another category of customers creating demand for 100-Gig interconnect solutions.

A customer in an Equinix data center in Santa Clara, California, for example, can pay for a 100-Gig port on an Arista switch and, over an Infinera DCI link, connect to an Arista switch in an Equinix data center in San Jose. The infrastructure essentially creates one virtual data center spanning the entire metro, Equinix’s Tarazi said.
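A rough way to see why a metro span can behave like one virtual data center: light travels through fiber at roughly 200,000 km per second, or about 5 microseconds per kilometer one way. The sketch below applies that figure to an assumed 12 km route length for a Santa Clara-to-San Jose span; the distance is an illustrative assumption, not a figure from the article.

```python
# One-way fiber propagation delay across a metro span.
# ~5 us/km follows from light traveling ~200,000 km/s in fiber;
# the 12 km route length is an illustrative assumption.

US_PER_KM = 1e6 / 200_000     # ~5 microseconds per kilometer of fiber

def propagation_us(route_km: float) -> float:
    """One-way propagation delay in microseconds for a fiber route."""
    return route_km * US_PER_KM

print(f"{propagation_us(12):.0f} us one way")   # ~60 us for a 12 km metro route
```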