The main reason Amazon Web Services is so successful is that it has taken the expense of building a data center or leasing data center space out of the equation for tech startups and, increasingly, for more traditional established enterprises. The online retailer’s cloud infrastructure business now rakes in more than $7 billion in annual revenue.
Managing data center capacity, making sure there’s always enough of it to support business growth, is a complicated and costly exercise. By giving users the ability to spin up additional virtual servers on the fly as they need them and to spin them down when they don’t, paying only for what they actually use, AWS has changed IT capacity management forever.
But at least today, whether because of regulations, security concerns, or performance issues, public Infrastructure-as-a-Service doesn’t work as an infrastructure option for all companies and all applications – although AWS and rivals like Microsoft are working hard to change that – which is why the physical data center provider industry continues to thrive. Companies like the control and performance of dedicated infrastructure they get when they deploy servers at colocation data centers.
But colocation doesn’t offer the type of elasticity companies get through IaaS, and since there’s clearly demand for both full control and elasticity, some service providers attempt to devise solutions that offer a compromise between the two.
It’s difficult to make the numbers work if a company adds capacity incrementally. Building at scale can yield much higher profit margins if you can find customers to utilize that capacity. If you don’t, however, you end up with stranded capital.
This is why modularity, as a concept, is huge in data center design. Modularity implies quick expansion in small increments. The concept is employed in everything from mechanical chillers and uninterruptible power supply units to computer rooms and even shipping-container-like modules that are essentially entire self-sufficient data centers.
One part of the equation required to make modularity work is quick installation of the modules once onsite, and the other is a well-oiled supply chain that is prepared to churn out modules quickly enough when they are needed.
On-Demand Data Center Capacity
A data center service provider formed just seven months ago is using modular cooling systems designed by a sister company to introduce a higher degree of elasticity to the colocation model.
When it finishes construction of its first data center in Plano, Texas, Aligned Data Centers is promising its future customers the ability to pay only for the power capacity they use, including the ability to scale up and, importantly, down. It is the scaling down part that’s a lot harder to address with physical infrastructure than it is with VMs in a public cloud.
For Aligned, the flexibility comes from the cooling system, designed by its sister company Inertech. The system – in use at eBay, Lenovo, and Telus data centers, among others – can add or reduce capacity 300 kW at a time, Thomas Doherty, COO at Aligned, said. “It’s really a variable infrastructure that can scale up [or] scale down,” he said.
Although not exactly instantaneous like IaaS – that 300 kW of additional capacity can be deployed in about eight weeks – it is a compromise. To make sure the model works, the company has established a “deep” and tightly controlled supply chain, Doherty said.
Aligned and Inertech are parts of a holding company called Aligned Energy. Backed by the hedge fund BlueMountain Capital Management, it includes two other affiliates: Energy Metrics, which sells data center infrastructure management software, and Karbon Engineering, a data center consulting, design, and commissioning services organization.
Aligned is nearing completion of the first phase of its first 300,000-square-foot data center in Plano, which will have power capacity of 30 MW. The company has also started construction of a 550,000-square-foot, 65 MW facility in Phoenix and says it is going through the site-selection process for its next build, exploring locations in California, Illinois, Virginia, and New Jersey.
Inertech’s Modular Cooling System
A key component of Inertech’s cooling system design is the Thermal Hub, which replaces the traditional power-guzzling mechanical chillers. These Hubs suck in the heat from the data center floor and push it out.
A 100-ton Hub needs 500 watts, compared with the 90,000 watts required to deliver the same cooling capacity using a traditional chiller and chiller pumps, according to Inertech’s website. The calculation assumes that free cooling is used and that outside air temperature is the same in both scenarios.
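The scale of that claimed difference is easier to see as an effective coefficient of performance (heat removed per watt of input power). A minimal back-of-the-envelope sketch, using the standard conversion of roughly 3.517 kW of heat removal per refrigeration ton and the 500 W and 90,000 W figures from Inertech's website:

```python
# Back-of-the-envelope check of the claimed figures; the 500 W and
# 90,000 W numbers come from Inertech's website, the ton-to-kW
# conversion is the standard 1 refrigeration ton ~= 3.517 kW.

TON_KW = 3.517                       # thermal kW per refrigeration ton
heat_kw = 100 * TON_KW               # 100-ton load ~= 351.7 kW of heat

hub_kw = 0.5                         # Thermal Hub input power (claimed)
chiller_kw = 90.0                    # traditional chiller + pumps (claimed)

# Effective coefficient of performance: heat moved / power consumed
cop_hub = heat_kw / hub_kw           # roughly 703
cop_chiller = heat_kw / chiller_kw   # roughly 3.9

# Energy saved over a year of continuous operation at this load
annual_kwh_saved = (chiller_kw - hub_kw) * 8760   # 784,020 kWh

print(f"Hub COP ~{cop_hub:.0f}, chiller COP ~{cop_chiller:.1f}")
print(f"Annual savings per 100 tons: {annual_kwh_saved:,.0f} kWh")
```

As the comments note, this simply restates the vendor's numbers under the stated free-cooling assumption; it is not an independent measurement.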
The Hub is the central modular component of the system. More can be deployed quickly as needed, or removed. If a customer has overshot in their capacity calculations and needs to scale down, a hub can be moved to serve another customer in the facility.
In Aligned’s data center in Plano, the Hubs will be part of a “spline” in the center section of the facility, Doherty explained. Fans pull hot air from the servers to heat sinks that sit above fully contained aisles; pumped refrigerant then carries the heat from the heat sinks to the Hubs.
Once the heat is at one of the Hubs, it is taken out by a Thermal Bus. A “super-highway for heat transfer,” in Inertech’s words, it is a more energy-efficient alternative to forced-air systems. The Bus is another quick-connect, modular component that can be added or removed as needed.
On the final leg of its journey, heat travels via a heat rejection loop from the Bus to whatever heat rejection solution the user has, be it a fluid cooler, a cooling tower, or a dry cooler.
Addressing a Common Disconnect
Aligned will bill customers monthly, based on the power capacity their equipment actually used rather than the amount of capacity deployed to support them, Doherty said. He declined to disclose cost per kW but said it would be competitive with market rates.
He is confident there is demand for a colocation product that better aligns the amount of infrastructure deployed for a customer with their actual use. In conversations with big banks, for example, a common complaint is the disconnect between their data center lease agreements and their energy bills, Doherty said.
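That disconnect can be sketched in a few lines. All figures below are hypothetical for illustration – the rate, the deployed capacity, and the metered draw are not Aligned's numbers:

```python
# Hypothetical comparison of capacity-based vs. usage-based colo billing.
# The $/kW rate and the customer's numbers are made up for illustration;
# Aligned has not disclosed its pricing.

RATE_PER_KW = 150.0     # hypothetical monthly charge per kW

def capacity_bill(deployed_kw: float) -> float:
    """Traditional colo lease: pay for every kW provisioned."""
    return deployed_kw * RATE_PER_KW

def usage_bill(metered_kw: float) -> float:
    """Usage-based model as described: pay for capacity actually drawn."""
    return metered_kw * RATE_PER_KW

deployed = 1000.0       # customer leased 1 MW of capacity
metered = 600.0         # but its equipment draws only 600 kW

print(f"Capacity-based bill: ${capacity_bill(deployed):,.0f}")
print(f"Usage-based bill:    ${usage_bill(metered):,.0f}")
```

In this sketch the customer pays for 400 kW of idle capacity under the traditional lease, which is exactly the gap between lease agreement and energy bill that the banks complain about.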