What do 1950s military avionics and bitcoin mining servers have in common? In a word – heat, or, more precisely, cooling requirements. Blowing cold air over the innards of military aircraft and satellite electronics 60 years ago was as inefficient as blowing cold air through power-hungry bitcoin mining servers today. It was in the 50s that 3M introduced its first dielectric fluid, which found its earliest application in cooling avionics systems.
In the decades that followed, the single biggest application for 3M’s fluids has been supercomputers, which, because of their power density, are much better served by liquid cooling than by air. Over the last several years, however, as the bitcoin mining industry grew to the point where the biggest players were building their own mining hardware and the data centers to house it, cooling electronics with dielectric fluid found its second big application.
Power densities in bitcoin mining data centers are radically higher than in data centers that house traditional IT equipment, and operators of these facilities tend to squeeze every last watt and square foot they can out of them. Some have found that bringing bitcoin mining ASICs, or Application-Specific Integrated Circuits, into direct contact with dielectric fluid allows them to pack a lot more mining horsepower into a square foot of data center space.
Non-HPC Liquid Cooling at Massive Scale, Built by Bitcoin
A data center being built in Georgia (the former Soviet republic) is one of the world’s biggest showcases for the most unusual of approaches to liquid cooling: submerging servers in fluid completely. The facility is being built by the bitcoin mining giant BitFury, and the cooling system was designed by Allied Control, a Hong Kong-based engineering company BitFury recently acquired.
According to a case study of the deployment, published this week, BitFury expects its 40 MW facility to support 250 kW per rack at launch. This will not be the power-density limit of the design. The company expects it to support future-generation bitcoin mining hardware that will be even more energy-intensive. For comparison, typical power density of IT gear deployed in traditional enterprise or colocation data centers ranges between 2 kW and 5 kW per rack.
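The scale of those figures can be sketched with simple arithmetic, using only the numbers from the case study (40 MW total, 250 kW per rack, 2–5 kW for a typical enterprise rack):

```python
# Rough scale of the BitFury facility, using figures from the case study:
# 40 MW total capacity at a design density of 250 kW per rack.
FACILITY_KW = 40_000      # 40 MW
RACK_KW = 250             # per-rack design density

racks = FACILITY_KW / RACK_KW
print(racks)              # 160.0 racks at full build-out

# Density advantage over a typical 2-5 kW enterprise or colo rack:
for typical_kw in (2, 5):
    print(RACK_KW / typical_kw)   # 125.0x and 50.0x
```

In other words, each BitFury rack carries the load of roughly 50 to 125 conventional racks.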
3M’s Novec 7100 fluid used in Allied’s design does not transfer heat very well, so it is not enough to simply flush electronics with it continuously. But it does boil at a relatively low temperature of 142°F (61°C). When servers submerged in a specially designed tub heat up, the liquid starts boiling, and the resulting vapor carries the heat upward. Above the surface, the vapor reaches water-cooled condenser coils, turns back into liquid as its temperature drops, and falls back into the tank.
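The heat balance of that boil-and-condense cycle can be sketched numerically. The fluid properties below (latent heat of roughly 112 kJ/kg, liquid density of roughly 1.5 kg/L) are approximate published figures for Novec 7100, not numbers from the article, so treat the result as an order-of-magnitude illustration only:

```python
# Order-of-magnitude sketch of two-phase immersion cooling for one
# 250 kW rack. Fluid properties are approximate published values for
# Novec 7100 (assumptions, not figures from the article):
LATENT_HEAT_J_PER_KG = 112_000   # ~112 kJ/kg heat of vaporization
DENSITY_KG_PER_L = 1.5           # ~1.5 kg/L liquid density

heat_load_w = 250_000            # one rack at 250 kW

# In steady state, all rack heat goes into boiling fluid, so the vapor
# mass flow is heat load divided by latent heat:
mass_flow_kg_s = heat_load_w / LATENT_HEAT_J_PER_KG   # ~2.2 kg/s
volume_flow_l_s = mass_flow_kg_s / DENSITY_KG_PER_L   # ~1.5 L/s

# The condenser coils must reject the same 250 kW to turn that vapor
# back into liquid; the fluid is never consumed, only cycled.
print(round(mass_flow_kg_s, 2), round(volume_flow_l_s, 2))
```

So at full load, on the order of a liter and a half of fluid boils off and condenses back every second per rack, which is why the condenser coils sit directly above the tank.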
3M actually created the concept itself but made it freely available for anyone to use, which is what Allied did, said Michael Garceau, business development manager at 3M. The approach is called “two-phase immersion cooling,” or 2PIC. At least one other company, Green Revolution Cooling, sells similar immersion-cooling systems for data centers, but with one key difference: it submerges servers in a dielectric mineral-oil blend it designed itself, called ElectroSafe, and its approach is single-phase rather than two-phase. The company claims the oil has 1,200 times the heat capacity of air by volume.
Other approaches include filling sealed server enclosures with coolant or isolating coolant flow to the CPUs alone. The latter involves installing a small chamber for coolant directly over the CPU and pushing coolant through it via a system of narrow pipes. The power-density advantages of direct liquid cooling, and especially immersion cooling, are clear. What is unclear is whether it will eventually become useful in more mainstream computing scenarios, which is something 3M is hoping to see.
The alarmist forecasts from about a decade ago of an imminent data center power density crisis have not materialized. Densities have generally gone up, but not to the extent or at the scale predicted.
“The reality is, only in the last two or three years have companies started to more broadly deploy high density to be able to take advantage of the efficiency that that drives,” Sureel Choksi, CEO of the wholesale data center provider Vantage Data Centers, said. “The average data center rack today, in terms of actual utilization, probably has density of about 2kW a rack, which is extraordinarily low.”
The higher the power density, the more sense immersion cooling makes. 3M’s Novec fluid itself is a major cost, and at lower densities the economics of the technology get “less compelling,” Garceau said. He declined to share the price of the fluid, but a report on the liquid cooling market by 451 Research said it could cost up to $50 per liter.
Bitcoin Mining as the Second Big Application
Also undisclosed is the amount of fluid required to cool BitFury’s 250 kW racks. A previous project for which the data was disclosed used 3 liters per kW, Garceau said. In lab tests, 3M simulated a higher IT load and was able to cool 4 kW with less than one liter, he said. But Garceau’s current challenge extends beyond the bitcoin mining industry. The company wants to make the case that the cooling technology can be useful in more than just a few specialized applications. “There’s a tendency to believe that liquid cooling is special and expensive and niche for supercomputing,” he said.
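Combining the two disclosed figures – up to $50 per liter from the 451 Research report, and the 3-liters-per-kW ratio from the earlier project – gives a rough sense of why the fluid cost dominates the economics. The lab result of 4 kW per liter is the optimistic bound:

```python
# Back-of-the-envelope fluid cost for one 250 kW rack, combining the
# figures reported in the article: up to $50/liter (451 Research),
# 3 liters/kW from a previous project, and 3M's lab result of 4 kW
# cooled per liter as the optimistic case.
PRICE_PER_L = 50          # USD, upper-end estimate
RACK_KW = 250

# Previous disclosed project: 3 liters per kW
liters_field = 3 * RACK_KW                   # 750 L
cost_field = liters_field * PRICE_PER_L      # $37,500 per rack

# Lab result: 4 kW cooled per liter
liters_lab = RACK_KW / 4                     # 62.5 L
cost_lab = liters_lab * PRICE_PER_L          # $3,125.0 per rack

print(cost_field, cost_lab)
```

The spread – tens of thousands of dollars per rack at field ratios versus a few thousand at lab ratios – illustrates why fluid volume per kW, not just fluid price, drives whether the economics are “compelling.”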
At least today, however, that belief is not unfounded, since non-HPC deployments of immersion cooling are rare, and the overwhelming majority of the world’s IT gear is cooled by air. Garceau views the emergence of bitcoin mining as the second big application for direct liquid cooling as an opportunity to demonstrate that the story of the technology does not start and end with supercomputers.
One potential application could be cloud-scale computing, he said. “They would have to optimize around high density hardware for the purposes of energy efficiency, construction savings, all the benefits of two-phase immersion cooling,” Garceau said. But making the case to an industry whose philosophy of data center architecture is the opposite – scale out versus scale up – will be difficult.