A simple question: I'm looking for a comprehensive study that explores the relationship between the power a data center consumes (in watts) and the throughput it achieves (e.g., FLOPS or some other unit). Does anyone happen to know of such a study?
I suppose I could extrapolate by finding studies of per-server power consumption versus utilization (usually a convex curve) and multiplying that by the number of servers in a data center, but I'd like to be more precise if possible. A rough sketch of what I mean is below.
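To make the extrapolation idea concrete, here's a minimal back-of-envelope sketch in Python. Everything in it is a placeholder assumption on my part (the curve shape, the idle/peak wattages, the server count), not numbers from any study:

```python
# Back-of-envelope estimate: total data-center power from a per-server
# power-vs-utilization curve, scaled by server count.
# All constants below are illustrative placeholders, not measured values.

def server_power(u: float, p_idle: float = 100.0, p_peak: float = 250.0) -> float:
    """Per-server power draw in watts at utilization u in [0, 1].

    Uses a simple convex interpolation between idle and peak power;
    the exponent 1.4 is an assumed shape parameter, not a fitted one.
    """
    return p_idle + (p_peak - p_idle) * u ** 1.4

def datacenter_power(u: float, n_servers: int = 10_000) -> float:
    """Naive facility estimate: per-server power times server count,
    assuming every server runs at the same utilization u."""
    return n_servers * server_power(u)

if __name__ == "__main__":
    for u in (0.1, 0.3, 0.5, 0.8, 1.0):
        print(f"utilization {u:.0%}: {datacenter_power(u) / 1e6:.2f} MW")
```

This obviously ignores things like heterogeneous hardware and non-server overhead, which is exactly why I'd rather find a measured study than rely on this kind of estimate.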