Vigilent is the new name of long-time data center facility management technology company Federspiel Controls. Since 2004, it has built software to connect with the chillers, air handlers, variable-speed fans, and all the other gear that keeps servers cool enough to do their jobs, as well as the sensors that measure temperature, humidity, airflow and other key variables.
That’s a lot of data, and it has to be handled in a lot of different ways. But Vigilent CEO Mark Housley says its “big data” solution can out-learn and out-perform competing data center efficiency platforms. Monday’s investment adds Vigilent to Accel’s new $100 million Big Data fund, one of several VC funds aiming at a space that gathered $2.47 billion in venture investment last year, according to Thomson Reuters.
“We do intelligent energy management, which is energy management that actually does something besides tell the customer they’re ugly and give them a list of things to fix,” Housley said in a Monday interview. The El Cerrito, Calif.-based company has about 90 deployments with customers including Verizon, NTT, Akamai and Lawrence Berkeley National Laboratory, and is growing fast -- some 50 of those projects have come in the last year, he said.
Vigilent concentrates on the facilities side of data centers, and so far has about 6.9 million square feet of data center space under management, representing 23.2 megawatts of power use. Vigilent’s customers have achieved average energy reductions of 40 to 50 percent, with overall energy savings of $6.4 million a year, Housley said.
Its projects range from dedicated data center environments to central office locations not built with efficiency in mind, and include enterprise-wide deployments as well as individual buildings.
Just as important, Housley said, Vigilent has developed a way to measure and analyze masses of data over the course of months and years of operation, and run that data through thousands of permutations to build a living, learning model of how power flows through the data center as a whole over time.
Most of today’s data center infrastructure management IT concentrates on doing its job in real time, checking to make sure everything is working properly and nothing is breaking down, he said. Data center sensors can pull data multiple times per minute to provide a real-time check on operations, which means a lot of data -- but it’s relatively simple data, he said.
It’s harder to manage historical data, to find patterns of over-provisioned cooling, equipment wear and tear, imbalances in IT load and cooling load, and all the other patterns that reveal hidden energy-wasting behavior, he said. That’s because mixing and matching those thousands of real-time variables, over time and in more and more combinations, quickly adds up to an astronomically huge data management problem, he said.
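To make the idea concrete, here is a minimal sketch of the kind of historical pattern-mining Housley describes -- flagging zones where cooling looks over-provisioned because supply temperatures sit far below setpoint for long stretches. The zone names, readings, and threshold are hypothetical illustrations, not Vigilent's actual data or method:

```python
from statistics import mean

# Hypothetical historical readings: zone -> list of (supply_temp_F, setpoint_F).
# In a real deployment these would span months of multi-reading-per-minute data.
history = {
    "zone_a": [(58, 68), (59, 68), (57, 68), (58, 68)],
    "zone_b": [(67, 68), (66, 68), (68, 68), (67, 68)],
}

def overcooled_zones(history, margin_f=5.0):
    """Flag zones whose average supply temperature sits well below setpoint,
    a simple proxy for over-provisioned cooling capacity."""
    flagged = []
    for zone, readings in history.items():
        avg_temp = mean(t for t, _ in readings)
        avg_setpoint = mean(s for _, s in readings)
        gap = avg_setpoint - avg_temp
        if gap > margin_f:
            flagged.append((zone, round(gap, 1)))
    return flagged

print(overcooled_zones(history))  # [('zone_a', 10.0)] -- zone_a runs ~10°F cold
```

A production system would of course cross-reference many more variables -- IT load, fan speeds, time of day -- which is where the combinatorial explosion Housley mentions comes from.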
Then, of course, you’ve got to bring the most important numbers of all into the mix -- the dollars-and-cents equations that determine which data center efficiency projects offer the fastest and safest return on their investment. That adds a level of complexity that can be baffling to the data center manager and the CFO alike.
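The dollars-and-cents side can be as simple as a payback comparison across candidate projects. A sketch, with entirely made-up project names, costs, and savings figures for illustration:

```python
def simple_payback_years(upfront_cost, annual_kwh_saved, price_per_kwh=0.10):
    """Years to recoup an efficiency project's cost from energy savings alone."""
    annual_savings = annual_kwh_saved * price_per_kwh
    return upfront_cost / annual_savings

# Hypothetical candidate projects: (name, upfront cost in $, annual kWh saved)
projects = [
    ("variable-speed fan retrofit", 120_000, 900_000),
    ("airflow containment", 80_000, 400_000),
]

for name, cost, kwh in projects:
    print(f"{name}: {simple_payback_years(cost, kwh):.1f}-year payback")
```

Real evaluations layer on risk -- the "safest" part of the equation -- since a project that might destabilize cooling carries costs no spreadsheet line captures.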
“Our secret sauce is to figure out how to reduce that data and make sense of it,” Housley said. While he didn’t get into technical details, the approach combines statistical data processing, complexity theory, artificial intelligence and other techniques from the big data field, he said.
Vigilent is far from the only company making claims in the data center efficiency field, of course. We’ve got startups including Power Assure, Sentilla, SynapSense and JouleX tackling the challenge, as well as giants like HP, IBM, Intel, ABB, Cisco, General Electric and Schneider Electric. We’ve also got partnerships between the two groups, and some acquisitions -- Schneider bought startup Viridity Software in December, for example.
Because the amount of energy savings available for each deployment varies greatly based on how sophisticated the data center was in the first place, it’s hard to draw direct comparisons between vendor offerings. We’ve got everything from Facebook and Google building their own super-efficient, custom-built data centers, to your central office IT managers struggling to fit more computing power inside rooms strapped for extra space and power and cooling capacity.
What’s for sure is that energy costs are a growing concern for the data center industry, and that’s driving a wave of investment in efficiency solutions. We’ve seen projections for the U.S. “green data center” market to grow from $3.82 billion in 2010 to $13.81 billion in 2015, and another report predicts the global market will grow from $7.5 billion in 2010 to $41 billion by 2015, or more than one-quarter of overall data center spending around the world.
Vigilent was cash-flow positive last year and has funded its expansion to date through revenues, Housley said. At the same time, Vigilent’s software typically is deployed via “transparent overlay” on top of other companies’ broader data center infrastructure management (DCIM) platforms, he said. That’s a fairly common situation in the data center efficiency space, where companies are mixing and matching one another’s capabilities to deliver deep data analytics and automated control capabilities to their clients.
Just what combinations and capabilities become most popular with the market remains to be seen, of course. Eventually, we could see data centers tapping local power generation sources like solar panels or fuel cells to power operations, or selling power reductions back to the grid for demand response.
But as with any other aspect of running a data center, it’s keeping servers running at all costs that drives decision-making. Vigilent and others promising big energy efficiency gains will have to prove they can deliver without compromising core operations -- and for that, having deep-pocketed partners will likely prove a big advantage.