Predictive Analytics & Data Centers: A Technology Whose Time Has Come

Back in 1993, ASHRAE organized the “Great Energy Predictor Shootout,” a competition designed to evaluate analytical methods for predicting energy use in buildings. Five of the top six entries used artificial neural networks. ASHRAE ran a second shootout in 1994, and this time the winners included a mix of neural networks and non-linear regression approaches to prediction and machine learning. And yet, as successful as these case studies were, the technology saw little to no adoption.

Fast forward to 2014, when Google announced its use of machine learning, leveraging neural networks, to “optimize data center operations and drive…energy use to new lows.” Google uses neural networks to predict power usage effectiveness (PUE) as a function of exogenous variables, such as outdoor temperature, and operating variables, such as pump speed. Microsoft, too, has endorsed the significance of machine learning for more effective predictive analysis. Joseph Sirosh, corporate vice president at Microsoft, says: “traditional analysis lets you predict the future. Machine learning lets you change the future.” And this recent article advocates the use of predictive analytics for the power industry.
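To make the idea concrete, here is a toy sketch of predicting PUE from one exogenous variable (outdoor temperature) and one operating variable (pump speed). The synthetic data, the coefficients, and the use of a single linear model trained by gradient descent are all illustrative assumptions — Google’s actual model is a far richer neural network trained on real operating data.

```python
# Toy sketch: learn to predict PUE from outdoor temperature and pump speed.
# A single linear model trained by per-sample gradient descent stands in
# for Google's neural network; all data and coefficients are synthetic.

def train(samples, lr=0.05, epochs=3000):
    """Fit pue ~ b + w1*temp_n + w2*pump_n with per-sample gradient steps."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for temp_n, pump_n, pue in samples:
            err = (b + w1 * temp_n + w2 * pump_n) - pue
            b  -= lr * err
            w1 -= lr * err * temp_n
            w2 -= lr * err * pump_n
    return w1, w2, b

# Synthetic training set: PUE rises with outdoor temperature and pump speed.
# Features are normalized (temp/35 degC, pump/100 %) so one learning rate works.
data = [(t / 35, p / 100, 1.10 + 0.01 * t + 0.002 * p)
        for t in range(5, 36, 5) for p in range(40, 91, 10)]

w1, w2, b = train(data)
predicted = b + w1 * (20 / 35) + w2 * (60 / 100)
print(round(predicted, 2))  # ~1.42 for 20 degC outdoors, 60% pump speed
```

Once such a model is fit, an operator can ask “what happens to PUE if I slow this pump?” before actually doing it — which is the heart of the predictive-analytics argument.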

The Vigilent system also embraces this thinking and uses machine learning as an integral part of its control software. Specifically, Vigilent uses continuous machine learning to ensure that the predictions driving cooling control decisions remain accurate over time, even as conditions change (see my May 2013 blog for more details). Vigilent predictive analysis continually informs the software of the likely result of any particular control decision, which in turn allows the software to extinguish hot spots and optimize cooling operations within desired parameters, to the extent that data center design, layout and physical configuration allow.
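The “continuous” part matters: a model trained once goes stale as the floor changes. A minimal sketch of the idea, with an invented linear model and invented numbers (this is not Vigilent’s actual algorithm), shows a predictor whose weights are nudged with every new reading so it tracks a drifting environment:

```python
# Minimal sketch of continuous learning: nudge the model's weights with
# every new observation so predictions track changing conditions.
# The linear model and all numbers are illustrative assumptions.

def update(w, b, x, y, lr=0.1):
    """One online gradient step for a linear predictor y_hat = b + w*x."""
    err = (b + w * x) - y
    return w - lr * err * x, b - lr * err

# Regime 1: zone temperature responds to fan-speed setting x as y = 20 + 5x.
w, b = 0.0, 0.0
for _ in range(500):
    for x in (0.2, 0.5, 0.8):
        w, b = update(w, b, x, 20 + 5 * x)

# Conditions change (say, a damper closes): the response is now y = 24 + 3x.
# Because learning never stops, the model adapts instead of going stale.
for _ in range(500):
    for x in (0.2, 0.5, 0.8):
        w, b = update(w, b, x, 24 + 3 * x)

print(round(b + w * 0.5, 1))  # ~25.5, the new regime's answer at x = 0.5
```

A one-shot model would still be predicting the old regime’s 22.5 here; the continuously updated one gives the control software an accurate picture of the floor as it is today.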

This is where additional analysis tools, such as the Vigilent Influence Map™, become useful.  The Influence Map provides a current, real-time and highly visual display of which cooling units are cooling which parts of the data floor.

As an example, one of our customers saw that he had a hot spot in a particular area that hadn’t been automatically corrected by Vigilent. He reviewed his Vigilent Influence Map and saw that the three cooling units closest to the hot spot had little or no influence on it, while units located much farther away were providing some cooling to the problem area. Armed with this information, he investigated the cooling infrastructure near the hot spot and found that dampers in the supply ductwork from the three closest units were closed. Opening them resolved the hot spot. The Influence Map provided insight that helped an experienced data center professional identify and resolve his problem more quickly and ensure the high reliability of his data center.
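One hypothetical way to see how an influence map can be derived — not how Vigilent actually computes it — is to perturb each cooling unit in turn and record how much each floor sensor warms. The unit names, sensor names, and temperatures below are invented for illustration:

```python
# Hypothetical influence-matrix sketch: briefly idle each cooling unit and
# record the temperature rise at each floor sensor. All names and readings
# are invented; the real Influence Map is built from live operating data.

baseline = {"S1": 24.0, "S2": 25.5}   # sensor temps with all units at normal speed
with_unit_idled = {                   # sensor temps after briefly idling each unit
    "CRAC-1": {"S1": 24.1, "S2": 28.0},
    "CRAC-2": {"S1": 27.5, "S2": 25.6},
}

# influence[unit][sensor] = degrees of warming when that unit is idled,
# i.e. how much cooling that unit actually delivers to that sensor's zone.
influence = {
    unit: {s: round(temps[s] - baseline[s], 1) for s in baseline}
    for unit, temps in with_unit_idled.items()
}

print(influence["CRAC-1"])  # barely moves S1, strongly cools S2
print(influence["CRAC-2"])  # the reverse
```

In a story like the one above, a near-zero row for the units closest to a hot spot is exactly the clue that something between them and the zone — a closed damper, for instance — is blocking their cooling.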

Operating a data center without predictive analytics is like driving a car facing backwards.  All you can see is where you’ve been and where you are right now.  Driving a car facing backwards is dangerous.   Why would anyone “drive” their data center in this way?

Predictive analytics are available, proven and endorsed by technology’s most respected organizations.  This is a technology whose time has not only come, but is critical to the reliability of increasingly complex data center operations.


A Look at 2013

We grew!

We moved!

We’ve had a heck of a year!

In 2013 alone, we reduced, or avoided the generation of, more than 85,000 tons of carbon emissions.

This is a statistic of which I am very, very proud and one that clearly demonstrates the double bottom line impact of the Vigilent solution.

We have directly impacted the planet by reducing energy requirements and CO2 emissions, even as the demands of our digital lifestyles increase.  We have impacted individual quality of life by increasing uptime reliability and contributing to the safety of treasured documents and photos, as well as helping to ensure the uninterrupted transmission of information that makes our world operate.  We are honored and privileged to contribute so directly to the well-being of our world and our customers.

While analysts have cited a DCIM market contraction in 2013, Vigilent has thrived. We attracted new customers and engendered even deeper loyalty among existing customers, as evidenced by our organic growth: one deployment turns into three, then ten, then dozens across the United States as actual energy savings and thermal condition insights are realized.

I am pleased to share some of the milestones we achieved in 2013:

We moved to terrific new facilities in uptown Oakland. Not only does our new facility (in a literally green building) provide space for in-house product commissioning and expanded R&D, it provides a vibrant, collaborative atmosphere for employees. The new location is adjacent to public transportation, honoring our commitment to a green corporate culture, and is surrounded by dozens of great restaurants, coffee shops and diverse entertainment options.

We grew – in revenues, in customer base, into new markets and in staff. With growth comes the responsibility to provide more directed leadership in business functions and market focus. With this in mind, we expanded our executive management team, hiring Dave Hudson to oversee sales and operations worldwide and Alex Fielding to introduce Vigilent to federal markets, and we added many new field engineers, software engineers, QA and support staff.

We expanded our product offering with new functionality, including out-of-the-box reports that help with energy savings, SLA adherence, maintenance and capacity planning. We continued to refine our trademark intelligence and control functionality, enhancing both usability and energy savings in ever more complex data center environments – achieving an additional 30% savings in some cases.

Ultimately, all of this helps our customers succeed, not only in direct bottom-line impact but also in large-scale sustainability efforts that are widely recognized. Avnet used the Vigilent system in corporate sustainability initiatives that garnered the company the Uptime Institute GEIT award, as well as recognition by InfoWorld as a top Green IT award winner. Our sales partner, NTT Facilities, continues to roll out Vigilent deployments in Japan.

Our ability to contribute to the Federal Government’s initiative to consolidate data centers and reduce overall energy use is significant indeed. Watch this space.

With a great year behind us, we recognize that there is much to do, as the data center industry – at last – is realizing how significantly data and analytics can improve day to day operations and efficiency endeavors.

A recent Ponemon Institute study on data center outages, sponsored by Emerson Network Power, states that accidental human error remains among the top three cited causes of downtime, and that 52% of survey respondents believe these accidents could have been prevented.

Intelligent software control and analytics will help operators make better, more informed decisions and reduce such human errors. These tools will increasingly help data centers proactively avoid trouble, while at the same time helping them diagnose and resolve actual issues more quickly.

This will be the year of analytics for data centers. Vigilent is equipped and prepared to lead this charge, leveraging years of institutional knowledge gleaned from hundreds of deployments, in every conceivable configuration, in mission-critical facilities on four continents. This mass of data informs the analytics that drive individual control decisions at every site and, more recently, puts the benefit of this accumulated knowledge into the hands and minds of data center managers for more informed process management.

Happy New Year.