OnQ Blog

Weathering the storm: How AI can protect data in the cloud


Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

Chris Wade is a senior critical facilities management professional with over 25 years of experience in the design, operation, and maintenance of complex mission-critical environments. The views expressed are the author’s own, and do not necessarily represent the views of Qualcomm.

For all their security protocols, data redundancies, and lightning-fast optical networks, data centers — the sprawling facilities filled with rows of servers where “the cloud” lives — aren’t immune to risk. Despite our best efforts, these facilities remain exposed to forces beyond any risk manager’s control, such as natural disasters and severe weather.

The effects of weather-driven outages can hit customers immediately. For example, severe flooding in Leeds took down Vodafone’s British data center, triggering service disruptions over the Christmas holiday. In 2012, a line of severe storms packing winds of up to 80 miles per hour caused an outage at an Amazon Web Services center in Virginia, interrupting service for Instagram, Pinterest, Netflix, and others for over an hour. Ironically, even the National Weather Service (NWS) recently suffered a hiccup in its satellite broadcast network due to severe weather.

In addition to being an annoyance for customers, outages can imperil the long-term health of a company. A recent study by the Ponemon Institute found that the average cost of a data center outage is over $740,000, up from just over $500,000 in 2010. Beyond the immediate expense, an unplanned outage can cripple a company. To put that in perspective: 43 percent of companies that experience disasters never reopen, and 29 percent close within two years. Prolonged downtime makes matters worse: when critical data is inaccessible for more than 24 hours, 40 percent of businesses fail; stretch that to 10 days, and a full 93 percent go bankrupt within a year.

Given all of this, it’s clear why many of the largest tech companies in the world keep meteorologists on staff. These experts are responsible for planning ahead based on current conditions, and making key decisions, such as whether or not to migrate vital data to another, safer location. That’s a great plan for the big guys (the Netflixes and Instagrams of the world), but today the little guys rely heavily on cloud storage, too. What can they do to be proactive and protect their assets?

The answer could come from weather data itself. As climate-tracking giants like The Weather Company and the National Weather Service increasingly source data from more-precise crowdsourced weather services, the very information stored in massive data centers could help spot and prepare for extreme events. What’s needed are virtual weather robots that could signal the systems to automatically create new redundancies and move vital data offsite when threats are imminent. The data would literally be saving itself.
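The “virtual weather robot” described above can be sketched as a simple policy: watch crowdsourced threat feeds, and when a credible event is approaching, either add redundancy in place or move data offsite, depending on how much lead time remains. The names, thresholds, and threat model below are hypothetical illustrations, not a real API:

```python
# A minimal sketch of the "virtual weather robot" idea. All names and
# thresholds here are illustrative assumptions, not a real system.

from dataclasses import dataclass

@dataclass
class WeatherThreat:
    kind: str           # e.g. "flood", "high_wind"
    probability: float  # 0.0-1.0, from crowdsourced forecast feeds
    hours_out: float    # estimated time until the event reaches the site

def plan_response(threat: WeatherThreat, migration_hours: float) -> str:
    """Decide how a data center should react to an approaching threat.

    Returns one of: "monitor", "replicate", "migrate".
    """
    if threat.probability < 0.3:
        return "monitor"      # low risk: keep watching the feeds
    if threat.hours_out <= migration_hours:
        return "replicate"    # too late to move: create redundancies in place
    return "migrate"          # credible threat with time to act: move offsite

# Example: an 80% flood risk 12 hours out, with a 6-hour migration window
print(plan_response(WeatherThreat("flood", 0.8, 12.0), migration_hours=6.0))  # migrate
```

The key design point is the middle branch: once the storm is closer than the time a migration takes, moving data is no longer an option, so the system falls back to local redundancy.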

The shift toward artificially intelligent meteorologists could be greatly enabled by a growing network of personal, connected weather stations — ones that either augment or replace data from existing commercial weather monitors. Right now, the National Weather Service maintains roughly 12,000 stations, but their sparseness can leave huge gaps and increase the chance of inaccurate readings. Personal weather stations, such as Netatmo, provide detailed reports on current temperature, CO2 levels, humidity, rain, and wind, all of which feed into a massively crowdsourced weather map. Similarly, Weather Underground taps a worldwide network of stations maintained by hobbyists in their homes.

Apps that turn smartphones into roving weather stations are broadening datasets even further. Sunshine, for one, boasts a network of millions of smartphones on the ground, allowing it to produce weather results that are three times as accurate as NWS maps. WeatherSignal collects data in the background through onboard smartphone sensors to build its live maps. And AccuWeather’s AccUcast uses a similar methodology.

Of course, tracking weather is only part of what our virtual meteorologists would have to do and understand. Migrating data is very complex and can be lossy — the same way a copy of a copy is never quite as sharp as the original. And, even in the best circumstances, data migration causes downtime. The AI will need to constantly weigh risk when deciding how to respond to a looming weather event; on occasion, it may decide it’s a safer bet for data to stay put than be moved.
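That risk calculus can be made concrete as an expected-cost comparison: migration has a known downtime cost plus a small chance of lossy transfer, while staying put risks a full outage weighted by the event’s probability. The figures below are illustrative assumptions (the $740,000 outage cost echoes the Ponemon average cited earlier), not real operational data:

```python
# A hedged sketch of the stay-vs-migrate risk calculus. All costs and
# probabilities are illustrative assumptions, not real operational data.

def expected_cost_of_staying(p_event: float, outage_cost: float) -> float:
    """Expected loss if data stays put and the weather event may hit."""
    return p_event * outage_cost

def cost_of_migrating(downtime_cost: float, p_loss: float, loss_cost: float) -> float:
    """Migration always incurs downtime, plus a small chance of lossy transfer."""
    return downtime_cost + p_loss * loss_cost

def should_migrate(p_event: float, outage_cost: float,
                   downtime_cost: float, p_loss: float, loss_cost: float) -> bool:
    """Migrate only when it is the cheaper expected outcome."""
    return cost_of_migrating(downtime_cost, p_loss, loss_cost) < \
           expected_cost_of_staying(p_event, outage_cost)

# With a 60% chance of a $740,000 outage, a $50,000 migration window,
# and a 2% chance of a $200,000 data-integrity loss, migration wins:
print(should_migrate(0.6, 740_000, 50_000, 0.02, 200_000))  # True
```

Notice that when the event probability drops low enough, the same arithmetic flips and staying put becomes the safer bet — exactly the judgment call the article describes.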

Thankfully, this level of complexity is well within the capabilities of current AI systems. Look, for example, at how Google is using AI to manage security risks in its Android mobile operating system. And tech giants like Microsoft and IBM are already training AI to crunch data in the hopes of improving weather-prediction models.

Perhaps most important to our proposed virtual data-center meteorologist is that today’s state-of-the-art AI can learn and improve over time. That will reinforce successful decisions (migrating data out of a server in the path of a tornado) and diminish bad ones (moving data when the chance of flooding is unconfirmed). These experiences would add up to a system that’s mastered how to best keep information safe and services online.
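The learn-over-time behavior described above can be illustrated with a toy update rule: after each storm, the system nudges the probability threshold at which it chooses to migrate, penalizing false alarms and missed events. This is a deliberately simple illustration of the feedback loop, not a production learning algorithm:

```python
# A toy sketch of reinforcing good migration decisions and diminishing bad
# ones. The update rule is an illustrative assumption, not a real algorithm.

def update_threshold(threshold: float, migrated: bool, event_hit: bool,
                     step: float = 0.05) -> float:
    """Adjust the probability above which the system chooses to migrate."""
    if migrated and not event_hit:
        threshold += step   # false alarm: needless downtime, be less eager
    elif not migrated and event_hit:
        threshold -= step   # missed event: an outage occurred, be more eager
    # correct calls (migrated and hit, or stayed and no event) leave it unchanged
    return min(max(threshold, 0.05), 0.95)

t = 0.5
t = update_threshold(t, migrated=True, event_hit=False)   # false alarm -> 0.55
t = update_threshold(t, migrated=False, event_hit=True)   # missed hit  -> 0.5
print(round(t, 2))  # 0.5
```

Over many storms, a rule like this converges toward the balance the article describes: migrating when a tornado really is inbound, and sitting tight when the flood risk is unconfirmed.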

Ultimately, automating weather monitoring could open up a new type of secure data center for a broader range of companies. Business owners will be able to rest easy, knowing that there’s always a watchful — albeit virtual — eye on their data, ready to make the timely decisions that keep businesses running. And, when the storms pass, customers will be none the wiser.