Prepping for the Data Deluge

Data is flooding into organizations at an astounding rate, whether the industry is healthcare or entertainment, and whether the company is a Fortune 500 behemoth or a small- to mid-sized business. As companies consume terabytes of data culled from Internet behavior, social media, and Internet of Things (IoT)-connected assets, among other sources, they run the risk of drowning in the deluge. Properly managed and leveraged, however, that data can provide a competitive edge and fuel business success.

According to a survey of 2,300 global business and IT leaders by MIT Technology Review Insights in association with Pure Storage, nearly 90% of respondents believe data is the key to delivering better results and future growth, especially for shaping a more personalized customer experience.

Yet those same respondents are worried about capitalizing on the bounty: 90% of responding companies voiced concern about their ability to analyze the data amid surging volumes and mounting demands on data quality and speed.

In fact, it is the companies’ ability to analyze this new data that may be the greatest impediment. “There’s a misconception that capturing the data will provide inherent value, but instead, businesses struggle to access the data as well as to understand the data they have captured,” notes Diana Nolting (@DianaNolting), director of product for Anvl. “The biggest complaint we hear is that companies are simply visualizing data, not analyzing it for action.”

The MIT study found the speed at which data can be received, analyzed, interpreted, and acted upon is a key barrier for 84% of companies. At the same time, 87% agreed that data needs to be analyzed for meaning and context. So what’s the best path forward? While there is no one-size-fits-all roadmap, there are a number of steps enterprises should take immediately.

A Data Deluge Protection Plan

Building out a strategy and creating a data management foundation are the critical first steps to ensuring maximum value from data assets, according to Will Wilkinson (@wawilkinson), head of infrastructure presales at CANCOM UK. As part of this early process, organizations must do the work of qualifying the data and determining which data sets are important: what is necessary to core business objectives, and what is irrevocably out of date.

“Enterprises are still struggling with duplicated and outdated data due to the lack of a cohesive data management strategy,” says Larry Larmeu (@LarryLarmeu), enterprise transformation leader. “It is important that enterprises provide the tools, processes, and guidance around storing and accessing data, allowing for ease of integration, increased data veracity, and unlocking the valuable insights available from real-time data analytics.”
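As a concrete illustration of the hygiene Larmeu describes, here is a minimal Python sketch that flags duplicated and outdated records; the Record structure, the one-year retention window, and the hashing approach are assumptions for illustration, not a description of any particular product.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Record:
    record_id: str
    payload: str            # serialized content of the record (assumed format)
    last_updated: datetime

def find_duplicates_and_stale(records, max_age_days=365):
    """Flag records whose content hash repeats (duplicates) and records
    whose last update falls outside the retention window (stale)."""
    seen_hashes = {}
    duplicates, stale = [], []
    cutoff = datetime.utcnow() - timedelta(days=max_age_days)

    for rec in records:
        content_hash = hashlib.sha256(rec.payload.encode("utf-8")).hexdigest()
        if content_hash in seen_hashes:
            duplicates.append((rec.record_id, seen_hashes[content_hash]))
        else:
            seen_hashes[content_hash] = rec.record_id
        if rec.last_updated < cutoff:
            stale.append(rec.record_id)

    return duplicates, stale

# Example with invented records: the second is both a duplicate and stale
recs = [
    Record("a1", "customer=42;total=100", datetime.utcnow() - timedelta(days=10)),
    Record("a2", "customer=42;total=100", datetime.utcnow() - timedelta(days=400)),
]
dupes, stale = find_duplicates_and_stale(recs)   # dupes=[("a2", "a1")], stale=["a2"]
```

A real pipeline would run against the organization’s actual stores and apply retention rules set by the governance team rather than a hard-coded window.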

Proper data governance is also central to an effective data insurance plan; it should cover the full spectrum, from availability and operations to storage, retention strategies, and data security. “How much to keep, where data goes, how it’s protected—these are all problems that have always existed, but they will be greatly magnified with the current technology trends,” says Mark Thiele (@Mthiele10), edge computing engineer at Ericsson.

“With the availability of so much information, I have had to add a host of tasks to my schedule such as determining what data is valid, reliable, complete, on target for my needs, and readily accessible,” says David Geer (@geercom), a cybersecurity technology writer. “I have achieved success by prioritizing the data that most quickly and easily confirms that it meets all these requirements.”

Companies should pay special attention to consistent classification and labeling of data, as it’s one of the biggest hurdles to effective data governance. Setting default labels for new data (for example, dubbing them confidential) can ensure that policies and technical controls are applied consistently across the organization. This also frees data creators from having to manually label all newly created information. “In that way, a data steward only needs to review data labels when that data is crossing a security barrier such as preparing a file to send to a client or third-party vendor,” notes Kayne McGladrey (@kaynemcgladrey), director of security and information technology at Pensar Development.
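The default-label pattern McGladrey describes might look something like the following Python sketch; the “confidential” default, the DataAsset structure, and the can_export check are illustrative assumptions rather than any specific tool’s behavior.

```python
from dataclasses import dataclass

DEFAULT_LABEL = "confidential"   # assumed default; real label taxonomies vary by organization

@dataclass
class DataAsset:
    name: str
    label: str = DEFAULT_LABEL       # every newly created asset starts as confidential
    reviewed_by_steward: bool = False

def can_export(asset: DataAsset, destination: str) -> bool:
    """Allow an asset to cross a security boundary (e.g., be sent to a client
    or third-party vendor) only if it is public or a steward has reviewed it.
    A real policy engine would likely apply stricter rules per destination."""
    if asset.label == "public":
        return True
    return asset.reviewed_by_steward

# Example: a new file defaults to confidential and is blocked until reviewed
report = DataAsset(name="q3_customer_report.xlsx")
print(can_export(report, "vendor-sftp"))   # False until a data steward reviews the label
```

The design choice is simply that the safe label is the default, so a human only has to intervene at the boundary where risk actually changes.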

Not to be overlooked: how to optimize storage for growing data stores and how to effectively expose insights so organizations can optimally benefit. For HBO, the cloud was a huge leg up. The company’s already burgeoning customer video usage data was compounded by the release of the HBO GO service and by additional traffic generated on social media. “The information provided us with an opportunity to better understand the customer interaction with our products, but it also provided a storage and data analytics challenge,” says Michael Gabriel, former executive vice president and CIO at HBO. “Being an early adopter of the cloud over a decade ago provided us the time to determine usage patterns and storage retention requirements on an on-going basis.”
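Gabriel doesn’t detail HBO’s retention mechanics, but the general pattern of mapping usage to storage tiers can be sketched simply; the tiers and age thresholds below are hypothetical placeholders for whatever the business and its regulators actually require.

```python
from datetime import datetime, timedelta
from typing import Optional

# Assumed tier thresholds; real retention requirements come from the business
TIER_RULES = [
    (timedelta(days=30),  "hot"),    # accessed within the last month
    (timedelta(days=180), "warm"),   # accessed within the last six months
    (timedelta(days=730), "cold"),   # retained up to two years in archival storage
]

def choose_tier(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    """Map an object's last-access time to a storage tier, or mark it for
    deletion review once it falls outside every retention window."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    for threshold, tier in TIER_RULES:
        if age <= threshold:
            return tier
    return "delete-review"

print(choose_tier(datetime.utcnow() - timedelta(days=400)))   # "cold"
```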

Beyond storage and security, the ability to visualize results and present them to users in an easily understandable form had Deepak Puri, founder of Skilled Analysts, scrambling. This was particularly true during a political and advocacy campaign that involved large data volumes such as voter files and survey results. The solution: present the data in an interactive map. The move was such a success that the map was published in Newsweek, Puri says.
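Puri doesn’t name his tooling, but an interactive map of campaign data can be prototyped in a few lines with an open-source library such as folium; the precinct coordinates and support percentages below are invented purely for illustration.

```python
import folium

# Hypothetical aggregated survey results per precinct: (latitude, longitude, support %)
precinct_results = [
    (37.7749, -122.4194, 62),
    (37.8044, -122.2712, 48),
    (37.3382, -121.8863, 55),
]

# Center the map roughly on the data and add one marker per precinct
campaign_map = folium.Map(location=[37.6, -122.1], zoom_start=9)
for lat, lon, support in precinct_results:
    folium.CircleMarker(
        location=[lat, lon],
        radius=8,
        popup=f"Support: {support}%",
        color="green" if support >= 50 else "red",
        fill=True,
    ).add_to(campaign_map)

campaign_map.save("campaign_map.html")   # open in a browser for an interactive view
```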

Creating centers of excellence and investing in data science skills and talent are all part of the process, notes Wayne Anderson (@DigitalSecArch), enterprise security architect at McAfee. In addition, getting ahead of emerging technologies in areas like machine learning and artificial intelligence (AI) will be essential for making sense of all the data. That includes an assist in automating categorization and storage, as well as decisions about what is ultimately thrown away, says Scott Schober (@ScottBVS), president and CEO of Berkeley Varitronics Systems Inc.
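As a rough sketch of the machine-learning assist Schober describes, the toy classifier below (Python with scikit-learn, trained on an invented four-document set) learns to sort new documents into “retain” and “discard” buckets based on labels a human has already assigned; production categorization would need far richer training data and human review of its decisions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: document snippets with a retention
# category previously assigned by a person
docs = [
    "invoice payment due net 30 purchase order",
    "quarterly financial statement audit balance sheet",
    "team lunch menu options for friday",
    "office holiday party rsvp by thursday",
]
labels = ["retain", "retain", "discard", "discard"]

# TF-IDF features feeding a simple classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(docs, labels)

new_doc = ["vendor invoice for cloud storage services"]
print(model.predict(new_doc))   # expected to lean toward 'retain'
```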

While establishing a formal governance plan can seem onerous, without it, data quickly turns from asset to liability. “The data deluge can become cumbersome and risky,” cautions Jason James (@itlinchpin), CIO at Optima Healthcare Solutions. “To paraphrase Spider-Man, ‘with great data comes great responsibility.’”

For more information on Pure Storage, visit www.purestorage.com.