The concept has certainly achieved buzzword status; what remains is for everyone to understand it well enough to see how it fits into a given organization or enterprise.
According to SAS Institute Inc., "Big data is a popular term used to describe the exponential growth, availability and use of information, both structured and unstructured. Much has been written on the big data trend and how it can serve as the basis for innovation, differentiation and growth."
This certainly is not a new conversation. IT executives have struggled for years with the need to incorporate all kinds of data, because they believe -- and they may be correct -- that whoever leverages more and better data has a distinct advantage in the marketplace.
There is no question that economic conditions over the last five years have also fueled the desire to get the most out of data, both internal and external. This is especially true of "unstructured" information, which doesn't necessarily fit the fields of any existing database but is nonetheless important to include in decisions around both marketing and internal operations.
Several research sources identify the key factors in big data and its use as volume, variety and velocity.
It is well known in IT circles that the volume of data handled by today's computer systems is staggering -- and is growing at a rate that often outstrips our ability to deal with it in terms of storage and utilization.
According to Gartner, "Worldwide information volume is growing annually at a minimum rate of 59 percent ..."
The increase in data within enterprises is caused by transaction volumes and other traditional data types, as well as by new types of data.
"Too much volume is a storage issue, but too much data is also a massive analysis issue," Gartner said.
When it comes to variety, the research and advisory company pointed out that today we have many new types of information to analyze, such as data arising from social media sources and mobile platforms.
"Variety includes tabular data (databases), hierarchical data, documents, e-mail, metering data, video, still images, audio, stock ticker data, financial transactions and more," Gartner said.
According to the company, velocity involves both how fast data is being produced and how fast the data must be processed to meet demand.
"RFID tags and smart metering are driving an increasing need to deal with torrents of data in near-real time. Reacting quickly enough to deal with velocity is a challenge to most organizations," according to SAS.
At first blush, getting a handle on all this big data may seem a daunting task, one that will require significant investment in storage media and services as well as in analytical tools and personnel.
Still, we must be ready and able to use any and all forms of information to keep ourselves ahead of the pack in this extremely competitive market.
As with many game-changing technologies, insurers must ask themselves not only whether they can afford the additional investments, but whether they can afford not to make them.
The same tools and technologies are available to all the players, but it is the early adopters who stand to gain the most from effective use of big data.
ARA TREMBLY is founder of The Tech Consultant and The Rogue Guru Blog. He can be reached at email@example.com.
February 19, 2013
Copyright © 2013 LRP Publications