In February, Risk & Insurance® Managing Editor Cyril Tuohy conducted a question-and-answer session on the role of data in the insurance industry with Don Canning, vice president of SunGard's insurance business. The era of vast amounts of data is upon us, and Canning explains what that means for the industry.
Q: The expression "big data" is cropping up more frequently in the mainstream. What does that mean and why is it called "big"?
A: "Big data" pertains to the legacy of data warehousing. Many data warehouse projects did not pay off for carriers over the past 10 years, when millions of dollars were spent to deliver only delayed fractional benefits. Moreover, they delivered results slowly, causing other teams to spring up, fractioning into departmental solutions that now have several, unlinked mini-data warehouses ? a rather expensive mess. Today, data warehouse technologies have gone the way of attempting to create a "turn-key appliance," like a black-box processing machine able to handle data across departments and business lines.
Q: In the context of risk management, what does "big data" refer to?
A: Risk management in the insurance business can be thought of as a layer cake, starting with a foundation where actuaries need data to perform asset liability management, stochastic modeling and seriatim processing. While most actuaries require at least seven years of history, many would like 10 to 15 years, and others would go back 30 years if they could. The next layer rolls up all the product types within a given division or line of business. This is primarily the aggregate data that business line executives want, with the ability to drill down to the transaction level. Next, the chief actuary will want to gather actuarial results from every department and division, creating key risk indicators that oversee the risk appetite of the business. Having a bottom-up enterprise master data management strategy is key to giving executives line of sight over risk controls. The more data they have, the more granular the risk monitoring the chief actuary can provide to the board.
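The bottom-up aggregation the layer cake describes can be sketched in a few lines. The policy records, field names and figures below are hypothetical, used only to show the roll-up from policy level to line of business to an enterprise key risk indicator:

```python
from collections import defaultdict

# Foundation layer: hypothetical policy-level records the actuaries work from.
policies = [
    {"policy": "POL001", "line": "annuity", "liability": 1_200_000.0},
    {"policy": "POL002", "line": "whole_life", "liability": 800_000.0},
    {"policy": "POL003", "line": "auto", "liability": 300_000.0},
]

# Middle layer: roll policies up to aggregates per line of business,
# while keeping the transaction-level records available for drill-down.
by_line = defaultdict(float)
for p in policies:
    by_line[p["line"]] += p["liability"]

# Top layer: an enterprise view for the chief actuary. A simple key risk
# indicator here is each line's share of total liabilities.
total = sum(by_line.values())
kri = {line: amount / total for line, amount in by_line.items()}
print(kri)
```

Because the aggregates are built from the policy-level records rather than maintained separately, every number at the top can be traced back down, which is the point of a bottom-up master data management strategy.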
Q: Who is responsible for generating this data, managing it, and where is it stored?
A: Ideally, the chief information officer should have a master data management strategy as a strategic initiative. This can also provide a new fabric of infrastructure that breaks the habit of drilling into policy systems for operational data. It also helps provide a standardized method for gathering and storing all data and driving legacy modernization initiatives.
Q: Who owns this vast universe of data?
A: Ownership is really about accessibility and the ability to respond to questions when asked about the data pedigree. Many master data management projects look to establish a "data custodian" who can tell you the nuances of one policy administration system from another. One person or team needs to take ownership of the information, its meaning and pedigree, as well as change management. Access and subscription involve engineering that turns data into actionable information. This is where data architects and business intelligence analysts provide the feed, and end users use their tool of choice to view and analyze data.
Q: Is "big data" more expensive to manage than regular data?
A: "Big data" represents a unique set of hardware and software technologies that processes data using a special technique called massively parallel processing (MPP). If a carrier has 1 billion database rows, query processing taking 24 hours ? that may not be an acceptable response time for the actuaries. MPP enables "big data" by giving data architects the ability to split the data into equal, discrete chunks and reading them all at the same time ? resulting in executing that same query in about 20 minutes. It's not the size of the data that's crucial; it's how quickly you want it to be accessible. Costs today range from $17,000 to $50,000 per terabyte including hardware, software and possibly even a technician with a high-end platform.
Q: How are insurers leveraging data? What does "leveraging" data mean?
A: The actuaries typically use the largest amount of data and require the longest history, especially in life and annuity type policies. Leveraging data means having the data accessible at the business user's fingertips in time to make the best business decision, no matter the broker, TPA or anyone else in the supply chain. He who has access to the most accurate data the quickest will win the business every time.
Q: How will all this data affect the ability of corporate risk management to analyze enterprise risks?
A: If the chief actuary can build that layer cake, he will have line of sight on the carrier's overall risk. The data provides the baseline to build "shock tests," or balance sheet stress testing. Big data can be viewed the way a greenskeeper looks at a golf course: he takes in the overall pattern and flow of the rolling hills, sand traps and greens, and he looks at the blades of grass for the quality of the blade, not for measurement.
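A balance-sheet shock test of the kind described can be sketched as applying instantaneous shocks to asset values and recomputing the surplus. The balance sheet figures, asset classes and shock sizes below are hypothetical illustrations, not regulatory scenarios:

```python
# Hypothetical balance sheet (in $M) and illustrative shock scenarios.
assets = {"equities": 400.0, "bonds": 500.0, "real_estate": 100.0}
liabilities = 850.0

scenarios = {
    "equity_crash": {"equities": -0.30},
    "rate_spike": {"bonds": -0.10, "real_estate": -0.05},
}

def shocked_surplus(assets, liabilities, shocks):
    # Apply each percentage shock to the matching asset class,
    # then recompute surplus = total assets - liabilities.
    shocked = {k: v * (1 + shocks.get(k, 0.0)) for k, v in assets.items()}
    return sum(shocked.values()) - liabilities

for name, shocks in scenarios.items():
    print(name, round(shocked_surplus(assets, liabilities, shocks), 1))
```

The granular, policy-level baseline is what makes such tests trustworthy: the shocked aggregates are only as reliable as the underlying data they roll up from.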
Q: Is there a limit to the amount of data that technology systems will be able to process? Is it possible to envision an age of big, bigger and biggest data?
A: There are few limits thanks to massively parallel processing (MPP), which allows carriers to add database nodes until the desired response time is achieved. This is one of the few examples in data processing where one can invest in a performance issue and see dramatic results. Today, nearly every major technology manufacturer has some form of high-speed MPP hardware and software, providing nearly limitless capability to collect massive amounts of data and provide the results in the desired timeline.
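Under the idealized assumption of near-linear MPP scaling, "add nodes until the desired response time is achieved" reduces to simple arithmetic. The helper below is a sketch of that idealization; real systems lose some efficiency to coordination overhead:

```python
import math

def nodes_needed(single_node_minutes, target_minutes):
    # Assuming near-linear MPP scaling (an idealization), the node
    # count needed is the ceiling of the required speedup ratio.
    return math.ceil(single_node_minutes / target_minutes)

# The 24-hour query brought down to about 20 minutes implies roughly:
print(nodes_needed(24 * 60, 20))  # 72
```

This is why MPP is one of the few cases where spending on a performance problem maps so directly to results: each added node takes an equal share of the scan.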
Q: Are current risk management information systems capable of handling this mountain of data?
A: Yes. Today, companies are actively pursuing big data projects; however, best practices reveal that working with a vendor that is knowledgeable about calculation engines such as actuarial processing, and that has a vision for back-end master data management or big data repositories, is a carrier's best bet for getting it done right on the first try. Carriers working with service providers who are not anchored to a calculation engine's capabilities or results risk misrepresentation of the data, especially at risk aggregation levels where the chief actuary is depending on the reliability of the underlying data.
Q: If not, what are carriers and their service providers doing to prepare for the era of massive data?
A: Service providers typically have frameworks that can perform the horizontal ETL (extraction, transformation and load); however, they have little experience in the verticalization and aggregation of information. It is critical to have both in order to be successful.
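The two halves can be sketched side by side: a horizontal ETL pass that parses, cleans and loads source records, followed by a vertical aggregation that rolls the loaded rows up by line of business. The delimited source format, field names and amounts are hypothetical:

```python
# Hypothetical delimited source extract, e.g. from a policy admin system.
raw_rows = [
    "POL001|annuity|1200000",
    "POL002|annuity|800000",
    "POL003|auto|300000",
]

def extract(rows):
    # Extract: parse delimited source records into dictionaries.
    fields = ("policy", "line", "liability")
    return [dict(zip(fields, r.split("|"))) for r in rows]

def transform(records):
    # Transform: cast types so the values are usable downstream.
    for rec in records:
        rec["liability"] = float(rec["liability"])
    return records

def load(records, warehouse):
    # Load: append into the (here, in-memory) warehouse table.
    warehouse.extend(records)
    return warehouse

def verticalize(warehouse):
    # Vertical aggregation: roll warehouse rows up by line of business.
    totals = {}
    for rec in warehouse:
        totals[rec["line"]] = totals.get(rec["line"], 0.0) + rec["liability"]
    return totals

warehouse = load(transform(extract(raw_rows)), [])
print(verticalize(warehouse))
```

A framework that stops at `load` delivers clean rows but no answers; the `verticalize` step is where the information the chief actuary consumes actually gets built.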
Q: Among carriers, are personal lines carriers ahead or behind commercial lines carriers in preparing for this new age?
A: Personal lines, life and annuity probably have the lead on managing "big data" because they have the greatest need for legacy information to help predict future risk. Commercial lines (such as mid-market) also require historical data in order to quickly generate new proposals; however, the real need in that industry is flexibility in product configuration on the fly, not necessarily "big data" analytics.
Q: Among commercial lines carriers, which are most likely to need big data?
A: The megacarriers that have multinational, multiconglomerate lines of business need to have a strong grip on risk management across operational, credit, insurance and market measures.
Q: Does good management of data equate to better management of enterprise risks?
A: Yes, indeed. If a carrier implements a comprehensive master data management program, it has a much better handle on the quality of its information, and that's the critical aspect. While people remain a company's greatest asset, the "big data" discipline transcends any one person's position within a carrier by handing down a concrete legacy of data and calculation processes that sustains enterprise risk management generation after generation.
Q: Is the insurance industry in the era of big data? Or is this era still a few years away?
A: The life sector, especially complex life products such as whole life, variable universal life and annuities, requires the most data for analytics and offers the most profitability for carriers. The era of big data is now, and has been present in the industry for some time, especially for those who understand the art of data management. What is new is that "big data" has taken the sophistication of massively parallel processing and presented it in a more simplified way, in the form of a black-box appliance for turn-key implementations.
April 13, 2012
Copyright © 2012 LRP Publications