By BILL HARTNETT, director, insurance solutions, U.S. Financial Services Group, Microsoft Corp.
Insurance companies sell a unique product: a promise. A promise that if disaster strikes, financial support will be in place to help victims put their lives back together.
But insurers can't make this promise based on altruism. The key concepts behind insurance--the law of large numbers and spread of risk--allow insurers to charge all customers a small known amount, the premium, and when disaster does strike, provide financial and other resources when they are most sorely needed to those affected.
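The law of large numbers can be seen in a minimal simulation (all figures here are invented for illustration): each policy carries a 1 percent chance of a $100,000 loss, so the expected cost per policy is $1,000. For a small pool the realized average swings wildly; for a large pool it converges on the expectation, which is what lets an insurer charge a small, known premium.

```python
import random

# Illustrative only: 1% chance of a $100,000 loss per policy,
# so the expected loss per policy is $1,000.
random.seed(42)

def average_loss(num_policies, p_loss=0.01, severity=100_000):
    # Simulate one year of losses and return the average per policy.
    total = sum(severity for _ in range(num_policies) if random.random() < p_loss)
    return total / num_policies

for n in (100, 10_000, 1_000_000):
    # Larger pools land closer to the $1,000 expectation.
    print(f"{n:>9} policies: average loss per policy ${average_loss(n):,.0f}")
```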
While this sounds pretty simple on its face, the underlying mechanisms are very complex. How likely is it that a disaster or an accident will occur? What are the odds that certain policyholders will be affected by events like hurricanes, earthquakes or floods? What is the maximum level of risk an insurer should assume in a given geographic area? All of these decisions are based on extremely complex algorithms that are in fact the true competitive differentiation or the "secret sauce" behind all successful insurers.
Insurance companies deal with questions like these daily, in essence trying to predict the future from what they know about the past.
In a perfect world, insurers would be able to exactly predict how often disasters will strike, how bad they will be, effortlessly dispatch resources to help victims, and at the same time generate sufficient revenue for consistent return on investment for their shareholders. Policyholders would be able to find affordable insurance no matter where they lived and would feel completely secure. But the real world is not perfect, and the art and science of insurance underwriting can--and does--easily go awry.
In some cases, insurance companies may underestimate the amount of risk involved in insuring certain regions, or even specific policyholders, and find themselves digging deeply into reserves to pay what's needed to help victims and maintain enough cash on hand to cover operating expenses.
The worst thing that can happen, both for insurers and for policyholders, is that an insurer is unable to fulfill the promise it has made. It may have made bad actuarial assumptions, misjudging the frequency or severity of natural disasters, or been caught unprepared to respond properly when disaster strikes, as happened with Hurricane Katrina in August 2005. While insolvencies are very infrequent, insurance companies do go bankrupt, and while state insolvency funds are there as a backstop, the promise has still been broken.
But with the increasing availability of high performance computing (HPC) solutions, insurers are able to assess many risk scenarios more accurately and deeply, benefiting both insurers and their policyholders. Since insurance companies base their business model on the ability to collect, correlate and analyze data, having access to the best data is crucial, but not enough. They must also be able to analyze it to make pricing and other decisions.
Many insurers already use high-powered software and computers to run the complex calculations involved in risk assessment. These applications are most often run on "supercomputers" that are extremely expensive to acquire, program and maintain. Software engineers responsible for these systems spend their time developing the intricate code required to divide computational jobs across the many processors--or in some cases groups or clusters of machines--that give these solutions their enormous power. Very little time is spent on perfecting what makes each insurance company unique--its underwriting algorithms.
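The divide-and-aggregate pattern those engineers implement can be sketched in a few lines. This is a deliberately simplified stand-in: `expected_loss` is a hypothetical placeholder for a real actuarial model, and a cluster scheduler would distribute work across machines rather than local processes, but the shape of the job division is the same.

```python
from multiprocessing import Pool

def expected_loss(policy):
    # Stand-in for a real actuarial model: frequency x severity.
    return policy["frequency"] * policy["severity"]

if __name__ == "__main__":
    # A tiny illustrative portfolio; real books hold millions of policies.
    portfolio = [{"frequency": 0.01, "severity": 50_000 + 1_000 * i}
                 for i in range(8)]
    with Pool(processes=4) as pool:
        # Each worker scores a slice of the portfolio in parallel;
        # the results are aggregated back in the parent process.
        losses = pool.map(expected_loss, portfolio)
    print(f"Total expected loss: ${sum(losses):,.0f}")  # $4,280
```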
But over the last 30 years, a revolution has been in progress--the relentless commoditization of technology. Computing power that used to be available to only the richest companies is now essentially free. Software firms like Microsoft are taking advantage of this revolution to deliver faster, more affordable and more agile software solutions that can make any task far easier and cheaper, especially those that require large amounts of computational power. These capabilities have been around a long time, but they've been exorbitantly expensive. What we're doing at Microsoft is making them available at commodity prices--HPC for the masses.
With affordable computing power available to anyone, insurers can now devote more resources to building and refining their risk analysis techniques. Actuarial models that used to take days to run and were often outsourced to cut costs can now be run in hours in-house. Insurance is an industry drowning in data. The ability to take years and years of pre- and post-loss data and compare assumptions with actual results allows more accurate pricing. Not long ago even the most sophisticated carriers had essentially three pricing tiers: substandard, standard and preferred.
No longer. More granular analysis allows many more pricing tiers, matching premium more accurately to the actual risk.
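One way to picture granular tiering is to compute each policyholder's pure premium (expected annual loss, i.e. claim frequency times average severity) and map it onto a fine-grained tier ladder. The tier boundaries and figures below are invented for illustration, not any carrier's actual rating plan.

```python
def pure_premium(frequency, severity):
    # Expected annual loss = claim frequency x average claim severity.
    return frequency * severity

def assign_tier(premium, boundaries=(400, 800, 1_200, 1_600, 2_000)):
    # Returns tier 1 (lowest risk) through len(boundaries)+1 (highest risk).
    # Real rating plans use far more tiers and many more rating variables.
    for tier, upper in enumerate(boundaries, start=1):
        if premium <= upper:
            return tier
    return len(boundaries) + 1

print(assign_tier(pure_premium(0.02, 30_000)))  # $600 expected loss -> tier 2
print(assign_tier(pure_premium(0.05, 45_000)))  # $2,250 expected loss -> tier 6
```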
Of course, the best risk management is prevention, and the benefits of HPC solutions don't stop once the policy is written. Hurricanes can be tracked in real time, and strike-path "war games" can be simulated to determine how much value is at risk. As a result, insurers can be proactive, predicting what resources will be needed in the aftermath of a disaster before it hits: determining in advance how to allocate building materials, moving skilled laborers and adjusters into position, or even helping policyholders secure their property in advance to help stem losses.
While managing risk is the most visible aspect of the insurance business, many would say that managing investments is really the core competency of the industry. To fulfill the promise made through an insurance contract, insurers collect billions of dollars in premiums and hold them in reserves. To maximize income and profit, these reserves are invested until they are needed to pay claims. In fact, there have been many years when the industry takes in less in premiums than it pays out in claims. Sometimes it can make business sense to intentionally run a combined ratio over 100 percent to generate more funds to invest. This practice is called "cash flow underwriting," and it peaked in the 1980s.
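The combined ratio mentioned above is simply (incurred losses + expenses) divided by earned premium; a ratio over 100 percent means an underwriting loss, which investment income on reserves can still turn into an overall profit. A worked example with invented figures:

```python
def combined_ratio(losses, expenses, premium):
    # Over 1.0 (100%) means the insurer paid out more in losses and
    # expenses than it earned in premium.
    return (losses + expenses) / premium

premium, losses, expenses = 100_000_000, 75_000_000, 30_000_000
ratio = combined_ratio(losses, expenses, premium)
underwriting_result = premium - losses - expenses      # -$5M underwriting loss
investment_income = 0.06 * 150_000_000                 # 6% return on $150M reserves

print(f"Combined ratio: {ratio:.0%}")                          # 105%
print(f"Net result: ${underwriting_result + investment_income:,.0f}")  # $4,000,000
```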
Outside of insurance risk management, one of the most common and effective uses of HPC is in investment management. The most significant innovation on Wall Street has been quantitative analysis, which would not exist without the computing power HPC provides. In fact, the major innovation of the quants was risk management applied to money in the same way insurers had applied it to property and casualty exposures. The democratization of HPC technology has made this power available to nearly any size insurer, and also to risk managers at any large corporation. If your organization has a trading operation, you can and should be using HPC to manage your money.
Finally, innovative insurers are starting to use this technology to transform their wealth management offerings to end customers. Through the power of solutions leveraging HPC, agents can offer sophisticated portfolio management techniques like Monte Carlo simulation or stochastic modeling to high net worth clients or even the mass affluent. These very powerful tools have been in use for many years, but they were expensive and time-consuming. HPC takes applications that required supercomputers and overnight processing and makes them available to anyone in the organization. Models that took hours to run can now be executed in minutes, transforming investment advice from a game of phone tag into a true collaborative experience. Combined with familiar tools like Excel for further analysis, HPC brings amazing power to virtually anyone who knows how to open a spreadsheet.
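A minimal Monte Carlo portfolio sketch shows the idea: simulate thousands of possible market paths and read off the distribution of outcomes rather than a single forecast. The model below assumes annual returns are normally distributed (mean 7 percent, standard deviation 15 percent), a deliberate simplification; real stochastic models are far richer, and every parameter here is illustrative.

```python
import random

random.seed(0)

def simulate_balance(start=500_000, years=20, mu=0.07, sigma=0.15):
    # One possible 20-year path: compound a random annual return each year.
    balance = start
    for _ in range(years):
        balance *= 1 + random.gauss(mu, sigma)
    return balance

# Run many trials and summarize the distribution of ending balances.
trials = sorted(simulate_balance() for _ in range(10_000))
median = trials[len(trials) // 2]
worst_5pct = trials[len(trials) // 20]  # 5th-percentile (downside) outcome

print(f"Median ending balance:  ${median:,.0f}")
print(f"5th-percentile balance: ${worst_5pct:,.0f}")
```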
Another hotbed of HPC is the reinsurance industry, which in essence insures the insurers. In the wake of hurricanes Andrew and Iniki in 1992, the reinsurance industry started reexamining fundamental concepts of property underwriting, like the "100 year storm," and what intensive coastal development had done to spread of risk. Their pioneering work has helped the industry better understand how to think about the frequency and severity of natural disasters and how to plan for them.
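The "100 year storm" is often misunderstood: it is a storm with a 1 percent chance of occurring in any given year, not one that arrives on a 100-year schedule. The probability of seeing at least one such event over an n-year horizon is 1 - (1 - 1/100)^n, which adds up faster than intuition suggests:

```python
def prob_at_least_one(return_period, years):
    # Annual probability is 1/return_period; the complement rule gives the
    # chance of at least one occurrence over the horizon.
    return 1 - (1 - 1 / return_period) ** years

print(f"{prob_at_least_one(100, 30):.0%}")   # ~26% over a 30-year mortgage
print(f"{prob_at_least_one(100, 100):.0%}")  # ~63% over a full century
```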
HPC can help reinsurers develop assessments not only of the risks from hurricanes, earthquakes, floods or other natural disasters, but also integrate the vulnerabilities of specific properties. Many reinsurers combine the use of HPC with resources like structural engineers to determine how to keep damage from happening in the first place. In Florida, for instance, building codes have changed to require structures that are more resistant to high winds and water damage. This not only helps the industry control claims costs, but more importantly it makes policyholders safer and more secure, and keeps insurance both available and affordable.
Most importantly, for insurance firms and their policyholders, the use of HPC can help ensure that the industry delivers on its most important tenet of all: keeping its promise.
May 1, 2010
Copyright © 2010 LRP Publications