In 2005, Hurricane Katrina surged into the national spotlight, earning its place as the costliest natural disaster in U.S. history, with record-breaking devastation that caused more than $80 billion in losses. More than anything, Katrina highlighted the major challenge many industries face in handling these types of natural disasters--both economically and operationally. Nowhere is this cause for concern more evident than in the insurance industry and its ability to assess, manage and mitigate its exposure to these types of events.
While there may be debate about whether there are more catastrophic hurricanes now than in the past, or whether climate change is to blame, one fact is clear: Demographic changes in the United States have caused risk concentrations in areas that are prime targets for record-breaking losses.
A case in point: Never have so many high-value American homes been built so close to vulnerable shorelines.
In 2004, insured coastal exposure neared $2 trillion in both Florida and New York; $740 billion in Texas; $662 billion in Massachusetts; and more than $500 billion in New Jersey. On the Gulf and Atlantic coasts, from Texas to Maine, insured coastal exposure totaled more than $7.2 trillion.
THE RISING IMPORTANCE OF RISK DATA
With the advent of new technologies--particularly around location intelligence, geographic mapping, catastrophe models and predictive analytics--insurance carriers have vastly improved their capabilities in exposure management over the years. But the losses from Katrina, as well as other natural disasters, have shown that there is still much room for improvement.
The good news is that the availability of risk data--both real-time and historical--gives insurance risk managers the perspective on past weather events and natural disasters they need to better prepare for future ones. It is enabling insurers to further limit and manage their liability and exposure to risks.
There is a host of new risk data that risk managers can choose from. These types of data include:
- Earthquake (fault lines/fault zones and earthquake epicenters)
- Coastline (National Oceanic and Atmospheric Administration shoreline)
- Weather (hailstorms, hurricanes, tornadoes, windstorms and lightning)
- Windpool (state windpool eligibility zones)
- Mass Movement (lava flow and sinkholes)
- Fire Departments/Fire Stations
Based on a site's exposure to severe weather, natural hazards and other risk factors, these types of data can help address critical decisions in areas such as: underwriting policy analyses and rating; custom rating territory creation and maintenance; credit and risk scoring; portfolio risk aggregation and realistic disaster scenario analyses; and claims planning and preparedness.
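One of the decision areas named above, portfolio risk aggregation, can be illustrated with a minimal sketch: summing total insured value (TIV) by hazard zone and flagging zones that exceed a risk-appetite limit. The policy records, zone names and dollar figures below are hypothetical, not drawn from any carrier's data.

```python
# Sketch: aggregating insured exposure by hazard zone to spot risk
# concentrations. All records and thresholds here are illustrative.
from collections import defaultdict

def aggregate_exposure(policies):
    """Sum total insured value (TIV) per hazard zone."""
    totals = defaultdict(float)
    for p in policies:
        totals[p["zone"]] += p["tiv"]
    return dict(totals)

def flag_concentrations(totals, limit):
    """Return zones whose aggregate exposure exceeds a risk-appetite limit."""
    return [zone for zone, total in totals.items() if total > limit]

policies = [
    {"id": "P1", "zone": "FL-coastal", "tiv": 450_000},
    {"id": "P2", "zone": "FL-coastal", "tiv": 600_000},
    {"id": "P3", "zone": "TX-windpool", "tiv": 300_000},
]
totals = aggregate_exposure(policies)
print(totals)
print(flag_concentrations(totals, limit=1_000_000))  # FL-coastal exceeds the limit
```

In practice the "zone" would come from geocoding each risk against the hazard layers listed above (windpool eligibility zones, fault zones, coastline buffers), but the aggregation step itself is this simple.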
RATE CALCULATION AND POLICY UNDERWRITING
Precise historical data allows risk managers to make accurate rating calculations, pricing and underwriting decisions based on a site's exposure to fault lines and flood plains, as well as its susceptibility to hurricanes, tornadoes and other acts of nature. Rates can be determined or negotiated based on the frequency and severity of damage from natural disasters, perilous weather or the extent of environmental hazards.
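The frequency-and-severity logic described above can be sketched in a few lines: the expected annual loss (pure premium) is exposure multiplied by event frequency and a mean damage ratio, then grossed up for expenses and profit. The loads and example figures below are assumptions for illustration, not actual rating factors.

```python
# Sketch of frequency/severity-based rating. All factors are illustrative.

def pure_premium(tiv, annual_frequency, mean_damage_ratio):
    """Expected annual loss = insured value x event frequency x severity ratio."""
    return tiv * annual_frequency * mean_damage_ratio

def gross_premium(pure, expense_load=0.30, profit_load=0.05):
    """Gross up the pure premium with assumed expense and profit loads."""
    return pure / (1 - expense_load - profit_load)

# Hypothetical coastal home: $500,000 insured value, a 1-in-100-year event,
# 20% average damage when the event occurs.
pure = pure_premium(500_000, annual_frequency=0.01, mean_damage_ratio=0.20)
print(round(pure, 2))            # expected annual loss
print(round(gross_premium(pure), 2))
```

Historical hazard data feeds the two key inputs: frequency comes from how often events have struck the site's zone, and the damage ratio from how severe past losses there have been.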
With the historical perspective provided by risk data, businesses and public entities can determine their exposure to natural disasters, severe weather, and the damages and injuries that might be caused. This vital information can aid in emergency preparedness, pinpointing the best ways to provide disaster notification, plan evacuation routes and set up relief shelters.
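The preparedness tasks above, such as planning evacuation routes and siting relief shelters, rest on simple location intelligence: computing the distance from a risk to candidate sites and picking the nearest. A minimal great-circle (haversine) sketch follows; the city coordinates are approximate and the shelter list is hypothetical.

```python
# Sketch: finding the nearest relief site by great-circle distance.
# Coordinates are approximate; the shelter list is illustrative.
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_site(origin, sites):
    """Return (name, distance_in_miles) of the site closest to origin."""
    name = min(sites, key=lambda n: haversine_miles(*origin, *sites[n]))
    return name, haversine_miles(*origin, *sites[name])

shelters = {
    "Baton Rouge": (30.45, -91.19),
    "Houston": (29.76, -95.37),
}
print(nearest_site((29.95, -90.07), shelters))  # from New Orleans
```

The same distance calculation underlies proximity checks against the fire-station and coastline layers listed earlier.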
An unpredictable event such as the wildfires in Southern California last year is just one example of how historical data can help insurers limit and manage their liability and exposure to risks. Today's location intelligence technology lets carriers price that exposure into their rates.
IN WITH A BANG, OUT WITH A WHIMPER?
Measured by insured losses from natural disasters, 2005 stands as the worst year on record. So the question looms: Was Katrina just an aberration, or was it a harbinger of what is to come? The following two years--2006 and 2007--were relatively quiet, and the industry breathed a quiet sigh of relief.
More importantly, will the industry continue to take a long-term view of catastrophe management and continue to follow the fundamentals of risk-based pricing and underwriting integrity? The jury is out on that one.
In a recently released research report from TowerGroup, a leading industry analyst firm, the authors stated their belief that, in order for insurance carriers to be successful, they must be able to manage risk through the use of such leading technologies as predictive modeling, geo-location and risk mapping. The authors argued that, while it is "absolutely critical that catastrophe modeling be in place at all carriers," a percentage of insurers today still lack this capability or do not apply it on an enterprise level.
In either case, the technology exists to help carriers better manage their risk exposure in catastrophe-prone areas, as well as other parts of the organization. Taking advantage of that technology--and the risk data that supports it--can only help those carriers that want to leverage it into a competitive, leadership position.
BILL SINN is the strategic industry manager, insurance, for Pitney Bowes MapInfo and Group 1 Software.
April 1, 2008
Copyright © 2008 LRP Publications