BRUCE F. NORRIS and KENYON HALL, both principals at Edgewood Partners Insurance Center (EPIC), a California-based retail property, casualty and employee benefits insurance brokerage
Catastrophe models have evolved significantly in recent years and will continue to gain accuracy. Knowing and accepting this fact will help industry professionals realize that catastrophe modeling is not a perfect science, but it is improving. It is impossible to forecast the next natural disaster or catastrophe, but it is within reach to understand the implications of these risks and be prepared to manage them.
The new 2009 models have been in use for several months, and it is worth asking whether they have performed in line with industry expectations, particularly for the earthquake peril. From the insurance broker's perspective, evaluating individual client portfolios, the answer is yes and no.
On an aggregate basis, the changes to loss estimates announced by the modelers seem to track with actual results. However, when focusing on the 250- to 1,000-year return periods, there are numerous cases where individual portfolio losses deviate noticeably from those expectations.
Consider these observations on the recent changes to the way the models handle California earthquakes:
-- San Francisco earthquake losses generally decreased more than Los Angeles losses.
-- Reinforced masonry construction saw larger decreases than wood construction.
-- Older construction did not see as much change as predicted.
-- Increases in top-end losses were not substantial, reflecting a point of diminishing returns. Previously, as the intensity of an earthquake increased, loss estimates for structures close to the epicenter continued to increase as a function of that intensity. Now, beyond a certain intensity, the modeled damage remains essentially the same no matter how hard the structure is shaken.
-- If a portfolio included secondary modifiers, the variance between model versions appears to be smaller.
-- Generally, the average annual losses (AAL) track closely to expected changes published by modelers.
The above are all results for the California earthquake peril, which is where attention was focused on the expected decreases. It is crucial to watch other areas as well, such as the Pacific Northwest and Alaska, which saw major increases in the models.
Finally, each modeler uses the available information from earth scientists, geologists and the USGS, among others, differently and incorporates this data as they feel appropriate, leading to variances in loss estimates between modeling firms.
NEW MODELS AND PRICING
Lower loss estimates in California do not always translate to lower prices, at least on certain layers. Difference-in-conditions (DIC) coverage is a commodity. The insurance market reacts after a large earthquake or hurricane, and prices naturally go up, but this is completely independent of model results or changes.
For property brokers, nevertheless, catastrophe models can be used to help guide decisions on limits and deductibles; set renewal expectations for pricing and availability of capacity, both inside and outside of various PML levels; and structure shared and layered property programs.
From a property-broking standpoint, proper layering and structure is one of the most important uses of CAT models. A program with improper layering and an improper DIC attachment point can cost a client hundreds of thousands of dollars in property premium. Modeling can also guide decisions on the structure of multiple towers for a single client. The general rule of thumb is that more separate placements come at a higher cost, but if organized properly, this is not always the case. Reviewing different options can lead to greater savings for clients.
Allocations are another key area in which modeling is useful. Modeling tells the story of a portfolio or a single location and the relationships within the portfolio. For many risk managers, the process of allocating property premiums and the methods used can be a subject for heated conversations. Modeling can provide key findings and substantiated evidence during this process.
Both the method of charging all California locations the same premium based on each location's total insured value, for instance, and the slightly newer approach of charging different allocation rates based on the California Department of Insurance earthquake zone, are outdated. The modern approach, with the advent of probabilistic CAT models, is to calculate an AAL by location and to use this during the allocation process.
This AAL approach accounts for each location's attributes (e.g., construction, year built, occupancy), its value at risk and its physical location, and then fully reviews the entire spectrum of loss scenarios.
Structures with identical attributes and values located in either highly active or relatively low-risk areas will have greatly different AALs. Using these AALs, we can adjust the rate by location with relative ease, while still applying a "minimum premium" factor.
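A minimal sketch of this AAL-driven allocation logic is shown below. The location names, AALs, program premium and minimum premium are all hypothetical and purely illustrative; they do not come from any actual placement or model output.

```python
# Illustrative sketch of AAL-based premium allocation; all figures are hypothetical.
# Each location's share of the total program premium is proportional to its modeled
# average annual loss (AAL), subject to a minimum premium per location.

locations = {
    # location name: modeled AAL in dollars (hypothetical)
    "Warehouse - Los Angeles": 120_000,
    "Office - Sacramento": 2_000,
    "Plant - Seattle": 60_000,
}

program_premium = 500_000   # total property premium to allocate (hypothetical)
minimum_premium = 10_000    # floor applied to every location (hypothetical)

total_aal = sum(locations.values())

# First pass: allocate premium in proportion to each location's AAL.
allocation = {loc: program_premium * aal / total_aal for loc, aal in locations.items()}

# Second pass: apply the minimum premium, then rescale the remaining locations
# so the total allocation still equals the program premium.
floored = {loc for loc, premium in allocation.items() if premium < minimum_premium}
if floored:
    remaining_premium = program_premium - minimum_premium * len(floored)
    remaining_aal = sum(aal for loc, aal in locations.items() if loc not in floored)
    for loc in allocation:
        if loc in floored:
            allocation[loc] = minimum_premium
        else:
            allocation[loc] = remaining_premium * locations[loc] / remaining_aal

for loc, premium in allocation.items():
    print(f"{loc}: ${premium:,.0f}")
```

In practice the same logic would run over the modeled AAL for every location in a portfolio, with the minimum premium and any peril- or zone-specific adjustments agreed between the broker, the risk manager and the markets.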
NOT JUST A PML
This illustrates a key modeling lesson: Do not settle for just "getting a PML"; learn to direct the process and results.
In the insurance industry, "probable maximum loss" is the most widely used definition for the acronym PML. However, this expression does not mean the same thing to everyone.
The Uniform Building Code includes the 475-year-return-period requirement for life-safety design of a building or structure, which represents a 10 percent probability of exceedance in 50 years if 50 years is the expected life of a building. ASTM E2026-99 recently introduced the term PL (Probable Loss), which is nearly synonymous with PML.
Another perspective pertains to banks and lending institutions, which by nature are conservative organizations. The use of a 20-year or 30-year exposure period (a PML with roughly a 190-year or 285-year return period) appears more appropriate, given an average loan life of less than 10 years.
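For reference, under the standard assumption that annual exceedance events are independent, the exceedance probability P over an exposure period of n years and the return period T are related by

\[
P = 1 - \left(1 - \tfrac{1}{T}\right)^{n}
\qquad\Longleftrightarrow\qquad
T = \frac{1}{1 - (1 - P)^{1/n}}
\]

With P equal to 10 percent, this gives a return period of roughly 475 years for a 50-year exposure period, 285 years for 30 years and 190 years for 20 years, consistent with the figures above.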
Risk managers and property owners, on the other hand, should not focus on a specific PML but instead on the entire loss curve with all levels of return periods--and specifically on the probabilities of exceedance. Understanding what the loss potential is at 100-year, 250-year, 475-year and even up to 10,000-year return periods is important. The risk manager or property owner can make fully informed decisions by applying this overall understanding to a company's risk tolerance, fiduciary responsibilities, the price of insurance, the cost benefits of retrofit versus risk transfer, and other criteria of this nature.
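As a simple illustration of working with the full curve rather than a single PML, the sketch below reads losses off a modeled exceedance-probability (EP) curve at the return periods mentioned above. The curve points are invented for illustration only, not actual modeler output.

```python
# Illustrative sketch: interpolating a hypothetical modeled EP curve at several
# return periods. All curve points are invented for illustration.
import numpy as np

# Modeled EP curve: (annual exceedance probability, loss in $ millions), hypothetical.
ep_curve = [
    (0.0001, 250.0),   # 10,000-year loss
    (0.001,  140.0),   # 1,000-year loss
    (0.002,  110.0),   # 500-year loss
    (0.004,   80.0),   # 250-year loss
    (0.01,    45.0),   # 100-year loss
    (0.04,    12.0),   # 25-year loss
]

probs = np.array([p for p, _ in ep_curve])
losses = np.array([loss for _, loss in ep_curve])

for return_period in (100, 250, 475, 10_000):
    annual_prob = 1.0 / return_period
    # np.interp expects ascending x values; the probabilities above are ascending.
    loss = np.interp(annual_prob, probs, losses)
    print(f"{return_period:>6}-year return period: ~${loss:,.0f}M modeled loss")
```

Looking at the whole curve this way, rather than a single point on it, is what allows the loss potential to be weighed against risk tolerance, insurance pricing and retrofit options.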
QUALITY OF DATA
This point has been covered often by the industry, but it is worth restating as bad data continues to be used in the models.
The difference in loss estimates between a wood and a concrete structure in Florida, or between a concrete structure built in 1930 and one built in 2000 in California, can be enormous. Not having complete and accurate data (e.g., construction type, occupancy, year of construction and number of stories) should no longer be acceptable. The models have made great progress and get better every year, but with missing or incorrect data, the quality of the model becomes irrelevant.
Two points often overlooked while updating or collecting data are values and addresses. Do you have, and is the modeler using, an accurate street address? This factor determines the soil type, the distance from the coast and the elevation of the property, all of which can greatly affect the modeling results.
If the values are off by 10 percent, the loss estimates are off by 10 percent, which raises the question, "Do you have accurate replacement costs?"
Updating the values in a portfolio will often result in a higher total insurable value and a corresponding increase in premium. It is far better to address that increase up front than to discover, when an incident actually happens, that the portfolio is underinsured by millions of dollars.
The topic of secondary building characteristics was widely discussed a few years ago but now seems to have faded from the spotlight. A number of companies embraced the task of researching secondary building characteristics and invested in the resources needed to gather the information. These companies have reaped the benefits of this effort and will continue to do so.
For example, in one recent placement, the risk management department contracted with a third-party engineering firm to gather secondary characteristics. The resulting premium savings on a specific layer of the program amounted to nearly 1,800 percent of the cost of collecting the data. Many risk managers would consider this a good rate of return. With a cost-benefit profile like this, investing the energy and resources to gather the data is the logical choice. Third-party engineering firms can accomplish this task for a few hundred dollars per location, an inexpensive venture for a multi-million-dollar structure.
Still, a majority of companies have not incorporated this process.
Which takes us back to our main point: Manage the process, don't let it manage you.
June 1, 2010
Copyright © 2010 LRP Publications