Rising Threats in the Public Sector
In Orange, Texas, a March public meeting to review storm surge suppression options for the Gulf Coast Community Protection and Recovery District had to be rescheduled to April 14 because of — wait for it — widespread flooding.
Irony aside, nearly 140 miles of the Sabine River flooded for four days, closing down almost all major roads that crossed this watery Texas-Louisiana border.
The flood and delayed meeting are a clear example of the very real impact climate change is having on low-lying areas along U.S. coasts. The Texas Governor’s Commission for Disaster Recovery and Renewal estimates that it may cost as much as $11 billion to protect that area of the Gulf Coast.
That may be a worthwhile investment, however: Hurricane Ike in 2008 wrought $29 billion in property damage, making it the most expensive storm in the state’s history and the third costliest in the country, according to a Texas Engineering Extension Service report.
Texas officials are not alone in facing such challenges.
Coastal communities such as New York City, Norfolk, Va., Miami-Dade County, and Seattle lie at or near sea level, making them vulnerable to storm surges and flooding, especially in the face of the rising sea levels and increased storm activity predicted for the coming decades.
The key question for the public sector is how to affordably protect both residential and business properties as well as public infrastructure such as roads, bridges, utilities, water treatment plants, and other assets.
For many communities, the answer is to do more accurate risk modeling; do short- and long-term planning; build traditional “gray” defenses such as dams, levees, walls, and sea gates as well as natural “green” defenses to reduce storm surge impact; and work with insurers and reinsurers to rebound from losses made inevitable by climate change.
“Nobody is sticking their head in the ground as far as I can see,” said William F. Becker, Aon Risk Solutions’ national public sector practice leader. Instead, communities are balancing what can be done over the next few decades with current engineering and technology, while keeping an eye out for what may be possible with new technology by the year 2100.
However, budgetary pressures make many large-scale projects unaffordable, so public entities are maintaining the current infrastructure as best they can for now, said Becker, whose company also helps to administer the National Flood Insurance Program (NFIP).
Some cities have begun implementing protective strategies with a lower price tag, such as prohibiting land clearing near flood-prone areas, planning for public green spaces to absorb water, elevating streets over time, providing dunes on the coast, and heightening sea walls, among other solutions.
Tackling the problem with small, affordable strategies will give public sector organizations more time “until new and highly technical strategies can save these vital communities going forward,” Becker said.
Others are looking to partnerships to broaden their access to solutions.
One example is Miami-Dade County, which launched an innovative program in partnership with The Nature Conservancy (TNC), catastrophe modeler Risk Management Solutions (RMS), engineering company CH2M, and the American Red Cross/Red Crescent’s Global Disaster Preparedness Center.
The collaborators will work on various aspects of two demonstration projects that will measure the effects of green defenses such as salt marshes and mangrove forests in protecting the region from the severe storms that arise in the Atlantic’s Hurricane Alley.
This is the latest research project RMS has undertaken to help model the protective impact of biological defenses. Working with communities on both coasts — including Norfolk, Va., and the Seattle region’s Puget Sound — “RMS has begun showing how we can quantify the reduction in risk using our storm surge modeling capability for coastal risks,” said the modeler’s Chief Research Officer Robert Muir-Wood.
“Our [modeling looks at] the full sweep of potential hurricanes and the storm surges they generate, and takes it all the way through to the damage and the quantified loss to property, which may be inland of the coastal areas,” Muir-Wood explained.
“We can actually quantify the benefits of one form of coastal defense [marshes]. That opens up a much bigger conversation to not only other classes of biological defenses but also to think through how can you combine a mixture of biological and gray defenses to provide really good protection.”
This kind of assessment is critical for communities to understand the complete impact of storms on local government budgets, noted Kathy Baughman McLeod, managing director of coastal risk and investment at TNC.
“We have found that local governments and governments in general don’t know what their risk is,” said McLeod. “And then, when they make [storm-related] repairs, they pay for it out of all different buckets [such as public works, parks and recreation, etc.].
“When we ask, ‘What are you spending each year?’ they don’t know. They have the awareness but the [quantitative analysis] is not there.”
Impact on Rates
TNC’s project will help develop trusted metrics about both green and gray defenses that can eventually be adopted to set more accurate NFIP prices and quantify total risk. In addition, on April 1, the Federal Emergency Management Agency (FEMA) implemented more rigorous guidelines for accurately assessing flood risk to help set more appropriate insurance rates.
While property owners and local governments in some communities will pay higher rates that are more commensurate with the actual risk, communities that adopt better defense strategies will see their rates decrease.
For example, New Orleans got good news from FEMA in April, nearly 11 years after Hurricane Katrina devastated the city. With about $14.6 billion in improvements made by the U.S. Army Corps of Engineers, the below-sea-level area now has a new defensive ring of protective levees, floodwalls, and floodgates.
Work is continuing on renovations to the city’s drainage system as well. For many residents and businesses, the new maps mean lower insurance rates.
Muir-Wood said he expects to see more municipalities and other public organizations create the position of chief resilience officer, just as Miami-Dade County, Norfolk, San Francisco, New Orleans and other cities have done with the support of the Rockefeller Foundation as part of its 100 Resilient Cities project.
This new office will coordinate work across departments and jurisdictions as well as with outside organizations to plan for and respond to weather-related events as well as other threats, such as fire, tornadoes or civil unrest.
Another trend, he said, is that catastrophe modeling tools previously used solely for insurance risk assessment will become more commonplace for big cities, letting them conduct a cost-benefit analysis for various alternative actions to reduce that risk.
Finally, understanding weather-related exposures is still going to be critical for public sector entities, said Joe Caufield, chief underwriting officer for OneBeacon’s government risk operation.
“Some information is presented in a highly dramatic fashion,” he said. “The feedback we hear is that risk managers and city managers are overwhelmed by [threats] and don’t see them as actionable … but we don’t encounter too many climate change deniers.”
Instead, public sector organizations are studying their exposure for critical assets such as wastewater management facilities and 911 communications centers. They’re making plans for protection and upgrading, especially if the facilities are right on the edge of FEMA flood map outlines that may not have been updated in 40 years.
One thing to remember about climate change, Caufield noted, is that while it may have the greatest impact on coastal communities, they are not the only areas that will have to contend with climate threats.
The interior of the U.S. will face changing weather patterns as well, especially greater temperature swings between severe cold and increased heat. That adds up to a more severe spring tornado season.
Health, Higher Ed Most Vulnerable to Cyber Attacks
As cyber risk management comes of age, more data and better analysis are leading to new realizations. One is that health care and higher education are the most vulnerable sectors, followed closely by financial services.
Another is that the vast majority of security breaches could be forestalled using simple measures, such as ensuring all updates and patches to software are installed and tested.
However, studies are starting to show that cheap, low-tech email attacks remain stubbornly effective despite expensive, high-tech protections.
All of those ideas were advanced and detailed at a fast-moving panel discussion May 11 in New York, sponsored by brokerage Crystal & Company.
Actuarial data is still thin in cyber, but Christopher Liu, head of cyber risk in the financial institutions group at AIG, said that “institutions in health care and higher education are the most hazardous classes of insureds. That is because they have the most sensitive information and high turnover. Also, they usually do not have big budgets, so security is often not well supported.”
Financial institutions, especially asset managers, are the second-most hazardous class, Liu added.
“They have the same attractive information, plus they have money.”
Mitigating that, they also tend to have better-funded and better-supported security, and they face heavy government regulation. That both keeps them on their toes and means greater external surveillance. Several panel members noted that firms became aware of breaches when regulators noticed unusual activity.
“We find that we deal primarily with three areas,” said Austin Berglas, senior managing director at K2 Intelligence.
“Those are: unpatched vulnerabilities in software, misconfiguration of internal systems, and misplaced trust by employees. We get called in to handle a breach, and 99 percent of the time we find the vulnerability is unpatched.”
Berglas explained that software companies race each other to send out new versions that often are not completely functional or secure, so they send out patches. “Microsoft does it every month on ‘Patch Tuesday.’ But users don’t have any regular schedule or system for installing and testing patches. We find unpatched vulnerabilities dating back as far as 1999.”
The challenge of unsecured configurations between systems was dramatically demonstrated with the infamous attack on retailer Target, which came through the air-conditioning vendor. But Berglas emphasized the persistent and pernicious problem of simple phishing.
“It is estimated that 30 percent of individuals within a company will open a phishing email, and 13 percent will click on its attachment, even if they have been warned not to,” Berglas said.
“You spent half a billion dollars on security systems and firewalls, and one click on one phishing email by someone with elevated system privileges, and the bad guys have just defeated your half-billion-dollar defense. Now they are inside, with credentials, and you can’t detect them.”
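A back-of-the-envelope calculation shows why one click is nearly inevitable at those rates. A minimal sketch in Python — the 13 percent per-person click rate is from the panel; the staff sizes are hypothetical examples:

```python
# Chance that at least one employee clicks a phishing attachment,
# using the 13 percent per-person click rate cited on the panel.
# The staff sizes below are hypothetical examples, not from the article.
click_rate = 0.13

for staff in (20, 200):
    # Probability that every single employee resists, subtracted from 1.
    p_at_least_one = 1 - (1 - click_rate) ** staff
    print(f"{staff} staff: P(at least one click) = {p_at_least_one:.3f}")
```

Even at 20 employees the odds of at least one click exceed 90 percent, which is Berglas’s point: perimeter spending alone cannot make phishing a non-event.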
The quickest and easiest thing that any company can do, “is to look for unpatched vulnerabilities in public-facing systems,” Berglas urged.
On the same theme, John F. Mullen, managing partner of the law firm Lewis Brisbois Bisgaard & Smith, stressed that “security goes way beyond IT.
“This is not just about the tech guys. Cyber security tends to get pushed downhill.” And that tends to mean lack of coordination on all fronts.
“I have been to meetings of the cyber response team, and everyone in the room is introducing themselves. This is the response team. Everyone in the room has to know everyone in the room.”
Similarly, “insureds have to know the coverage that they have bought. Is there a mandated forensics group? Outside counsel? If so, go meet with them. If you have options, vet them,” Mullen exhorted.
He expects the cyber insurance business to triple or quadruple in the next five years, in terms of premium spending.
Cycling back to the theme of internal responsibility, Paul Miskovich, senior vice president and global practice leader of cyber and technology errors and omissions coverage at Axis, said that 67 percent of cyber claims presented to his firm involved insider activity of some kind: clicking on a phishing email or failing to install a patch or use a firewall. Further, 25 percent of claims involved third parties such as vendors.
For all the focus on the breach itself, Miskovich added that “regulatory costs can be more than the costs of the breach, especially if you don’t have documentation of your security policies and protocols.” That includes documentation that the policies are in place and are rehearsed.
Noting previous comments that many losses are traced to breaches that have gone undetected for years, Miskovich said that a new area within cyber insurance is full coverage for prior acts.
Commercial Auto Warning: Emerging Frequency and Severity Trends Threaten Policyholders
The slow but steady climb out of the Great Recession means businesses can finally transition out of survival mode and set their sights on growth and expansion.
The construction, retail and energy sectors in particular are enjoying an influx of business — but getting back on their feet doesn’t come free of challenges.
Increasingly expensive commercial auto losses are hampering the upward trend. From 2012 to 2015, auto loss costs increased a cumulative 20 percent, according to the Insurance Services Office.
“Since the recession ended, commercial auto losses have challenged businesses trying to grow,” said David Blessing, SVP and Chief Underwriting Officer for National Insurance Casualty at Liberty Mutual Insurance. “As the economy improves and businesses expand, it means there are more vehicles on the road covering more miles. That is pushing up the frequency of auto accidents.”
For companies with transportation exposure, costly auto losses can hinder continued growth. Buyers who partner closely with their insurance brokers and carriers to understand these risks – and the consultative support and tools available to manage them – are better positioned to protect their employees, fleets, and businesses.
More Accidents, More Dollars
Rising claims costs typically stem from either increased frequency or severity — but in the case of commercial auto, it’s both. This presents risk managers with the unique challenge of blunting a double-edged sword.
Cumulative miles driven in February 2016 were up 5.6 percent compared with February 2015, Blessing said. Unfortunately, inexperienced drivers are at the helm for a good portion of those miles.
A severe shortage of experienced commercial drivers — nearing 50,000 by the end of 2015, according to the American Trucking Associations — means a limited pool to choose from. Drivers covering unfamiliar routes or lacking practice behind the wheel translate into more accidents. Yet companies facing intense competition for experienced drivers with good driving records may be tempted to let risk management best practices slip, such as proper driver screening and training.
Distracted driving, whether the result of using a phone, eating, or reading directions, is another factor contributing to the number of accidents on the road. Recent findings from the National Safety Council indicate that as many as 27 percent of crashes involve drivers talking or texting on cell phones.
In addition to increased frequency, a variety of other factors are driving up claim severity, resulting in higher payments for both bodily injury and property damage.
Treating those injured in a commercial auto accident is more expensive than ever as medical costs rise at a faster rate than the overall Consumer Price Index.
“Medical inflation continues to go up by about three percent, whereas the core CPI is closer to two percent,” Blessing said.
Changing physical medicine fee schedules in some states also drive up commercial auto claim costs. California, for example, increased the cost of physical medicine by 38 percent over the past two years and will increase it by a total of 64 percent by the end of 2017.
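Those two California figures compound rather than add, which a quick calculation makes explicit (both percentages are from the article; the calculation assumes both are measured from the same baseline):

```python
# California physical medicine fee schedule: up 38 percent over the past
# two years, and up a cumulative 64 percent by the end of 2017.
# The remaining increase compounds on top of the 38 percent already in effect.
so_far = 1.38      # cumulative cost factor to date
by_2017 = 1.64     # cumulative cost factor by end of 2017

remaining = by_2017 / so_far - 1
print(f"Further increase still to come: {remaining:.1%}")
```

In other words, roughly another 19 percent increase is still to come on top of the 38 percent already absorbed.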
And then there is the cost of repairing and replacing damaged vehicles.
“There are a lot of new vehicles on the road, and those cost more to repair and replace,” Blessing said. “In the last few years, heavy truck sales have increased at double-digit rates — 15 percent in 2014, followed by an additional 11 percent in 2015.”
The impact is seen in the industry-wide combined ratio for commercial auto coverage, which, per Conning, increased from 103 in 2014 to 105 in 2015 and is forecast to grow to nearly 110 by 2018.
None of these trends shows signs of slowing or reversing, especially as the advent of driverless technology introduces its own risks and makes new vehicles all the more valuable. Now is the time to rein in auto exposure, before the cost of claims balloons even further.
Data Opens Window to Driver Behavior
To better manage the total cost of commercial auto insurance, Blessing believes risk management should focus on the driver, not just the vehicle. In this journey, fleet telematics data plays a key role, unlocking insight on the driver behavior that contributes to accidents.
“Roughly half of large fleets have telematics built into their trucks,” Blessing said. “Traditionally, they are used to improve business performance by managing maintenance and routing to better control fuel costs. But we see opportunity there to improve driver performance, and so do risk managers.”
Liberty Mutual’s Managing Vital Driver Performance tool helps clients parse through data provided by telematics vendors and apply it toward cultivating safer driving habits.
“Risk managers can get overwhelmed with all of the data coming out of telematics. They may not know how to set the right parameters, or they get too many alerts from the provider,” Blessing said.
“We can help take that data and turn it into a concrete plan of action the customer can use to build a better risk management program by monitoring driver behavior, identifying the root causes of poor driving performance and developing training and other approaches to improve performance.”
Rather than focusing on the vehicle, the Managing Vital Driver Performance tool focuses on the driver, looking for indicators of aggressive driving that may lead to accidents, such as speeding, sharp turns and hard or sudden braking.
The tool helps a risk manager see if drivers consistently exhibit any of these behaviors, and take actions to improve driving performance before an accident happens. Liberty’s risk control consultants can also interview drivers to drill deeper into the data and find out what causes those behaviors in the first place.
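The article does not describe the tool’s internals; purely as an illustration, screening telematics records for the aggressive-driving indicators named above might look like the following sketch, in which the record format, field names, and thresholds are all assumptions rather than anything from Liberty Mutual:

```python
from dataclasses import dataclass

# Hypothetical telematics record; real vendor feeds differ.
@dataclass
class TelematicsEvent:
    driver_id: str
    speed_mph: float    # vehicle speed at the sample
    limit_mph: float    # posted limit for the road segment
    lateral_g: float    # sideways acceleration (proxy for sharp turns)
    brake_g: float      # deceleration (proxy for hard or sudden braking)

# Assumed thresholds for the three indicators the article names:
# speeding, sharp turns, and hard or sudden braking.
def flag_aggressive(event: TelematicsEvent) -> list[str]:
    flags = []
    if event.speed_mph > event.limit_mph + 10:
        flags.append("speeding")
    if event.lateral_g > 0.4:
        flags.append("sharp_turn")
    if event.brake_g > 0.45:
        flags.append("hard_braking")
    return flags

# Count flagged events per driver so consistent patterns stand out
# for follow-up, rather than alerting on every single sample.
def driver_pattern(events: list[TelematicsEvent]) -> dict[str, int]:
    counts: dict[str, int] = {}
    for e in events:
        if flag_aggressive(e):
            counts[e.driver_id] = counts.get(e.driver_id, 0) + 1
    return counts

fleet = [
    TelematicsEvent("driver_17", 72.0, 55.0, 0.1, 0.1),
    TelematicsEvent("driver_22", 48.0, 55.0, 0.1, 0.1),
]
print(driver_pattern(fleet))  # flags the speeding sample for driver_17
```

A risk manager would review drivers whose counts stay high week after week, then follow up with the kind of interviews described above to find the root cause.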
Sometimes patterns of unsafe driving reveal issues at the management level.
“Our behavior-based program is also for supervisors and managers, not just drivers,” Blessing said. “This is where we help them set the tone and expectations with their drivers.”
For example, if data analysis and interviews reveal that fatigue factors into poor driving performance, management can identify ways to address that fatigue, including changing assigned work levels and requirements. Are drivers expected to make too many deliveries in a single shift, or are they required to interact with dispatch while driving?
“Management support of safety is so important, and work levels and expectations should be realistic,” Blessing said.
A Consultative Approach
In addition to its Managing Vital Driver Performance tool, Liberty’s team of risk control consultants helps commercial auto policyholders establish screening criteria for new drivers, creating a “driver scorecard” that reflects a potential new hire’s driving record, Motor Vehicle Reports, years of experience, and familiarity with the type of vehicle the company uses.
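The scorecard’s actual contents are not specified in the article; as a hypothetical illustration only, a simple scorecard weighting the criteria listed above might look like this (the weights and the 0-100 scale are invented for the example):

```python
# Hypothetical driver scorecard combining the screening criteria the
# article names: MVR violations, years of experience, and familiarity
# with the vehicle type. Weights and the 0-100 scale are invented.
def driver_scorecard(mvr_violations: int,
                     years_experience: float,
                     familiar_with_vehicle: bool) -> float:
    score = 80.0
    score -= 15 * mvr_violations             # deduct per MVR violation
    score += 2 * min(years_experience, 10)   # credit experience, capped at 10 years
    if not familiar_with_vehicle:
        score -= 20                          # unfamiliar vehicle type
    return max(0.0, min(100.0, score))       # clamp to the 0-100 scale

print(driver_scorecard(0, 8, True))   # clean record, experienced, familiar
print(driver_scorecard(2, 1, False))  # two violations, novice, unfamiliar
```

A fleet might set a minimum score for hire, with borderline candidates routed to additional training rather than rejected outright.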
“Our whole approach is consultative,” Blessing said. “We probe and listen and try to understand a client’s strengths and challenges, and then make recommendations to help them establish the best practices they need.”
“With our approach and tools, we do something no one else in the industry does, which is perform the root cause analysis to help prevent accidents, better protecting a commercial auto policyholder’s employees and bottom line.”
This article was produced by the R&I Brand Studio, a unit of the advertising department of Risk & Insurance, in collaboration with Liberty Mutual Insurance. The editorial staff of Risk & Insurance had no role in its preparation.