It is a warm, humid spring day in Dallas/Fort Worth when strong thunderstorms begin to develop along an upper-level weather system carrying strong winds and convective energy off the Rocky Mountains.
By mid-afternoon, the atmosphere reaches a tipping point. A massive supercell thunderstorm along the weather front produces large, damaging hail and what is later designated as an EF5 tornado, with winds in excess of 200 mph.
The most recent tornado of this size as designated by the National Weather Service was on May 20, 2013, when an EF5 struck Moore, Okla., killing 24 people, flattening neighborhoods and schools, and injuring more than 350 people.
This Texas tornado is much, much worse.
Moving in the usual southwest to northeast direction, it creates a damage path about 1 mile wide and nearly 200 miles long, and directly strikes the Comanche Peak Nuclear Power Plant in Glen Rose, Texas, about 40 miles west of Fort Worth and 60 miles west of Dallas.
The power plant’s reactors are built to withstand winds up to 300 mph, but they can’t withstand what happens after the tornado throws around multiple gas-filled tanker trucks, which explode and kill numerous workers.
Debris fills the air as the powerful winds destroy much of the plant’s emergency equipment, making it impossible to maintain proper conditions and temperature within the reactor. The remaining power plant workers feverishly try to manually shut down the nuclear reactor before it melts down. They can’t.
When the reactor’s heat exceeds the plant’s capacity to cool it, radioactive gases begin to snake their way up the vent stacks. The ferocious winds carry the radioactive contamination directly toward Dallas/Fort Worth.
Communication fails as area power lines go down, so it is difficult to warn the 7 million residents of the Metroplex, as Dallas/Fort Worth is known. Residents know the tornado has been sighted and try to prepare, but they don’t know that deadly airborne toxins are being carried toward them.
About 10,000 homes and 700 commercial structures in the direct path of the tornado are completely destroyed and another 35,000 suffer damage, according to a model built by RMS. Roofs are ripped off apartment houses and multi-family dwellings. Vehicles are tossed around like toys, and with the storm striking at rush hour, workers on the roads are exposed to flying debris and high winds.
Even with residents sheltering in basements and safe rooms, fatalities reach into the 500-700 range, making this one of the deadliest tornadoes in U.S. history, second only to the Tri-State tornado of 1925, which killed 695 people in Missouri, Illinois and Indiana.
But it is the unseen radioactive contamination that ultimately makes the deadliest mark on the area.
Immediate fatalities from radiation poisoning number about two dozen, but as the contaminated rainfall seeps into the ground soil and water supply, the long-term health of the residents — and their descendants — is jeopardized. So, too, are the cattle and other agricultural products of Texas, which leads the nation in the number of ranches and farms it holds.
The radioactivity causes large swaths of area to be cordoned off, making it difficult to repair transmission and power lines as well as homes and businesses.
“The hard truth is that many businesses will close and many people will move from the area,” said Todd Macumber, president of international risk services, Hub International.
Chernobyl and Fukushima are the only events of a similar nature, even though the United States has seen its own recent near misses.
In 2011, a tornado knocked out power to the Browns Ferry Nuclear Power Plant near Huntsville, Ala., requiring the shutdown of its three reactors. The plant ran on backup diesel generators until power was restored. The storm also disabled the plant’s sirens, which are needed to warn nearby residents in a crisis.
That same year, a tornado barely missed 2.5 million pounds of radioactive waste at the Surry Power Station in southeastern Virginia, though it touched down in the plant’s electrical switchyard and cut power to the cooling pumps. Operators had to run backup diesel generators to cool the two reactors until power was restored.
Twenty-eight years after the 1986 radioactive disaster at Chernobyl, parts of Ukraine remain a toxic wasteland. And in Japan, an initial evacuation area of about 2 miles surrounding the Fukushima Daiichi Nuclear Power Plant was soon widened to about 12.5 miles.
Now, three years after three of Fukushima’s six reactors melted down, the area is still unlivable, and 40 miles away, diagnoses of thyroid cancer in children, which can be caused by radiation exposure, are skyrocketing, according to some reports.
Nearly 16,000 people died in the 2011 earthquake and tsunami that struck Japan, causing the meltdown. About 160,000 people were evacuated, 130,000 buildings were destroyed and $210 billion in damage was sustained.
The Texas scenario has a lot of variables, said Matthew Nielsen, director of Americas product management at RMS, who created the model for our Comanche Peak Nuclear Power Plant black swan scenario.
A tornado, with thunderstorms and hail, causing massive structural damage is about a 1-in-200-year event, he said. Such an event would result in at least $20 billion in insured losses and uninsured losses of about the same amount.
But a tornado following the exact path of this scenario, striking the power plant and heading into the Dallas/Fort Worth Metroplex, is far rarer: about a 1-in-10,000-year event.
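Return periods like those translate directly into annual probabilities, which puts the two RMS estimates in perspective. A minimal sketch of the standard conversion (the function name is illustrative; the figures are the article's own):

```python
def prob_within(return_period_years, horizon_years):
    """Probability of at least one occurrence within a time horizon,
    treating a 1-in-N-year event as an annual probability of 1/N."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# The 1-in-200-year damaging tornado vs. the 1-in-10,000-year
# direct power plant strike, each over a 50-year span:
tornado_50yr = prob_within(200, 50)        # roughly 22%
plant_strike_50yr = prob_within(10_000, 50)  # roughly 0.5%
```

Even a 1-in-10,000-year event, in other words, carries a small but nonzero chance over a plant's operating lifetime, which is why Nielsen calls it "something that could happen."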
“Given the fact that tornadoes are very rare, it isn’t something that I think people should be screaming and running around frantically about,” Nielsen said. “But it’s certainly something that could happen.”
As for losses due to the radiation? “There’s not a lot of historical data points that we can confidently say that that portion would be x or y billion,” he said.
Any rebuilding will be delayed by the threat posed by radioactive contamination, which may spread over a large area via the thunderstorms and storm water runoff.
From an insurance perspective, all personal and commercial lines of insurance have a nuclear energy hazard exclusion. American Nuclear Insurers (ANI) provides third-party liability insurance for all power reactors in the United States.
“We are responsible for the insurance coverage protecting the operators from claims alleging bodily injury or property damage offsite from [radioactive] materials,” said Michael Cass, vice president and general counsel at ANI, a joint underwriting association with 20 insurance company members.
The ANI was created under the Price-Anderson Act of 1957 and provides a primary policy limit of $375 million for claims due to offsite consequences from the release of radioactive materials from the 100 operating nuclear power plants in the United States. It also covers some plants that are shut down or in the process of being decommissioned, he said.
The ANI also covers costs related to emergency response and evacuation, including food, clothing and shelter, he said.
The joint underwriting association also administers an additional excess layer of about $13.2 billion, the costs of which would be borne by the power plant operators, and would be apportioned equally among them.
For any claims above $13.6 billion (which includes both the primary and excess layers), the Price-Anderson Act requires the U.S. Congress to “take steps to come up with a scheme to provide full compensation to the public and to continue claims payments,” Cass said.
“They could assess or tax the energy industry in some fashion or form. It doesn’t say that specifically, but that is what is alluded to.”
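The layered structure Cass describes works out to simple arithmetic: a claim is absorbed first by the $375 million primary policy, then by the roughly $13.2 billion operator-funded excess layer, with anything beyond $13.6 billion falling to Congress. A minimal sketch (the function and the layering logic are illustrative, not ANI's actual claims process):

```python
def allocate_claim(total_claim):
    """Split a dollar claim across the Price-Anderson layers described
    in the article: a $375M primary policy, a ~$13.2B operator-funded
    excess layer, and a congressional remainder above $13.6B."""
    PRIMARY_LIMIT = 375e6
    EXCESS_LIMIT = 13.2e9

    primary = min(total_claim, PRIMARY_LIMIT)
    excess = min(max(total_claim - PRIMARY_LIMIT, 0.0), EXCESS_LIMIT)
    congressional = max(total_claim - PRIMARY_LIMIT - EXCESS_LIMIT, 0.0)
    return primary, excess, congressional

# A hypothetical $20 billion loss, in line with the RMS damage estimate:
primary, excess, congressional = allocate_claim(20e9)
```

On those numbers, both insurance layers would be exhausted and several billion dollars would be left to whatever compensation scheme Congress devised.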
None of the insurance companies that are ANI members would be adversely affected if such a black swan event were to occur, he said.
“There would be a loss reserve recorded on their balance sheets, per participation in our pool, but we do have funds set aside for these catastrophic events where we wouldn’t be requiring any additional funds,” Cass said.
Damage to the power plant itself would be covered by Nuclear Electric Insurance Ltd., which insures electric utilities and energy companies in the United States. Current limits are $1.5 billion per site on the primary program, and up to $1.5 billion per site in its excess program.
Allan Koenig, vice president, corporate communications at Energy Future Holdings, which operates Comanche Peak, said the plant is robustly protected. It has two independent systems that can provide off-site power, as well as backup diesel generators, allowing the units to be safely shut down in the event of a natural catastrophe.
He also noted the plant has safety shields for fuel storage casks, a 45-inch-thick steel-reinforced concrete containment building wall, and fire protection redundancies.
As for the affected businesses and homeowners, they may be left in a swirling vortex of coverage confusion. The situation would have the flavor of what happened after Superstorm Sandy, when coverage often depended on whether damage was caused by flooding from storm surge or by wind.
The question for Texas insureds would be whether the damage was caused by the tornado or by the radioactivity.
“It’s an incredibly complex question and a complex issue that is really only solvable and resolvable if and when the incident occurs,” said John Butler, vice president of the environmental practice at Hub International.
“What it boils down to is the chicken and the egg scenario,” he said. “What came first? Either event has the ability on its own to create a total loss.”
Resilience and redundancy should be the key takeaways from this, said Peter Boynton, founding co-director of the Kostas Research Institute for Homeland Security at Northeastern University in suburban Boston.
Instead of viewing catastrophic events from an emergency management perspective, where the discussion revolves around what was — or was not — managed well, it’s better to look at the way design can lead to “continuity of function,” he said.
When Boynton was head of emergency management for the state of Connecticut, he managed the statewide response in 2011 to Hurricane Irene, which knocked out 70 percent of the state’s electric grid, leaving residents unable to access many gas stations, ATMs and grocery stores.
If the state had designed a “resiliency approach” prior to the event, it could have built in a pre-determined amount of redundancy into the system so that, say, an additional 20 percent or 30 percent of the grid remained viable.
“If we can retain a percentage of the critical function of whatever system we are talking about, the difference between 0 percent and 30 percent when the bad thing happens is huge,” Boynton said.
In the Texas scenario, if the crisis planning included a redundancy for warning nearby residents even when the power and communication lines failed — such as using satellites to maintain a minimal level of continuity — the amount of death and destruction could be lessened.
“Otherwise, we really are setting ourselves up for an impossible discussion,” he said. “You can’t just pick up these pieces at the moment of crisis. You have to understand how system design can play a role.”
Analyzing such a black swan scenario is a useful exercise, said Justin VanOpdorp, manager, quantitative analysis, at Lockton.
“Can this actually happen? Yes. Will it? Maybe not,” he said. “I think what it does is, it helps to think through it just to be prepared for those situations when they do arise.”
Additional 2014 black swan stories:
When the 8.5 magnitude earthquake hits, sea water will devastate much of Los Angeles and San Francisco, and a million destroyed homes will create a failed mortgage and public sector revenue tsunami.
A double dose of ice storms batter the Eastern seaboard, plunging 50 million people and three million businesses into a polar vortex of darkness and desperation.
Six Best Practices For Effective WC Management
It’s no secret that the professionals responsible for managing workers compensation programs need to be constantly vigilant.
Rising health care costs, complex state regulation, opioid-based prescription drug use and other scary trends tend to keep workers comp managers awake at night.
“Risk managers can never be comfortable because it’s the nature of the beast,” said Debbie Michel, president of Helmsman Management Services LLC, a third-party claims administrator (and a subsidiary of Liberty Mutual Insurance). “To manage comp requires a laser-like, constant focus on following best practices across the continuum.”
Michel pointed to two notable industry trends — rises in loss severity and overall medical spending — that will combine to drive comp costs higher. For example, loss severity is predicted to increase in 2014-2015, mainly due to those rising medical costs.
The nation’s annual medical spending, for its part, is expected to grow 6.1 percent in 2014 and 6.2 percent on average from 2015 through 2022, according to the federal Centers for Medicare and Medicaid Services. This increase is expected to be driven partly by rising demand for medical services among the nation’s aging population, many of whom are baby boomers who have remained in the workplace longer.
Other emerging trends also can have a potential negative impact on comp costs. For example, the recent classification of obesity as a disease (and the corresponding rise of obesity in the U.S.) may increase both workers comp claim frequency and severity.
“These are just some factors affecting the workers compensation loss dollar,” she added. “Risk managers, working with their TPAs and carriers, must focus on constant improvement. The good news is there are proven best practices to make it happen.”
Michel outlined some of those best practices risk managers can take to ensure they get the most value from their workers comp spending and help their employees receive the best possible medical outcomes:
1. Workplace Partnering
Risk managers should look to partner with workplace wellness/health programs. While typically managed by different departments, there is an obvious need for risk management and health and wellness programs to be aligned in understanding workforce demographics, health patterns and other claim red flags. These are the factors that often drive claims or impede recovery.
“A workforce might have a higher percentage of smokers or diabetics than the norm, something you can learn from health and wellness programs. Comp managers can collaborate with health and wellness programs to help mitigate the potential impact,” Michel said, adding that there needs to be a direct line between the workers compensation goals and overall employee health and wellness goals.
2. Financing Alternatives
Risk managers must constantly re-evaluate how they finance workers compensation insurance programs. For example, there could be an opportunity to reduce costs by moving to higher retention or deductible levels, or creating a captive. Taking on a larger, more direct financial stake in a workers comp program can drive positive changes in safety and related areas.
“We saw this trend grow in 2012-2013 during comp rate increases,” Michel said. “When you have something to lose, you naturally are more focused on safety and other pre-loss issues.”
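The retention trade-off Michel describes can be sketched in a few lines: under a per-claim deductible, the employer retains each loss up to the deductible and the carrier pays the remainder, so a higher deductible means lower premium but more retained loss — and more incentive to prevent claims. A simplified illustration (real programs layer on aggregates, corridors and other features):

```python
def split_loss(claim, deductible):
    """Employer retains each claim up to the per-claim deductible;
    the carrier pays the portion above it. Illustrative only."""
    retained = min(claim, deductible)
    insured = claim - retained
    return retained, insured

# Three hypothetical claims under a $100,000 per-claim deductible:
claims = [5_000, 40_000, 250_000]
splits = [split_loss(c, 100_000) for c in claims]
```

Here only the largest claim reaches the carrier at all, which is why a high-deductible program concentrates the employer's attention on safety and pre-loss issues.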
3. TPA Training, Tenure and Resources
Businesses need to look for a tailored relationship with their TPA or carrier, where they work together to identify and build positive, strategic workers compensation programs. Also, they must exercise due diligence when choosing a TPA by taking a hard look at its training, experience and tools, which ultimately drive program performance.
For instance, Michel said, does the TPA hold regular monthly or quarterly meetings with clients and brokers to gauge progress or address issues? Or, does the TPA help create specific initiatives in a quest to take the workers compensation program to a higher level?
4. Analytics to Drive Positive Outcomes, Lower Loss Costs
Michel explained that best practices for an effective comp claims management process involve taking advantage of today’s powerful analytics tools, especially sophisticated predictive modeling. When woven into an overall claims management strategy, analytics can pinpoint where to focus resources on a high-cost claim, or they can capture the best data to be used for future safety and accident prevention efforts.
“Big data and advanced analytics drive a better understanding of the claims process to bring down the total cost of risk,” Michel added.
5. Provider Network Reach, Collaboration
Risk managers must pay close attention to provider networks and specifically work with outcome-based networks – in those states that allow employers to direct the care of injured workers. Such providers understand workers compensation and how to achieve optimal outcomes.
Risk managers should also understand if and how the TPA interacts with treating physicians. For example, Helmsman offers a peer-to-peer process with its 10 regional medical directors (one in each claims office). While the medical directors work closely with claims case professionals, they also interact directly, “peer-to-peer,” with treatment providers to create effective care paths or considerations.
“We have seen a lot of value here for our clients,” Michel said. “It’s a true differentiator.”
6. Strategic Outlook
Most of all, Michel said, it’s important for risk managers, brokers and TPAs to think strategically – from pre-loss and prevention to a claims process that delivers the best possible outcome for injured workers.
Helmsman, which provides claims management, managed care and risk control solutions for businesses with 50 employees or more, offers clients what it calls the Account Management Stewardship Program. The program coordinates the “right” resources within an organization and brings together all critical players – risk manager, safety and claims professionals, broker, account manager, etc. The program also frequently utilizes subject matter experts (pharma, networks, nurses, etc.) to help increase knowledge levels for risk and safety managers.
“The true goal here is to think about injured employees,” Michel said. “Everyone needs to focus on helping them get well, back to work and functioning at their best.
“At the same time, following a best practices approach can reduce overall comp costs, and help risk managers get a much better night’s sleep,” she said.
To learn more about how a third-party administrator like Helmsman Management Services LLC (a subsidiary of Liberty Mutual) can help manage your workers compensation costs, contact your broker.
This article was produced by the R&I Brand Studio, a unit of the advertising department of Risk & Insurance, in collaboration with Helmsman Management Services. The editorial staff of Risk & Insurance had no role in its preparation.