The Importance of Being Human
Signs are everywhere touting bigger and better risk data, new and improved analytics, and “actionable insights.” The pace of change is exciting and many new tools show promise. There’s so much upside!
Being curious, I wondered whether there’s any downside — any risks we are not yet seeing — and asked wise risk manager friends for their take. As our industry reflects on future needs and opportunities — where we need to be in five years — I was reminded of important truths we’d be wise to keep in mind.
One immovable truth is that it’s people who make a difference. Bringing their talent, experience and insight to bear, the best risk managers will extract value from, and add value to, new tools. As an industry based on serving and helping protect people and organizations from risks, we rely on our people as the bedrock, not analytics.
While many emerging tools provide interesting information, an “actionable insight” is in the eye of the beholder — the risk manager. “Old-fashioned” experience and insight still matter, and our teams need to leverage all of their talents. One risk to consider: as new tools drive efficiency, the industry could lose talent or see reduced engagement.
Now, I’m one of the engaged, but I learn from listening to the cynics too! We all realize that statistics can be misleading — yielding different answers from the same information depending on the positions being promoted.
Erroneous conclusions due to inadequate data are a related problem. The statistics can be built to support a position rather than the other way around.
As an example we know well, advancements in property modeling produced different results on static properties where, in fact, the risks hadn’t changed. So one risk is that new tools polarize parties rather than build win-win frameworks.
Where improved transparency is the upside of the information age, a friend mentioned a potential downside: commoditization. Sameness due to an over-reliance on data may reduce the effectiveness and even the relevancy of insurance.
Strong skill in customizing policy language to respond to unique concerns is another “old school” but accretive capability. Indeed, comparability and customization of information for the customer are perennial concerns.
Traditional benchmarking that once seemed helpful just doesn’t work that well. Risks are not homogeneous. Magnitude matters. Claim profiles, financial objectives, coverage and structures vary.
Risks aren’t static. We need a compass directed toward emerging trends. So there’s a risk in drawing conclusions from inadequate or irrelevant data. But good progress in predictive modeling, leading indicators and multi-variable benchmarking is moving us in the right direction.
With exceptional insight and experience, veteran risk managers are adept at understanding analytics advancements and what is relevant to decision making.
In all the talk about actionable insights, we don’t hear a lot about wisdom. That means having the experience and insight to make the best decisions using incomplete information — most of the time.
Our industry needs to leverage our collective talents to build the next-generation tools, while remembering that those tools don’t replace personal interaction. The successful marriage of personal insight and experience with expanded analytics power will characterize our industry’s top contributors now and in the future.
We Don’t Need More Data, We Need Smart Analytics
Prior to the late 20th century, very little data was captured. And large data sets, even if they existed, required an incredible amount of manual tabulating.
The problem was solved by creating IT systems to capture and store huge amounts of data. Today, a $50 heart rate monitor or fitness watch captures 250 data points per second — 21.6 million a day.
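The 21.6 million figure is just the sampling rate carried out over a day, which a one-line calculation confirms:

```python
# Sanity check on the article's figure: 250 samples/second, sustained all day.
SAMPLES_PER_SECOND = 250
SECONDS_PER_DAY = 60 * 60 * 24  # 86,400 seconds in a day

samples_per_day = SAMPLES_PER_SECOND * SECONDS_PER_DAY
print(samples_per_day)  # 21600000 -- the 21.6 million quoted above
```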
So now with all of this data we can answer all of life’s questions, right?
Not even close.
Having petabytes of data is useless by itself, no matter how organized it is.
Simply having data isn’t necessarily helpful. If it is housed on five different servers in three different formats, all jumbled up, it is worse than useless. Having a million times more useless data simply means that making sense of it is a million times harder and more expensive.
The typical solution to this challenge comes in two parts.
First, by saying “big data” over and over and blindly storing more and more data, many hope that God will magically fix the problem and send them to analytics heaven. (Cue the sound of angelic choir singing.)
Second, to hedge their bets, many have been pouring money into organizing the data via “data dictionaries” and better-organized databases. This is a tragically wrong-headed approach.
The Library of Congress has an unfathomable amount of information on its shelves, all of which is well organized and catalogued. But the Dewey Decimal System is not analysis.
In order to make sense of data, you must have a coherent approach to analyzing it. Plunking more and more data into Hadoop does not answer any questions other than “how can we spend ourselves into bankruptcy with Big Data projects?”
So, what about those magical algorithms?
By themselves, algorithms are not the answer. They are simply computer code that executes logic. But first you need to know what you want the algorithm to look for.
There is no magic software bullet that will go through data and answer questions. No program, no matter how expensive, is going to provide a shortcut to make sense out of data.
As for Excel spreadsheets, they are equivalent to a dinosaur trying to drive a car.
To get out of this big data dead end, we need a different approach.
The real goal is to answer key business questions. To make sense of data, we need the analytical skills that can spot patterns and provide insight into the future. These are, to name but a few: Bayesian statistics, system dynamics models, network theory, complexity theory, econometrics and advanced statistics.
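To make the first of those concrete, here is a minimal sketch of Bayesian reasoning applied to an insurance-flavored question — estimating a claim frequency as evidence arrives. The Beta-Binomial model is standard; the prior and the observed counts are hypothetical numbers chosen purely for illustration:

```python
def posterior_claim_rate(prior_a, prior_b, claims, policies):
    """Update a Beta(prior_a, prior_b) belief about claim frequency after
    observing `claims` losses across `policies` exposures.
    Returns the posterior mean claim rate (conjugate Beta-Binomial update)."""
    a = prior_a + claims
    b = prior_b + (policies - claims)
    return a / (a + b)

# Start nearly uninformed (Beta(1, 1)), then observe 12 claims on 200 policies.
rate = posterior_claim_rate(1, 1, 12, 200)
print(round(rate, 4))  # 0.0644 -- the data, not the prior, now dominates
```

The point is the approach, not the arithmetic: the question ("what claim rate should we expect?") comes first, and the model turns raw counts into an answer — something no amount of storage or cataloguing does on its own.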
Abraham Maslow said, “When all you have is a hammer, every problem begins to look like a nail.”
Sadly, we have been trying to use a hammer to cut boards, drill holes and paint the walls. It is time we went to Home Depot and approached the task with the right tools.
Specialty Drugs Show No Signs of Slowing Down
A decade ago, high-cost specialty drugs were commonly referred to as “injectable drugs” and were used to treat conditions not typically covered in workers’ compensation, such as cancer, rheumatoid arthritis and multiple sclerosis.
“Today, however, new specialty drugs are emerging that will be used to treat other chronic and inflammatory conditions,” said Joe Boures, president and CEO of Healthcare Solutions, an Optum company providing specialized pharmacy benefit management services to the workers’ compensation market.
“Payers in the workers’ comp market are just beginning to feel the cost impact of greater utilization of these drugs, which come with expensive price tags.”
Specialty drugs are often manufactured using biologic rather than chemical methods, and they are no longer just administered by injections. New specialty drugs can also be inhaled or taken orally, likely contributing to the rise in their utilization.
“There isn’t a standard definition of specialty drugs, but they are generally defined as being complex to manufacture, costly, requiring special handling and distribution, and difficult for patients to take without ongoing clinical support — or they may require administration by a health care provider,” said Boures.
In 2014, more than a quarter of all new therapies that the FDA approved were through its biologics division. Biologics, and similar therapies, are representative of a future trend in prescription drug spend.
“As the fastest growing costs in health care today, specialty drugs have the potential to change the way prescription benefits are provided in the future,” said Jim Andrews, executive vice president of pharmacy for Healthcare Solutions.
Workers’ compensation payers may not recognize how specialty drugs are affecting their drug spend.
Specialty drugs like Enbrel®, Humira® and Synvisc® can be processed in conjunction with other medical procedures and, therefore, not recognized by payers as a pharmacy expense.
This leaves payers with little visibility into the costs of these medications within their book of business and a lack of tools to control these costs.
Due to the high costs of specialty medications, extra due diligence — up to and including utilization review — should be applied when claimants receive these medications, said Andrews.
“Healthcare Solutions recommends that claimants using specialty drugs be monitored for proper medication handling and appropriate administration, and that the claimant be monitored to determine whether the medication is having its desired results and whether there are any side effects,” he said.
“At $1,000 per pill for some of these specialty medications, making sure a claimant can tolerate the side effects becomes vital to making sure the claimant achieves the desired outcomes.”
Hepatitis C drugs have made their way to the workers’ compensation market, largely through coverage of healthcare workers, who have exposure to the disease.
“Traditional drug treatments that began in the 1990s had a success rate of 6% and costs ranging from $1,800 to over $88,000,” said Andrews.
“The new Hepatitis C specialty medications have a treatment success rate of 94-100%, but cost between $90,000 and $226,000.”
Although the new treatments include higher drug costs, the payer’s overall medical costs may actually decrease if the Hep C patient would have required a liver transplant as part of the course of treatment without the drugs.
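That trade-off can be sketched as a simple expected-cost comparison. The drug costs and cure rates below are the midpoints of the figures Andrews cites; the transplant cost is a hypothetical placeholder, and the model makes the simplifying assumption that every failed course of treatment ends in a transplant:

```python
TRANSPLANT_COST = 575_000  # hypothetical placeholder; actual costs vary widely

def expected_total_cost(drug_cost, cure_rate):
    """Drug cost plus the expected transplant cost on treatment failure
    (simplifying assumption: every failed course ends in a transplant)."""
    return drug_cost + (1 - cure_rate) * TRANSPLANT_COST

# Midpoints of the ranges quoted above.
old = expected_total_cost(drug_cost=44_900, cure_rate=0.06)   # $1,800-$88,000, 6%
new = expected_total_cost(drug_cost=158_000, cure_rate=0.97)  # $90,000-$226,000, 94-100%
print(round(old), round(new))  # 585400 175250
```

Even under these rough assumptions, the high-cost, high-cure drug produces a lower expected total — which is the payer’s argument for looking past the price tag to overall medical cost.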
While the release of new Hepatitis C medications in 2014 demonstrated the potential impact specialty medications can have on workers’ compensation payers, there are some specialty medications under development that target more common conditions in workers’ compensation.
Pfizer Inc. and Eli Lilly and Company are currently developing tanezumab, a new, non-narcotic medication to treat chronic pain, which is common in workers’ compensation claims.
Tanezumab has demonstrated pain-reduction benefits in clinical trials and may provide non-addictive pain relief to claimants, potentially changing how pain is managed in the future.
Healthcare Solutions has a specialty medication program that provides payers discounted rates and management oversight of claimants receiving specialty medications.
Through the paper bill process, Healthcare Solutions aids payers in identifying specialty drugs and works with adjusters and physicians to move claimants into the specialty network.
A central feature of the program is that claimants are assigned to a clinical pharmacist or a registered nurse with specialty pharmacy training for consistent care with one-on-one consultations and ongoing case management.
The program provides patients with education and counseling, guidance on symptoms related to their medical conditions and drug side effects, proactive intervention for medication non-adherence, and prospective refill reminder and follow-up calls.
“The goal is to improve patient outcomes and reduce total costs of care,” said Boures.