These days, the term "data standards" has become more mantra than reality in the insurance industry.
Despite the efforts of organizations like the Association for Cooperative Operations Research and Development (ACORD), which has been toiling in the insurance industry's data standards vineyards for 35 years, getting all the key players on the same page when it comes to exchanging the data that flows within the industry has been about as easy as finding cheap coverage in a rock-hard property and casualty market.
But maybe, just maybe, after all those years of hand-wringing over data standards, the truth is that achieving true data standards, while a worthwhile goal in some respects, isn't in and of itself deserving of the "Holy Grail" tag.
Some standards are expected and necessary, says John Lucker, a national practice principal at Deloitte & Touche. But there is a point beyond which data standards simply don't make sense.
"One major theme about data standards is, what's really in it for individual carriers to get those standards?" says Lucker, who provides property and casualty industry carriers with consulting services in the areas of data mining, predictive modeling, advanced analytics and large-scale data management.
"The biggest question is, 'If I'm carrier ABC, sure I want data standards. But if it's going to cost me millions, I also want to know: How will I make money with data standards? Are they going to help or hinder profitability?'"
Lucker explains that unlike the London marketplace, most U.S. companies operate as direct primary insurers, making them owners of the risk. So, data standards across the board are less critical here--especially for carriers that are already making the best use of data in areas such as predictive modeling. "We're moving away from some of the earlier data standards that ACORD focused on and moving more to agent/producer management standards," he says. "Today's data standards are more about how to make business acquisition easier for agents and producers."
To Lucker, the U.S. property and casualty industry is too entrenched to suddenly adopt clear, clean data standards across the board.
"In a legacy-driven industry like that in the United States, and even in Bermuda, data standards are much tougher to achieve," he says.
Lucker, formerly assistant vice president at Cigna Re, where he served as controller and chief technology officer, adds that perhaps the biggest downside to total data standards is that they will completely commoditize risk. He says that if the industry is not sharing in a pool of risk, as in the London market, and each carrier is the sole provider of its product, data standards will make everyone's way of doing business the same.
"In that case," he says, "it will all come down to price, much as has happened in personal auto and term life. I don't think anyone really wants that to happen."
Lucker says that when data standards create value for the marketplace--when they are matched up with all the relevant parties: buyer, agent and insurer--they will gain acceptance. "The industry is starting to get it right," he says.
Mark Miller, COO at Strategic Insurance Software, a Gahanna, Ohio, firm that provides automation services to agencies, says the flip side is that some believe adhering to data standards can save money--for example, when it comes time to integrate with vendors and other third parties.
"Historically, if you look at data standards from an investment and integration perspective, they never gained much traction. They were just another cost," he says. "But now, we're getting more into the 'real time' era of IT, and agents are demanding more standardization, so agent/producer-focused data standards are gaining momentum."
Miller says larger agencies are more in tune with data standards efforts, as they are more geared toward efficiency and toward using agency management systems and other technologies. Smaller agencies are more "workflow" oriented, and when new things come out, their adoption rate isn't as high.
"Smaller agencies are more into the 'I want to do my work the same way every time' frame of mind," he says.
He lauds ACORD for focusing its recent efforts on achieving data standards where the industry will benefit most--at the carrier-agency interaction level. Yet, he says, when it comes to data standards in the Midwest where SIS operates, there is much work ahead.
"If you take into account all the regional players, we've got quite a long way to go," he says. "On a scale of one to 10, if you take the good carriers, they are making tremendous strides and rate around a seven. But the industry as a whole is still at four or five. Historically, new technology was not a major investment for carriers, but that's changing."
Sandi Perillo, data management business consultant at The Hartford, serves as chair of ACORD's standards committee and is deeply involved in data quality as well. Perillo, a 17-year insurance veteran, has worked on ACORD data standards for the last six years and had a hand in ACORD XML Version 1, the initial standard for the distribution of insurance data over the Internet.
"There definitely have been challenges in the last couple of years, mainly the need to create an ACORD XML Version 2 standard," says Perillo. "For the most part, Version 2 is about taking advantage of and supporting Web services."
Perillo says many insurers have implemented ACORD XML Version 1, and vendors are following suit.
"All the parties understand the value of standards from a business and technology perspective," she says, adding that the value of standards is around the ability to reuse components and ease of integration, both internally for claims, quoting and policy issuance, and externally for communicating with trading partners and regulatory bodies.
Perillo notes, however, that many large insurers have legacy systems in place, and that has historically slowed down data standard implementation from a cost perspective.
"Most companies don't want to replace legacy systems just to follow data standards," she explains. "But you will find data mapping, or a rewrite of some of these systems on a phased basis."
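The data mapping Perillo describes can be sketched in a few lines: rather than replacing a legacy system outright, a thin translation layer renames and reshapes its records into a standards-style layout, and the legacy system keeps running underneath. The field names below are invented purely for illustration; they are not drawn from any actual ACORD schema or from The Hartford's systems.

```python
# Hypothetical sketch of a legacy-to-standard data mapping layer.
# All field names here are illustrative assumptions, not real ACORD fields.

LEGACY_TO_STANDARD = {
    "POLNUM": "PolicyNumber",
    "EFFDT": "EffectiveDate",
    "INSNAME": "InsuredName",
}

def map_legacy_record(legacy: dict) -> dict:
    """Rename a legacy record's fields to the standard layout,
    carrying unmapped fields along under a catch-all key."""
    standard, extras = {}, {}
    for field, value in legacy.items():
        if field in LEGACY_TO_STANDARD:
            standard[LEGACY_TO_STANDARD[field]] = value
        else:
            extras[field] = value
    if extras:
        standard["Extensions"] = extras
    return standard

record = {"POLNUM": "PC-1001", "EFFDT": "2005-05-01", "RISKCODE": "7A"}
print(map_legacy_record(record))
```

Because the mapping table is data rather than code, it can be extended field by field, which is what makes the phased rewrites Perillo mentions practical.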
Perillo strongly believes that standards support data quality and consistency, as well as transparency.
"For the most part, we all realize that data is a huge asset to our companies, and we know that quality affects so many things throughout the company, from product development to profitability and compliance," she says. "If we know data quality is good, we can do things like provide better information for decision-making and reduce claims costs."
She stresses that data standard development is by its nature a challenge, especially for a complex industry like insurance.
"We all have different needs," she says, "but entities like ACORD try to provide a good common ground for everyone."
At ACORD, the XML effort began with policy quotes and issuance, then endorsements, renewals/reissues, and now has moved into claims services.
One vendor trying to make data standards easier to achieve is SeaPass Solutions, a New York City firm that provides "real-time, straight-through processing" to insurers via a server-based platform called SeaPass Gateway.
According to Eric Gerwirtzman, president of SeaPass' U.S. operations, SeaPass, a member of ACORD and the Agents Council for Technology, works closely with the property and casualty industry on the development of standards and applications to support those standards.
Gerwirtzman says that the underlying insurance market business--what insurance is and how business is conducted--makes data standards very tough to achieve. If you look at the differences between carrier A and carrier B, he explains, each may have its own underwriting guidelines, for one, and often that's what separates profit-making carriers from those that lose money.
"That's what makes different data requirements," he says. "There may be fundamental differences between each carrier and how they assess risk."
He explains that ACORD standards provide 80 percent common language for insurers when they interact with agents and producers. It's the remaining 20 percent that is elusive, and a tool like SeaPass Gateway, a server that sits on the carrier's network, will "translate" that remaining 20 percent, closing the data standards gap.
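The gateway pattern Gerwirtzman describes can be sketched as a common message plus per-carrier translation rules: the shared 80 percent passes through untouched, and a carrier-specific rulebook rewrites only the fields where that carrier's codes differ. The carrier names, fields and code values below are hypothetical illustrations, not SeaPass's actual design.

```python
# Minimal sketch of a gateway-style translator: a common message plus
# per-carrier substitution rules for the carrier-specific remainder.
# Carriers, fields and codes are invented for illustration.

CARRIER_RULES = {
    "carrier_a": {"construction_class": {"frame": "F1", "masonry": "M2"}},
    "carrier_b": {"construction_class": {"frame": "WOOD", "masonry": "BRICK"}},
}

def translate(message: dict, carrier: str) -> dict:
    """Return a copy of the common message with carrier-specific
    code values substituted wherever a rule exists."""
    rules = CARRIER_RULES.get(carrier, {})
    out = dict(message)
    for field, codebook in rules.items():
        if field in out and out[field] in codebook:
            out[field] = codebook[out[field]]
    return out

quote = {"policy_type": "property", "construction_class": "frame"}
print(translate(quote, "carrier_a"))
```

Keeping the rules as data, one rulebook per carrier, is what lets a single gateway serve many carriers whose underwriting vocabularies differ.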
a Philadelphia-based writer, writes frequently about technology issues.
May 1, 2005
Copyright © 2005 LRP Publications