My answer is both. The two are complementary. Machines can only follow rules; they cannot make decisions, and they have no reasoning skills or judgment. Underwriters, however, need information to make good decisions. By using underwriting technology, insurers can maximize the time underwriters spend making decisions and minimize the time they spend hunting for the information they need.
Companies have made tremendous strides in recent years in using third-party data providers to deliver better tools for underwriters to evaluate risk. For example, insurers use mapping technology to enforce underwriting rules, or automatically verify a person's driving record. In the past, this was a time-consuming, error-prone process. Even though the work was clerical in nature, it usually fell to underwriters.
INFO UP, TIME DOWN
The Internet and other information technologies have also brought underwriters previously unavailable information that helps them make better decisions. Examples include storm-surge maps and earthquake (EQ) scores.
Internet and Web services also make it possible to independently verify the information an applicant provides, and they deliver it instantly on a very affordable per-click basis. Many of these services didn't exist even a decade ago.
Although companies have provided better tools to underwriters, most have not used these tools to their full advantage. The tools are used separately, which means redundant data entry, logins to multiple systems and other duplicated effort. That is terribly inefficient when a company may repeat the process thousands, if not tens of thousands, of times per month. Smart carriers will take this to the next level and integrate these third-party data streams directly into their main quoting and underwriting systems, bringing tremendous efficiencies and freeing underwriters to spend their time making decisions.
With the advent of Internet standards like XML and HTTP, it has become much easier to incorporate third-party data into quoting and underwriting systems. What used to be a major programming effort is now a week's project for a single programmer. Using these third-party services also reduces a system's maintenance burden: unlike with in-house systems, updates and upkeep are handled by the service providers rather than a perpetually resource-strapped in-house IT department.
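To make the integration concrete, here is a minimal sketch of consuming such a service. The XML payload and its element names are hypothetical, invented for illustration; a real provider would return its own schema over HTTP, but the parsing step looks much the same.

```python
import xml.etree.ElementTree as ET

# Hypothetical payload, shaped like what a third-party hazard-data
# service might return over HTTP. Element names are illustrative only.
SAMPLE_RESPONSE = """\
<riskReport address="12 Shore Rd, Gloucester, MA">
  <distanceToCoastMiles>0.8</distanceToCoastMiles>
  <femaFloodZone>AE</femaFloodZone>
  <stormSurgeCategory>2</stormSurgeCategory>
</riskReport>"""

def parse_risk_report(xml_text):
    """Extract the hazard fields a quoting system would consume."""
    root = ET.fromstring(xml_text)
    return {
        "address": root.get("address"),
        "distance_to_coast_mi": float(root.findtext("distanceToCoastMiles")),
        "flood_zone": root.findtext("femaFloodZone"),
        "surge_category": int(root.findtext("stormSurgeCategory")),
    }

report = parse_risk_report(SAMPLE_RESPONSE)
```

Once parsed, these fields can flow straight into rating and rules logic with no rekeying, which is exactly the duplication the integrated approach removes.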
Beyond freeing underwriters' time for more value-added work, integrating these underwriting tools has other benefits. For example, you can prefill validated information for agents, eliminating "honest mistakes." A good example is the errors agents make when calculating distance to coast. Not only can you ensure the calculation follows your guidelines, you can also save the agent some work.
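The distance-to-coast calculation itself can be sketched with the standard haversine (great-circle) formula. The coastline points below are made-up illustrative coordinates; a real service would use a full coastline dataset and interpolate along segments rather than snapping to the nearest sample point.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance on a sphere of Earth's mean radius (~3959 mi).
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 3959 * asin(sqrt(a))

def distance_to_coast(lat, lon, coastline_points):
    # Nearest-vertex approximation: take the minimum distance to any
    # sampled coastline point.
    return min(haversine_miles(lat, lon, clat, clon)
               for clat, clon in coastline_points)

# Illustrative coastline sample points (lat, lon), not real survey data.
COAST = [(42.60, -70.62), (42.52, -70.85), (41.95, -70.65)]
miles = distance_to_coast(42.55, -70.88, COAST)
```

Computing this server-side, against the carrier's own coastline data and rounding rules, is what guarantees the figure matches the underwriting guidelines instead of the agent's estimate.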
Integrating these tools into the quoting process can also open whole new opportunities for carriers that do it right. Every consumer who has shopped for insurance has had the frustrating experience of entering information for a quote, only to learn it would not be provided instantly because the company wanted to verify the information first. By integrating these third-party services, carriers could comfortably quote and bind in real time. If an application raised red flags, the system could automatically refer those issues to an underwriter for review and a decision.
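The quote-and-bind-with-referral flow described above amounts to a small rules check. A minimal sketch, with thresholds and field names invented purely for illustration:

```python
def underwriting_flags(app):
    """Return the reasons this application needs human review.
    An empty list means it can be quoted and bound straight through."""
    flags = []
    if app["distance_to_coast_mi"] < 1.0:
        flags.append("within 1 mile of coast")
    if app["flood_zone"] in {"A", "AE", "VE"}:
        flags.append("FEMA flood zone " + app["flood_zone"])
    if app["claims_last_5yr"] > 2:
        flags.append("more than 2 claims in 5 years")
    return flags

def route(app):
    # Clean applications bind in real time; flagged ones go to a person.
    return "refer to underwriter" if underwriting_flags(app) else "bind in real time"
```

The point of the design is that the machine handles the clean majority instantly, and the underwriter sees only the exceptions, each with the specific red flags attached.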
UNDERWRITERS: MAXIMIZING RESULTS
The quality and scope of these third-party products and services will continue to improve, providing new opportunities for differentiation to early adopters. They will find better methods to price and evaluate risk. However, carriers, especially those writing large accounts, will always want submissions to pass the "smell test" before they commit their capital to insuring these risks. So underwriters won't need to look for a new career for quite some time.
With geographic information system (GIS) and geocoding technology, it's much easier for underwriters to price and minimize risk, but an underwriter who understands the technology will be able to maximize its results, using it to find answers on questionable risks, accurately price new business and establish sound underwriting guidelines.
A great example of the marriage between GIS and geocoding technologies and underwriting knowledge can be seen in how some carriers weathered Hurricane Katrina. Before Katrina struck, many insurers wrote policies that were just outside of Federal Emergency Management Agency flood zones. Unfortunately, after Katrina, underwriters learned that flood waters from hurricanes don't always stop at the edge of FEMA flood zones. As a result, many insurers suffered major flood losses on policies they knew were close to a flood zone but assumed would be fine because they weren't inside one.
Some insurers minimized their losses, though, because they leveraged address-level GIS and geocoding technology to review where their policies sat in relation to flood zones. This allowed underwriters to factor in the elevation of the properties in question and, from there, either charge higher premiums or decline the business.
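That proximity-plus-elevation review can be sketched as a simple decision function. The thresholds and surcharge levels here are entirely illustrative assumptions, not actual underwriting guidelines:

```python
def flood_action(distance_to_zone_mi, elevation_ft):
    """Decide what to do with a policy near (but outside) a FEMA flood
    zone, given its distance to the zone edge and ground elevation.
    Returns a surcharge rate, or None to decline. Thresholds are
    made up for illustration."""
    if distance_to_zone_mi > 0.5:
        return 0.0      # far enough from the zone: no surcharge
    if elevation_ft < 10:
        return None     # low-lying and close to the zone: decline
    if elevation_ft < 25:
        return 0.25     # close and moderately low: 25% surcharge
    return 0.10         # close but well elevated: 10% surcharge
```

A point-in-polygon test against zone boundaries alone would have passed all of these risks; it is the added elevation data, plus underwriter judgment about the cutoffs, that separates the carriers who avoided the losses.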
Underwriters are intuitive by nature, and although there have been major strides in underwriting technology, a human element is needed to scrutinize results. What looks good on paper doesn't always hold up in practice.
DAN MUNSON is founder of RiskMeter Online, an Internet application used to automate property risk reports. By simply typing in an address, users can get back natural hazard information for a given policy location.
April 15, 2008
Copyright © 2008 LRP Publications