It has to do with the long-standing tension between a natural interest in protecting sensitive, often confidential data and the need for speed.
Speed is only part of it, though. Equally important is how data is accessed, and how that access shapes the performance and effective use of risk tools and process support.
Like all technology, risk technology has been evolving since the earliest introduction of risk management information systems, a staple of traditional risk management used to manage loss, policy and exposure information.
By definition, this tool had limited use among a small number of users, typically risk management department staff. Platform design and complexity of use were modest, and most of these solutions worked effectively for most users. That remains the case today where these few, typically insurance-related needs are the only concerns.
Unfortunately, where progress is pushing against tradition, a conundrum is emerging. Move beyond the traditional and you begin to experience the myriad issues that surround the new and expanding data sets you'll deal with. The challenges are often driven by the growing number of system users you'll need to enable to perform risk assessments across many functional areas.
The issues you'll face include the complexity of limiting and controlling access to specific data sets; the vast permutations of reports that various stakeholders want to issue; the strain on the system from growing user numbers and users' limited patience for delays; the difficult-to-predict resource requirements of supporting a growing and diverse user community; the expectations of senior management and the board for accuracy and currency; and the diversity of user types, which drives varying levels of support and training.
Overarching these many issues for the effective use of risk technology is information security. In the past we've experienced this most often in the form of internal control of "confidential" or more sensitive data. Until now, the need-to-know approach ruled the day and provided a relatively clear road map for keeping Legal satisfied: limiting unnecessary sharing of information was the best policy for containing litigation exposure.
However, the conundrum comes from the direction technology is taking, and not just risk technology. That direction is toward Web-enabled solutions, which often mean data hosted by a third party on a remote server controlled by that party, or perhaps even by a fourth party. I think you see where this takes you rather quickly.
That pithy concept of "degree of separation" begins to have a new meaning when it comes to the separation of your control over your data by virtue of this evolving paradigm. Yet many of the performance-related issues that your users care about are often going to be best served by employing this approach.
What's needed now is a concerted effort among risk management stakeholders, including Legal, to develop a new strategy for information protection that meets legal and operational goals simultaneously. That strategy must recognize that the world of information management is evolving more rapidly than in the past, and that if we're to compete effectively and produce better results, we'll need more flexibility along with new and enhanced control strategies.
is the enterprise risk manager for a leading financial institution and a former president of the Risk and Insurance Management Society.
March 3, 2009
Copyright © 2009 LRP Publications