**MEASURING RISK EXPOSURE CONSISTENTLY**

**Create a Level Playing Field**

One of the many challenges that must be addressed in an ERM framework is ensuring a level playing field when allocating risk management resources to treatments. A level playing field requires that managers follow consistent definitions of “exposure” and “risk level.”

**Exposure Data Farming**

In many cases, an organization will estimate an exposure based on an 'expected' scenario and a 'worst-case' scenario, and allocate resources based on a prioritization of exposures and associated risk levels. This approach imputes a two-point (binomial) probability distribution to exposures, which may or may not be appropriate depending on the nature of the risk. In addition, it leaves room for subjectivity in interpreting what is meant by 'expected'. Finally, including a 'worst-case' estimate without a reciprocal 'best-case' estimate fails to position the risk as a potential opportunity.

Let’s assume that what is 'expected', from a risk level criteria standpoint, is what was planned, since the plan reflects the commitment management is willing to make to stakeholders. If there is no plan, or the plan is zero, then it is important to establish a reasonable benchmark for determining risk levels. If no benchmark or plan is established, the usefulness of reporting is reduced, and the allocation of resources for treatments may not be objective, equitable, or optimal.

For example, suppose history indicates that a certain plant incurs, on average, $1 million a year in maintenance costs related to catastrophic events, but no benchmark is applied in analyzing risk likelihood or consequence; the 'expected' scenario then becomes $0. What then is the risk level? From a planning standpoint, the risk level is an expected variance in earnings of ($1M), and from a risk management standpoint, the risk level is the worst-case scenario. Compare this to another plant in the same organization, with the same risk, that includes the observed $1M historical cost in its plan for catastrophe-related maintenance. The second plant’s plan risk analysis will have a risk level of $0, and its exposure from a risk management perspective becomes the worst-case scenario less the anticipated $1M. If resources are allocated based on these risk levels, the first plant will receive more than it should, seemingly rewarding ignorance. Furthermore, if risk management resources are allocated based on expected payoffs, the payoffs will be inflated for risks modeled without appropriate expectations.
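The two-plant comparison can be sketched numerically. The figures below are hypothetical (the $5M worst case and the `risk_level` helper are invented for illustration), but they show how omitting the historical benchmark from the plan inflates the measured risk level:

```python
# Hypothetical figures: both plants face the same risk (a historical average
# of $1M/yr in catastrophe-related maintenance, and an assumed $5M worst
# case), but only Plant B includes the $1M in its plan.
HISTORICAL_AVG = 1_000_000
WORST_CASE = 5_000_000  # assumed worst-case scenario, for illustration only

def risk_level(worst_case: float, planned: float) -> float:
    """Risk level as exposure above what the plan already commits to."""
    return worst_case - planned

plant_a = risk_level(WORST_CASE, planned=0)               # no benchmark in plan
plant_b = risk_level(WORST_CASE, planned=HISTORICAL_AVG)  # plan reflects history

print(plant_a)  # 5000000 -- inflated: the expected $1M loss counts as "risk"
print(plant_b)  # 4000000 -- only the variance beyond the plan is exposure
```

Under these assumptions, the plant that ignores its own history reports a risk level $1M higher than its better-planned twin, and draws resources accordingly.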

One solution for overcoming this pitfall is to implement a consistent planning process that requires realistic plans based on historical observations, uniformly across the enterprise. However, anyone familiar with planning knows that this is not always practical or reliable. A better solution is to establish a risk management policy defining the criteria used for deriving risk levels, linked to historical actual data or some other common standard.

While it may not be practical for all risks at the outset, if the approach is implemented methodically, starting with the highest-magnitude risks and expanding incrementally, then over time the repository of risk level data becomes a valuable risk management asset. The value comes from the ability to farm the data, state with confidence what the expected and worst-case scenarios are, and allocate risk management resources more equitably across the enterprise risk portfolio. An added benefit is the ability to apply consistent payoff hurdles (e.g., potential payback or ROI) across the enterprise.

**Standardizing Tolerance**

The above example discusses “expected” versus “worst-case” scenarios in estimating exposures, and the implicit binomial probability distribution therein (exposure will either be the expected scenario, or it will be the worst case). While some exposures do follow a binomial distribution, the goal of an ERM framework should be to develop the skill sets within the extended risk management community to apply sound quantitative methods that have predictive value. In the example above, data would accumulate over time to calculate Poisson probabilities for the number of catastrophic events, along with the average and standard deviation of their cost. Armed with this data, risk managers can derive low, expected, and high-end estimates for the number of catastrophic events and their average cost, and combine these into low, expected, and high estimates for total exposure.
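As a rough sketch of the Poisson approach, assuming hypothetical historical figures (an average of 2 catastrophic events per year and a $500k mean cost per event, both invented for illustration), the event probabilities and expected exposure can be computed with the Python standard library alone:

```python
from math import exp, factorial

# Hypothetical history, for illustration only: an average of 2 catastrophic
# events per year, with a mean cost of $500k per event.
lam, mean_cost = 2.0, 500_000

def poisson_pmf(k: int) -> float:
    """Probability of exactly k catastrophic events in a year (rate lam)."""
    return exp(-lam) * lam**k / factorial(k)

for k in range(6):
    print(f"P({k} events) = {poisson_pmf(k):.3f}")

# Expected annual exposure = expected event count x mean cost per event
expected_exposure = lam * mean_cost
print("expected annual exposure:", expected_exposure)  # 1000000.0
```

The same historical data yields the standard deviation of event costs, which feeds the low- and high-end estimates discussed next.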

A data-centric approach solves three problems at once. First, it demystifies the “worst-case” scenario and removes any perception of subjectivity in calculating exposure. Second, it sets the stage for enforcing a consistent risk appetite. How? Instead of defining tolerance in terms of total dollars or a worst-case scenario, organizations can manage to a specific tolerance level in terms of alpha. Your policy can mandate that, for catastrophic events, risk levels be based on an alpha level of 5% (a 95% interval, with 2.5% in each tail) for both likelihood and consequence. The organization then knows that in determining risk levels, the worst-case scenario is the number of events corresponding to the 2.5% right-tail Poisson probability, combined with the upper limit of a 95% confidence interval for consequences. The caveat is that these probabilities must take into consideration changes over time in the frequency of events and the scope of consequences; in other words, the data used in the estimates needs to be indexed to changes in the maximum potential loss over time. Once the likelihood and consequence ranges are defined according to consistently applied tolerance standards, the product of the likelihood and consequence distributions can be used to calculate the expected and worst-case scenarios, as well as best-case scenarios. Finally, the historical data can be used to enforce the expected scenario.
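A minimal sketch of the alpha-based tolerance policy, using only the Python standard library. The cost figures, the `scenario_range` helper, and the use of a normal-theory confidence interval on mean cost are all assumptions for illustration, and the per-event costs are assumed to be already indexed to today's maximum potential loss, per the caveat above:

```python
from math import exp, factorial, sqrt
from statistics import NormalDist, mean, stdev

def poisson_quantile(lam: float, q: float) -> int:
    """Smallest event count whose cumulative Poisson probability reaches q."""
    k, cum = 0, exp(-lam)
    while cum < q:
        k += 1
        cum += exp(-lam) * lam**k / factorial(k)
    return k

def scenario_range(event_rate: float, costs: list, alpha: float = 0.05):
    """Best/expected/worst-case annual exposure at a policy tolerance alpha.

    `costs` are historical per-event costs, assumed already indexed to
    current maximum potential loss.
    """
    z = NormalDist().inv_cdf(1 - alpha / 2)             # ~1.96 for alpha = 5%
    m = mean(costs)
    half = z * stdev(costs) / sqrt(len(costs))          # CI half-width on mean cost
    lo_n = poisson_quantile(event_rate, alpha / 2)      # best-case event count
    hi_n = poisson_quantile(event_rate, 1 - alpha / 2)  # worst-case event count
    return lo_n * (m - half), event_rate * m, hi_n * (m + half)

# Hypothetical history: 2 events/yr on average; six observed per-event costs.
costs = [420_000, 510_000, 480_000, 650_000, 390_000, 560_000]
best, expected, worst = scenario_range(event_rate=2.0, costs=costs)
print(best, expected, worst)
```

With these invented inputs, the best-case event count at a rate of 2/yr is 0 (so best-case exposure is $0), while the worst case is the 97.5th-percentile event count times the upper confidence limit on cost; every risk owner applying the same alpha produces comparable numbers.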

This approach improves the fairness, transparency, and quality of reporting. Fairness is improved because the ERM framework has leveled the playing field with policies that standardize the definitions of risk level and tolerance. Transparency is improved because the approach relies on actual historical data. Reporting is improved because all exposure is defined based on uniform calculations of expected and worst-case scenarios. The most significant benefit, though, is that a standardized approach is easier to repeat, and thereby easier to disseminate throughout the organization.

**How RiskWaves Helps**

The RiskWaves ERM system is designed to support the ISO 31000 Compatible Framework for Implementing ERM. Our system allows you to establish the context for your risks; define risks and exposures; analyze exposures with a variety of probability distributions when applicable; evaluate and treat exposures; and monitor, review, and communicate risk management activities. The RiskWaves Exposure Analysis module allows you to copy historical likelihood and consequence data directly into RiskWaves and convert it into confidence and prediction intervals at various alpha values, generating financial estimates in a matter of seconds. If your exposure is more complicated, you can build a custom regression formula to develop estimates. You are invited to try RiskWaves for 60 days, risk-free. Your download includes a thorough user guide and over 80 pages of visual examples. The system can be used on a single computer, or implemented in a locally hosted or cloud-based client-server configuration for multiple users.
