
Risk.net: Banks Told To Seize Moment To Devise New Op Risk Charge

Posted: 6 July 2016  |  Author: Alexander Campbell  |  Source: Risk.net

Rather than battling supervisors’ plans for a standardised measurement approach (SMA) to operational risk capital, banks should club together to come up with an alternative method that preserves some of the better aspects of op risk modelling, a forthcoming report will argue.

The Basel Committee on Banking Supervision outlined the details of its proposed SMA charge in March. If it is taken forward, as expected, it would replace all existing approaches to op risk capital. That includes the simpler standardised approaches and the advanced measurement approach (AMA) – the own-models method used by more sophisticated banks.

Swapping the AMA for the SMA is largely seen as a bad deal by op risk managers. Not only is the new charge expected to increase capital levels, but it is also seen as less risk-sensitive. The SMA relies heavily on internal loss data without any of the forward-looking elements used in banks’ AMA models, such as scenario analysis.

But the report by Chartis Research – owned by Incisive Media, the publisher of Risk.net – argues that the introduction of the SMA is an opportunity to develop a new industry standard. This could be based on the type of forward-looking modelling already used in other industries, and presented as a fait accompli to the Basel Committee before the SMA is implemented. Because legislative changes would be needed first, implementation is unlikely to happen before 2021.

“Why don’t the 20 or 30 top banks in the world get together among themselves and come up with a robust alternative operational risk model, using Bayesian methods and causal effects?” asks one of the report’s authors, a senior risk manager at a UK bank. “They can get clever and develop their own model over the next three years – including insurance mitigation, for example – and then go to the Basel Committee.”

Peyman Mestchian, London-based managing partner at Chartis, says such an exercise would provide the industry with an opportunity to move past its focus on regulatory arbitrage and towards best practice in op risk modelling. “The reason [banks] all have their own models is because they are all trying to save capital somewhere by using, for example, a Pareto distribution as opposed to some other distribution; you spend your life getting caught up in your own model instead of looking at how to model,” he says. “[But] the Basel Committee isn’t buying that; they don’t like this gaming any more.”
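
To see why the choice of severity distribution matters so much, consider the rough sketch below. All parameters are invented for illustration – they are not any bank's calibration – but the two severity assumptions are set to the same average loss per event, so only the shape of the tail differs.

```python
import numpy as np

rng = np.random.default_rng(7)

def annual_losses(severity_sampler, lam=50.0, years=100_000):
    """Compound-Poisson simulation: Poisson event counts, i.i.d. severities per event."""
    counts = rng.poisson(lam, size=years)
    severities = severity_sampler(counts.sum())
    year_ids = np.repeat(np.arange(years), counts)    # map each event to its year
    return np.bincount(year_ids, weights=severities, minlength=years)

# Two severity assumptions, both set to an average loss of EUR 1m per event.
def lognormal_severity(n):
    return rng.lognormal(mean=np.log(1e6) - 0.5, sigma=1.0, size=n)

def pareto_severity(n):
    alpha = 1.2
    x_min = 1e6 * (alpha - 1) / alpha                 # scale chosen to keep the EUR 1m mean
    return x_min * (rng.pareto(alpha, size=n) + 1.0)

for name, sampler in (("lognormal", lognormal_severity), ("pareto", pareto_severity)):
    agg = annual_losses(sampler)
    print(f"{name:9s} 99.9% annual-loss quantile: {np.quantile(agg, 0.999):.3e}")
```

Even with identical average losses, the heavier-tailed assumption typically produces a 99.9% annual-loss figure several times larger – precisely the kind of model-choice sensitivity that invites the gaming Mestchian describes.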

Mestchian says more sophisticated modelling techniques are already in use in other industries, such as the aviation and nuclear sectors. Modelling a bank’s operations as a Bayesian network, with probabilities assigned to each node in the network, could lend itself well to quantifying tail risk, for example. He notes that a few banks have already received regulatory approval for risk models based on statistical and Bayesian methods.
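
For a flavour of what such a network might look like, the toy sketch below forward-samples a three-node causal chain – control failure, incident, loss severity – with invented probabilities at each node, and reads a tail quantile off the simulated annual losses. A real implementation would have many more nodes and would be calibrated with data and expert judgement; this is only a schematic of the technique.

```python
import numpy as np

rng = np.random.default_rng(11)
YEARS = 100_000

# Exposure: events "at risk" per year; every number below is made up for illustration.
attempts = rng.poisson(50.0, size=YEARS)
year_ids = np.repeat(np.arange(YEARS), attempts)     # map each event to its simulated year
n_events = attempts.sum()

control_failed = rng.random(n_events) < 0.05              # node 1: P(control failure)
p_incident = np.where(control_failed, 0.20, 0.01)         # node 2: P(incident | control state)
incident = rng.random(n_events) < p_incident
mu = np.where(control_failed, 13.0, 11.0)                 # node 3: lognormal severity,
sigma = np.where(control_failed, 1.5, 1.0)                #         heavier when controls fail
severity = np.where(incident, rng.lognormal(mu, sigma), 0.0)

annual_loss = np.bincount(year_ids, weights=severity, minlength=YEARS)
print(f"99.9% annual-loss quantile: {np.quantile(annual_loss, 0.999):,.0f}")
```

Because the loss distribution is assembled from explicit causal drivers, the effect of halving the control-failure probability, or of adding an insurance-recovery node, can be read straight off the same simulation – the kind of forward-looking sensitivity the SMA's critics say a pure loss-data charge cannot offer.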

However, he is downbeat about the chances of this happening in the short term. Most of the largest global banks are currently busy “firefighting” other regulatory and accounting initiatives, such as the US Comprehensive Capital Analysis and Review and International Financial Reporting Standard 9. This is likely to lead to a drop in the level of management attention given to operational risk, he adds.

The Chartis report, New op risk measurement standards: flawed regulations based on flawed models, is due to be published later this year. In addition to urging the industry to unite around a better alternative to the SMA, the report adds to industry criticism of the Basel Committee’s controversial proposal.

The senior risk manager at a UK bank points to the SMA’s overreliance on historical loss data as one of its most salient weaknesses. “The problem with this approach is that zero losses in the past is no evidence that there will not be op risk in the future, and large losses in the past are no predictor of losses in the future,” he says.

While models based on loss data have worked well for other risk categories, such as market and credit risk, those are areas with far richer loss-data histories, he adds. That is particularly true in the tail of the distribution, which dominates any model-based calculation of op risk capital.

Chartis’s Mestchian agrees. “This is about the tail of the distribution, and you don’t have sufficient tail loss data. If a bank did, it would be dead already,” he says. “In credit risk there are sufficient counterparty failures: there are thousands if not millions of data points. It is a different modelling environment.”
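
A back-of-the-envelope simulation makes the point about data scarcity concrete. In the sketch below every number is assumed: the "true" severity of large losses follows a Pareto distribution above a EUR 10m threshold, and the tail index is re-estimated from repeated samples of just 15 such losses – a deliberately small sample, in the spirit of his point.

```python
import numpy as np

rng = np.random.default_rng(3)
TRUE_ALPHA, X_MIN = 1.8, 10e6     # assumed "true" tail: Pareto above a EUR 10m threshold
N_OBS, TRIALS = 15, 10_000        # 15 observed large losses, re-fitted 10,000 times

losses = X_MIN * (rng.pareto(TRUE_ALPHA, size=(TRIALS, N_OBS)) + 1.0)
alpha_hat = N_OBS / np.log(losses / X_MIN).sum(axis=1)        # tail-index MLE per trial
est_q999 = X_MIN * (1 - 0.999) ** (-1.0 / alpha_hat)          # implied 99.9% severity quantile

true_q999 = X_MIN * (1 - 0.999) ** (-1.0 / TRUE_ALPHA)
lo, hi = np.quantile(est_q999, [0.05, 0.95])
print(f"true 99.9% severity quantile: {true_q999:,.0f}")
print(f"90% of re-estimates fall between {lo:,.0f} and {hi:,.0f}")
```

With so few observations above the threshold, the implied 99.9% severity quantile – the region that drives a model-based capital number – typically swings across more than an order of magnitude around its true value.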

Responding to the report’s findings, one London-based head of op risk is pessimistic about the outlook for an industry-backed alternative to the SMA. Even if industry associations backed a different approach, banks are far from unified on best practice in many areas of op risk management, the head says, and may not be cohesive enough to agree on such a solution in time.

The head argues the only way to combine Basel’s desire for standardisation with some forward-looking elements is to use Pillar 2, the discretionary charge levied by local supervisors. In particular, the head suggests that regulators make Pillar 2 capital more sensitive to control factors and external risk developments, such as the rise of cyber crime and technological changes.

In a recent interview with Risk.net, the Australian Prudential Regulation Authority suggested it would be willing to consider the use of forward-looking elements such as scenario analysis under Pillar 2.