Risk Accounting, Modeling and Non-Financial Risk: An Argument for Better Integration
We survived the last great crisis. Are there lessons in expected-loss accounting that will help us through the next one?
Friday, August 13, 2021
By Peter Hughes
Not much more than a decade ago, we were in the midst of the global financial crisis. In contrast to the crisis that is now looming - climate change - banks were then the polluters, not with toxic emissions and industrial waste, but with subprime waste that massively depleted the financial system's capitalization. It required the intervention of governments and regulators with a three-stage program to prevent a global economic meltdown:
-- Taxpayers' funds were used to recapitalize the banking system.
-- A regulatory straitjacket in the form of Basel III imposed more stringent controls and limits on banks' risk-taking.
-- The Basel Committee on Banking Supervision removed banks' in-house operational risk regulatory capital calculation models, collectively referred to as the advanced measurement approach (AMA), from the regulatory framework.
The AMA is being replaced with a non-model, standardized capital calculation method that uses risk proxies in place of risk measurement. The effect is to coerce banks into holding eye-watering levels of inert, unproductive, and costly capital reserves to act as a buffer against extreme unexpected losses, irrespective of the quality of banks' risk mitigation activities and processes.
Risk Valuation - An Important Lesson
The three-stage response demonstrated that regulators and legislatures can intervene to good effect to ameliorate the worst effects of ongoing and impending crises where they have the direct influence and powers to do so. However, replicating such remedies to reduce climate change risk across all industries and jurisdictions globally in a coherent and coordinated way borders on the infeasible. There will inevitably be countries and industries that will find ways to block externally conceived restrictions on their commercial and industrial outputs if they conclude such restrictions are damaging to their economic well-being.
Notwithstanding the foregoing, it is evident that, despite more than 20 years of endeavor, the stochastic models developed by banks to value operational risk haven't worked. This is borne out by a Basel Committee announcement in 2016 that its “review of banks' operational risk modeling practices and capital outcomes revealed that the AMA's inherent complexity, and the lack of comparability arising from a wide range of internal modeling practices, have exacerbated variability in risk-weighted asset calculations, and eroded confidence in risk-weighted capital ratios.”
The committee therefore proposed to remove the AMA from the regulatory framework.
When the AMA was first mooted, dissenting voices foresaw this lamentable outcome. One was that of John Sherwood, who in a 2005 article presciently enumerated problems with the AMA. He also referenced the operational risk quantification technique pioneered at Chase Manhattan Bank (now JPMorgan Chase) as a viable and promising alternative. He commented that such a method “may prove to be a more profitable line of enquiry than the continuing work on attempting to develop sophisticated statistical models based on loss distributions. This does however imply that much of the current work on the development of AMA techniques may be redundant and wasted.”
If the purpose of a non-financial risk valuation model is to estimate with reasonable accuracy the probability and severity of future outcomes in multiple scenarios on a worldwide scale, then it needs to be tied to reliable and comparable sources of point-in-time quantifications of exposure to risk. In the case of financial (credit and market) risks, accounting provides that source, whereby the accuracy and comparability of exposure are assured through internationally adopted accounting and auditing standards.
In the case of non-financial risks, an equivalent accounting source is not available, so the banks lobbied the Basel Committee to accept their in-house AMA models that deduce exposure by modeling historic loss distributions.
It is axiomatic - and the point is made in the Basel Committee's 2016 announcement - that the output of a risk calculation method used to determine regulatory capital requirements must be comparable from bank to bank. The comparability requirement could never have been satisfied by the AMA for one reason in particular: the time lag that invariably exists between the emergence of a heightened susceptibility to material unexpected loss - aka "exposure to risk" - and the actual realization and registration of losses emanating from that susceptibility.
Indeed, the losses that reside in the tail of the loss distribution are likely to be many years in the making, and many more years in investigation and resolution, before a realized loss can be recorded. The existence of such time lags invalidates historic losses as a usable source of comparable point-in-time valuations of non-financial exposure to risk.
An Accounting Solution
By comparison, the accounting solution proposed in the RASB white paper is applied to the daily trades and sales registered in official accounting systems. Risk weights are assigned according to each product's inherent risk properties, including environmental aspects, and to the daily volumes and values transacted. The organization's risk mitigation effectiveness is expressed as a scalable index derived by benchmarking the status of risk-mitigating processes and activities against industry consensus best practices. These factors are combined in an algorithm that computes point-in-time exposure to non-financial risk using three core metrics: inherent risk, risk mitigation index, and residual risk.
In this way, the outputs of an industry that consumes, produces, and distributes large volumes of materials and products that are high in toxicity and/or combustibility and low in biodegradability will be assigned correspondingly high risk weights, resulting in high inherent risk. If the risk mitigation effectiveness of an organization's production environment is weak, a low risk mitigation index will be the outcome, leading to high residual risk, which is a measure of the relative damage an organization is inflicting on natural or financial capital.
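The interplay of the three core metrics can be sketched in a few lines of code. This is a minimal illustration only, not the RASB algorithm itself: the product risk weights, the benchmark scores, and in particular the assumption that residual risk equals inherent risk scaled by the shortfall from a perfect risk mitigation index are all hypothetical choices made for the sketch.

```python
# Hypothetical sketch of the three core metrics described above:
# inherent risk, risk mitigation index (RMI), and residual risk.
# Risk weights, volumes, and benchmark scores are invented for illustration.

def inherent_risk(trades):
    """Sum of value transacted x product risk weight over the day's trades."""
    return sum(t["value"] * t["risk_weight"] for t in trades)

def risk_mitigation_index(scores):
    """Benchmarking against best practice: mean process score on a 0-100 scale."""
    return sum(scores) / len(scores)

def residual_risk(ir, rmi):
    """Assumed relation for this sketch: the weaker the mitigation,
    the larger the share of inherent risk left unmitigated."""
    return ir * (1 - rmi / 100)

trades = [
    {"value": 5_000_000, "risk_weight": 0.8},  # high-toxicity product
    {"value": 2_000_000, "risk_weight": 0.2},  # low-risk product
]
scores = [70, 55, 80]  # best-practice benchmark scores per process

ir = inherent_risk(trades)           # 4,400,000 risk-weighted units
rmi = risk_mitigation_index(scores)  # ~68.3 on a 0-100 scale
rr = residual_risk(ir, rmi)          # ~1,393,333 units of residual risk
```

Under these invented numbers, a weaker production environment (lower benchmark scores) would pull the index down and push residual risk up, which is exactly the behavior the paragraph above describes.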
Residual risks are then valued and accounted for in financial statements in the form of expected loss provisions, thereby assigning full accountability to polluters for the damage they cause.
In business, there is a truism: If you want organizations to focus on an issue, hit their wallets. In the case of non-financial risk, that means introducing accounting and auditing standards that require charging net income and equity with the cost of accepted risks. In the case of climate change, the weaker or less green the risk-mitigating activities and processes, the greater the charge, which adds to the cost of sales and reduces the dividends and bonuses that can be paid to investors and employees, respectively. This is what the non-financial risk valuation method proposed in the RASB white paper is intended to achieve.
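The wallet-hitting mechanism can also be sketched. The valuation rate per unit of residual risk and the income figures below are hypothetical and are not taken from the RASB white paper; the sketch only shows the direction of the effect: accepted risk becomes a provision, and the provision reduces distributable income.

```python
# Hypothetical illustration: charging an expected-loss provision for
# accepted non-financial risk against net income. All figures invented.

def expected_loss_provision(residual_risk_units, value_per_unit):
    """Value residual risk as an expected-loss provision (assumed linear)."""
    return residual_risk_units * value_per_unit

def distributable_income(net_income, provision):
    """The provision is charged to income, reducing what can be paid
    out as dividends and bonuses."""
    return net_income - provision

provision = expected_loss_provision(1_400_000, 0.05)   # 70,000 charge
remaining = distributable_income(10_000_000, provision)  # 9,930,000
```

A greener, better-mitigated operation would carry lower residual risk, a smaller provision, and therefore more distributable income, which is the incentive the article argues for.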
If the question is whether the plotting of the roadmap that avoids climate-change Armageddon should be based on modeling or accounting, the lesson learned from the global financial crisis is that it should be based on both. The mistake made 20 years ago, when banks first experimented with non-financial risk modeling, was to believe that adopting one approach excludes the other. In truth, the valuation of point-in-time non-financial risk exposure and the estimation of the probability and severity of future outcomes require a fully integrated modeling and accounting solution.
Peter Hughes (email@example.com) is chairman of the Risk Accounting Standards Board. He is also a visiting fellow and advisory board member of the Durham University Business School's Centre for Banking, Institutions and Development (CBID), where he leads research into risk-based accounting systems. He was formerly a banker with JPMorgan Chase.