7 August 2010
Do the world’s leading banks have a firm handle on the true value of their assets, risk exposure (including the likelihood of their borrowers defaulting), solvency, liquidity and capital strength? And even if they do have such information at their fingertips, are they accurately communicating it into the public domain for the benefit of regulators and shareholders?
Such questions ought to be top-of-mind for any politician or regulator who is seeking to re-regulate the banking sector in the wake of the global financial crisis. They are also questions which—depending on the answers—might cast doubt on the legitimacy of the recent European “stress tests” designed to confirm the capital strength of 91 European banks and their ability to survive another downturn.
The onset of the credit crisis, after which many investors began to doubt the accuracy of the audited accounts of developed world banks, prompted me to start exploring such matters. My suspicion that the answers to the questions might be “no”, “no”, “no” and “no” has intensified in recent weeks; it was reinforced by a recent article in The Economist.
The article, headlined “Computer says no” and published on July 24th, quoted an anonymous ex-RBS source as saying: “The reality was you could never be certain anything was correct … Reported numbers for the bank’s exposure were regularly billions of dollars adrift of reality. Finding the source of the error was hard.”
This is astonishing. If true, it suggests that, under former chief executive Sir Fred Goodwin, RBS’s internal management systems had been allowed to atrophy to the extent they had become incapable of channeling accurate information to those at the top; and that scope for inaccuracies in the bank’s accounts—audited by ‘big four’ accountancy firm Deloitte—must have become immense.
The anonymous quote corroborates my own research into RBS. It seems that Goodwin was so obsessed with cost-cutting and empire-building, he refused to invest in the sort of internal IT systems that would have permitted timely and accurate data flows around the bank.
The Economist article points out that the problem is by no means limited to RBS. Indeed it claims that it is common to most of the world’s leading banks, and says that it stems from the fact that many still use the original mainframe computers they first installed in the 1960s(!) The article adds that “over the years more and more systems have been slapped on”. Most leading banks have grown by acquisition, giving rise to a veritable palimpsest of systems, accounting methodologies and approaches to risk valuation.
Another problem is that, even if the banks did have access to 100% reliable internal data, few would be inclined to communicate this to their regulators or investors. It seems that very few banks have overcome their pre-credit crisis addiction to obfuscation and fudge – even though these precipitated the crisis. A case in point is whether Lloyds Banking Group is being entirely honest about the value of the labyrinthine portfolio assembled by Peter Cummings, the ex-head of corporate lending at HBOS.
Bankers’ continuing desire to drift in a sea of inexactitude became clear in early July, when lobbyists for the US banking sector launched an email and web campaign aimed at seeing off any extension of mark-to-market (“fair value”) accounting rules. The change, proposed by the Financial Accounting Standards Board, would force leading American financial groups such as Citigroup and Wells Fargo to write down billions of dollars of assets.
The American Bankers Association fears that the FASB’s proposed extension of fair value accounting to all financial instruments, including loans as opposed to just securities, might make otherwise “strong” banks look as though they are under-capitalized. Plus ça change!
- This article was published on QFINANCE on July 28th, 2010