Data on the capital and liquidity of banks are the navigation aids that regulators depend on to avoid another financial crash. Improvements to these indicators, adopted last year by the Basel Committee on Banking Supervision, are among the most heralded regulatory reforms since the 2008 crisis. But what if the instruments are faulty, even in their upgraded form? If so, regulators are flying blind, and our chances of avoiding another crash are slim. What can be done?
A recent paper by three prominent financial economists suggests one possible answer: a sort of Manhattan Project that would map out a “risk topography” of the financial system. The authors are Markus K. Brunnermeier of Princeton, Gary Gorton of Yale, and Arvind Krishnamurthy of Northwestern. All three are also affiliated with the National Bureau of Economic Research. (I will refer to the team in what follows as BG&K.)
Their work on risk topography is part of a growing literature on macroprudential regulation of the financial system. Traditional microprudential regulation focused on the safety and soundness of individual institutions. It operated on the implicit premise that if each institution was sound, then the system as a whole would be sound, too. Macroprudential regulation, in contrast, recognises that interactions among disparate institutions—commercial banks, investment banks, hedge funds, derivatives markets, and all the rest—may pose threats to the system as a whole even when each firm taken separately appears sound. The need for better macroprudential regulation was recognised in last year’s Dodd-Frank Act, which created an interagency Financial Stability Oversight Council (FSOC) to deal with systemic risks. It was the subject of an important speech that Fed Chairman Ben Bernanke gave in Chicago last week.
The problem, say BG&K, is that as things now stand, macroprudential regulation cannot be effectively implemented because the FSOC lacks the data needed to measure systemic risk. BG&K compare the situation to that faced by Presidents Hoover and Roosevelt in the early years of the Great Depression. Because national income accounts did not then exist, those presidents and their advisors struggled to develop stabilisation policy using fragmentary data like factory output and boxcar loadings.
Some people have reacted to the data deficit by throwing up their hands in surrender. For example, in a recent Financial Times op-ed, former Fed Chairman Alan Greenspan argued that modern financial markets are “unredeemably opaque,” and that neither regulators nor anyone else can ever “get more than a glimpse” of their internal workings. If so, attempts at macroprudential regulation would not just be doomed to failure, but would have harmful unintended consequences.
BG&K are not willing to accept the opacity of financial markets as irremediable. Instead, they propose developing a whole new system of reporting and measurement, no less ambitious in its scope than the national income accounts. It is hard to summarise the breadth of their proposal in a few words, but some key points will give a sense of their general approach.
First, BG&K point out that in assessing systemic risk, it is not enough simply to look at balance sheet measures during periods of calm. Regulators need to know where pockets of risk are building up within the system in ways that are not revealed by existing balance-sheet-based measurements of liquidity and capital adequacy. To capture this, BG&K propose requiring financial firms to report, on a regular basis, their sensitivity to a list of specified scenarios. For example, firms might be asked to estimate their dollar gain or loss if house prices rise or fall by 5, 10, or 15 per cent, and also how such events would affect their liquidity position. Sensitivity estimates like these are already required as part of stress tests that regulators conduct from time to time, but BG&K propose gathering the data more frequently and from more institutions.
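The kind of scenario-sensitivity report described above can be sketched in a few lines. Everything here is an invented illustration: the function name, the $2bn exposure, and the simple linear response are all assumptions, since a real firm would draw these estimates from its own internal risk models rather than a one-line formula.

```python
# Hypothetical sketch of a scenario-sensitivity report of the kind BG&K
# envision. All exposures and numbers are invented for illustration.

def scenario_report(exposure_usd, shocks):
    """Estimated dollar P&L for each fractional shock to house prices,
    assuming (unrealistically) a simple linear exposure."""
    return {f"{s:+.0%}": exposure_usd * s for s in shocks}

# A firm with a made-up $2bn net long exposure to house prices:
report = scenario_report(2_000_000_000, [-0.15, -0.10, -0.05, 0.05, 0.10, 0.15])
for shock, pnl in report.items():
    print(f"house prices {shock}: estimated P&L ${pnl / 1e6:+,.0f}m")
```

Collected regularly across many firms, even a simple grid like this would let regulators aggregate exposures to a common shock rather than inspecting each balance sheet in isolation.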
The BG&K approach also focuses on feedback mechanisms between problems of capital adequacy and problems of liquidity. They are particularly concerned with “liquidity spirals” that begin when firms that use short-term funding to finance longer-term investments experience runs or have trouble rolling over short-term borrowing. Such firms are then forced to sell illiquid assets at fire-sale prices. Those sales, in turn, reduce capital and lead to further funding problems. When liquidity spirals, off-balance-sheet positions, derivatives, and collateral requirements are taken into account, concepts like leverage and liquidity, which are well defined in simple, stylised models, become fuzzy and hard to measure on the basis of data derived solely from balance sheets.
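The feedback loop in a liquidity spiral can be made concrete with a toy simulation. This is a minimal sketch under invented assumptions (the leverage cap, the price-impact coefficient, and all starting values are made up), not a model from the BG&K paper: a leveraged firm takes an initial loss, must sell assets to get back under its leverage limit, the sales depress prices, the lower prices create fresh losses, and the cycle repeats.

```python
# Toy liquidity-spiral simulation. All parameters are invented.

def liquidity_spiral(assets, equity, max_leverage, shock, price_impact, rounds=10):
    """Iterate forced deleveraging after an initial price shock.
    price_impact is the fractional price fall per $1bn of assets sold."""
    equity -= assets * shock              # initial mark-to-market loss
    assets *= (1 - shock)
    path = [(assets, equity)]
    for _ in range(rounds):
        excess = assets - max_leverage * equity
        if excess <= 0 or equity <= 0:    # back within the leverage cap, or wiped out
            break
        assets -= excess                  # sell just enough to restore leverage
        fall = price_impact * excess / 1e9   # fire-sale price decline from the sales
        equity -= assets * fall           # remaining book is marked down again
        assets *= (1 - fall)
        path.append((assets, equity))
    return path

# $10bn of assets, $1bn of equity, 10x leverage cap, a 2% initial shock:
path = liquidity_spiral(assets=10e9, equity=1e9, max_leverage=10,
                        shock=0.02, price_impact=0.005)
```

In this run the spiral erodes equity well beyond the direct 2 per cent hit, which is exactly the amplification that balance-sheet snapshots taken in calm periods fail to reveal.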
Cross scenarios that involve interactive exposure to two or more different risks are the third problem addressed by the BG&K proposal. Their paper uses the example of a U.S. bank that buys Spanish mortgage-backed securities, denominated in euros, leaving it exposed both to the risk of falling Spanish housing prices and to that of euro depreciation. If both risks materialise simultaneously, the impact on the bank may be greater than the sum of the impacts of the two events taken individually, and may not be adequately revealed by anything the bank would be required to report under the current system.
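One hypothetical way losses can become more than additive is through a threshold effect: each shock alone stays within the bank's loss buffer, but the joint shock breaches it and triggers a forced fire-sale at an extra haircut. The sketch below illustrates that mechanism with invented numbers; the buffer, haircut, and position size are all assumptions, not figures from the BG&K paper.

```python
# Invented illustration of a cross-scenario interaction: a US bank holding
# euro-denominated Spanish MBS, with a fire-sale penalty once losses
# breach a buffer. All numbers are made up.

def loss(position_usd, house_fall, euro_fall, buffer, fire_sale_haircut=0.10):
    """Mark-to-market loss in USD on a long euro-denominated housing asset.
    A fire-sale haircut applies only once the loss breaches the buffer."""
    base = position_usd * (1 - (1 - house_fall) * (1 - euro_fall))
    if base > buffer:
        base += position_usd * fire_sale_haircut
    return base

pos, buf = 1e9, 150e6                       # $1bn position, $150m loss buffer
housing_only = loss(pos, 0.10, 0.00, buf)   # ~$100m: within the buffer
euro_only    = loss(pos, 0.00, 0.10, buf)   # ~$100m: within the buffer
joint        = loss(pos, 0.10, 0.10, buf)   # breaches the buffer -> extra haircut
assert joint > housing_only + euro_only     # joint loss exceeds the sum of parts
```

Nothing in a report of the two single-risk sensitivities alone would reveal the jump in the joint scenario, which is why BG&K want cross scenarios reported explicitly.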
The end product of the required reporting would be a multidimensional “risk map” of the financial system that would make visible all manner of risk pockets, sinkholes, pitfalls, soft spots, and other hazards, not only as they exist at the moment, but as they would shift and grow with changes in interest rates, exchange rates, asset prices, and so on.
A final key feature of the BG&K proposal is to make the resulting risk map of the financial system publicly available, just like the national income accounts and the Fed’s flow of funds accounts. Public availability of the data would do more than just increase transparency. More importantly, availability of the new data would stimulate the development of macroeconomic models that better incorporate the financial sector than today’s models do. The authors point out that when national income accounts and flow of funds accounts were first introduced, no one really knew how to use them. Their full value became apparent only over time as models based on them were developed.
Is this ambitious risk topography project feasible? I can see two kinds of barriers to its effective implementation.
First, the very complexity and novelty of the project would make it expensive and time-consuming to implement. True, as BG&K point out, many building blocks of a risk mapping system already exist. Past experience with stress testing of financial institutions provides one building block. Another is provided by the internal risk models already in use by the financial firms for their own purposes. Presumably, well-run firms are already equipped, at the click of a mouse, to answer questions about the impact on their operations of changes in asset prices and exchange rates. Even so, figuring out just what scenarios should be explored, how to assemble the resulting data, and how to ensure its integrity would be enormous tasks. One imagines a long process of sampling, beta-testing, and revision before the system would be up and running. Even then, by its very nature, we would not know whether the whole exercise was worthwhile until it was tested in a real-world crisis.
Second, we can be pretty certain that a large-scale risk mapping project would run into political resistance. BG&K rightly point out that the Dodd-Frank Act already provides the needed legal framework. The Act calls for establishment of an Office of Financial Research (OFR) within the Treasury Department, which, in turn, is tasked with providing research and information to the Financial Stability Oversight Council. The OFR even has subpoena power to require financial institutions to produce the data that it requests.
Legal authority or no, there would be resistance to the idea of using the OFR to undertake the vast task that BG&K propose. Given the tight-fistedness of the current Congress, just coming up with the money needed to staff and operate the exercise would be hard enough. Furthermore, not all of the component institutions of our financial system are as deeply in love with transparency as are critics in academia. One can easily imagine that every appointment and authorization would give rise to the same kind of trench warfare currently being waged over the Bureau of Consumer Financial Protection. That agency, also authorised by the Dodd-Frank Act, may very well end up stillborn, or if not, extensively re-engineered before it sees the light of day. The same thing could happen to the OFR if it were given too large an assignment.
Still, despite the technical and political hazards of a large scale risk mapping project, to give up on the idea in advance would be to admit that the financial system is in fact “unredeemably opaque.” If so, the alternatives are bleak. Regulators would then either have to abandon the concept of macroprudential regulation altogether and passively await the next crisis, or they would run a severely heightened risk that their regulatory initiatives would have harmful unintended consequences.