Yale School of Management

Program on Financial Stability

Improving our understanding and management of systemic risk.

Fixing Financial Data to Assess Systemic Risk

December 2, 2020
By Greg Feldberg

This post is drawn from a report produced by the Brookings Center on Regulation and Markets. The original report can be accessed here.


Access our live “financial intervention” tracker to keep up with the latest crisis-fighting interventions by central banks, fiscal authorities, and international organizations here.

The coronavirus-induced market stress in March provided new evidence that the public and the authorities still don’t have the data they need to track and analyze risks in the financial system. Authorities were unable to answer basic questions as markets spun out of control. Who was selling billions of dollars’ worth of U.S. Treasuries, which many believe are the safest assets in the world? Who had too much short-term leverage in repurchase agreements (repos)? Who was exposed indirectly through their debtors and counterparties?

Answers to these questions would have been useful. But the Treasury market, the most liquid fixed-income market in the world, remains surprisingly opaque. Broker-dealers now report their transactions in the market to the authorities, but banks still don’t, and the authorities share little information with the public. Data remain limited on the uncleared bilateral segment of the repo market and on the activities of hedge funds, whose selling of Treasuries was significant but difficult to evaluate with existing data.

To be sure, the Federal Reserve quickly restored confidence this spring. But it did so through massive credit operations and a commitment to buy securities, such as corporate bonds and exchange-traded funds, that it had never bought before, not even during the 2007-09 financial crisis. Better information might have allowed for a more targeted response in March and a more fruitful assessment afterward.

It wasn’t supposed to be this way. Regulators have much more data now. Since the last crisis, they have pulled derivatives trading out of the shadows; introduced reporting for hedge funds, private equity funds, and money market funds; and asked a lot more of banks.

Congress created the Office of Financial Research (OFR) in 2010 to identify risks and fill blind spots so regulators would have a broader view of “who owes what to whom” across the financial system. The OFR has mandates to improve the quality of data collected, promote data-sharing, and improve public disclosure. It has subpoena power to collect data from financial companies to enforce these authorities. It is also expected to conduct cutting-edge research and create models and monitoring tools that it and the regulators can use to identify potential systemic risks.

But, despite these initiatives, financial data today remain incomplete and often not fit for purpose. Legacy data-collection technologies, old-school thinking, and bureaucratic turf fights continue to hinder the authorities’ ability to monitor systemic risks. Moreover, U.S. regulators have fallen behind the private sector and many of their peers overseas in the adoption of technologies that could revolutionize the collection, management, sharing, and dissemination of financial data.

This paper first describes the unique challenges that financial data present. It then describes a strategy to bring financial data into the 21st century. That strategy would set a timeline for identifying and closing data gaps; improving standards; sharing data, both among authorities and with the public; and accelerating the adoption of new technologies. Finally, it describes the role the OFR should play in driving that strategy, working closely with regulators on the Financial Stability Oversight Council (FSOC).

To implement this strategy, the new Biden administration will have to first remove the roadblocks that have gotten in the OFR’s way for the past 10 years. Those roadblocks include a lack of support from Treasury, where it sits organizationally; sometimes aggressive undermining by private industry and even other FSOC member agencies; and defunding and silencing under the Trump administration.

The presidential transition period provides a unique opportunity to remove such roadblocks in the vetting process for heads of regulatory agencies. The administration should make sure every appointee understands that financial regulatory data management is broken and FSOC member agencies have the responsibility to fix it. Every FSOC Principal should also know that the Evidence Act of 2018 requires them to appoint a Chief Data Officer (CDO) and draft a data strategy. Most importantly, the Treasury Secretary, as chair of FSOC, needs to unify the regulators to support an independent OFR and head off the turf issues that will inevitably arise as it executes its mandates.

There are reasons for cautious optimism that the OFR can lead FSOC toward common data goals. Most FSOC member agencies have now appointed CDOs, each tasked with championing better use of data. Recent laws require federal agencies to improve their collection, management, and dissemination of data. Some agencies have already taken steps to improve the data they collect: the Federal Deposit Insurance Corporation launched a competition to modernize the century-old call report that banks file, and the Commodity Futures Trading Commission recently took action to fix derivatives data.

In short, the elements are in place. We still need leadership. This paper argues that the new administration should make it a priority to fix financial regulatory data, starting during the transition. The incoming administration should, first, emphasize data when vetting candidates for top financial regulatory positions. Every agency head should recognize the problem and the roles they must play in the solution. They should recognize how the Evidence Act of 2018 and other recent legislation help define those roles.

Read the complete recommendations here.
