U.S. regulators lack sufficient data standards ahead of next financial crisis – ex-Treasury official

NEW YORK (Thomson Reuters Regulatory Intelligence) - The U.S. financial system has failed since the 2008 financial crisis to sufficiently improve the standards and quality of data used by regulators and institutions to identify trouble spots, a former senior U.S. Treasury official argues in a new study.


“Because the next crisis is unlikely to replicate those of the past, we do not know in advance what information we will need,” writes Richard Berner, former head of the Treasury’s Office of Financial Research, the unit created in the aftermath of the crisis to improve financial data quality and standardization.

“Data standardization is critical to that undertaking. This is one of the core lessons of 2008 that has only been partially realized thus far,” adds Berner, who is now at New York University’s Stern School of Business.


A growing number of experts have pointed to the explosive growth of financial products outside the traditional depository sector of the economy as a source of concern. From an oversight perspective, regulators lack sufficient insight into the many linkages between the heavily regulated banking sector and so-called “shadow banking.”

This is particularly the case in the rapid growth of leveraged lending, which has been driven in part by institutional investors seeking higher-yielding investments. The borrowing, meanwhile, has come from lower-credit-quality companies that have been turned away by traditional bank lenders, as well as private equity firms that have used such funds to finance company takeovers.

In their study, “The Data Standardization Challenge,” Berner and Kathryn Judge of Columbia Law School cite numerous reasons for the lack of progress in improving data quality and standardization, not least the fragmented regulatory structure in the United States when compared to the European Union.

“In the United States, the balkanized regulatory structure is an obstacle to adoption of data standards. Regulators are free to use whatever mode of collection they choose,” write Berner and Judge.

Adding to the challenge, there is no single authority that can compel regulatory agencies to use a particular standard. “The Financial Stability Oversight Council, which was created to solve collective action problems among regulators, can make recommendations but it has no authority to require member agencies to adopt data standards,” says the study.

The authors also note that while Treasury’s Office of Financial Research can set standards in data it collects or that member agencies collect on behalf of the Council, “it has no authority to require their use for other data those agencies collect.”


In sharp contrast to the United States, EU regulators have made greater progress on the data front, in particular in the use of Legal Entity Identifiers (LEI), a 20-character alphanumeric code that clearly and uniquely identifies parties to financial transactions: specifically, the legal entities within companies participating in global financial markets.
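The LEI standard (ISO 17442) ends each code with two check digits computed under ISO 7064 MOD 97-10, the same scheme used for IBANs: letters map to two-digit numbers (A=10 through Z=35) and the full 20-character string, read as one integer, must leave a remainder of 1 when divided by 97. A minimal sketch of that checksum, assuming the standard IBAN-style conversion; the 18-character prefix in any test is hypothetical, not a real registered entity:

```python
# Sketch: LEI check-digit math per ISO 17442 / ISO 7064 (MOD 97-10).
import string

_ALLOWED = set(string.digits + string.ascii_uppercase)

def _to_digits(s: str) -> str:
    """Map letters A=10 ... Z=35 to two digits; decimal digits pass through."""
    return "".join(str(int(c, 36)) for c in s)

def lei_check_digits(prefix18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    assert len(prefix18) == 18 and set(prefix18) <= _ALLOWED
    return f"{98 - int(_to_digits(prefix18 + '00')) % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """A 20-character LEI is well-formed when its digit form is 1 mod 97."""
    if len(lei) != 20 or not set(lei) <= _ALLOWED:
        return False
    return int(_to_digits(lei)) % 97 == 1
```

The checksum only catches transcription errors; confirming that a code is actually registered to a given entity requires checking it against the Global LEI Foundation's published records.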

The authors note that while the adoption of LEIs in derivatives has been encouraging, progress in other markets has been slow and incomplete. The EU, on the other hand, has driven LEI usage across many markets.

“In Europe, the situation is the opposite: Despite the fragmentation resulting from 28 sovereigns in the EU, (European Securities and Markets Authority) and other pan-European regulators have required the use of the LEI in all but a handful of reporting requirements,” says the study.

“That Europe has moved more quickly than the United States to impose a broadly applicable requirement that parties use the LEI highlights the challenge of entrenched interests and the way newcomers can leapfrog over incumbents,” the researchers added.


Another problem in the U.S. has been costs. While better information benefits both regulators and market participants, the path to get there is expensive and requires everyone to share in the burden. However, as a public good, improved data standards suffer from the classic “free-rider” problem, where many benefit while few bear the costs.

In addition, with so many stakeholders – government agencies, regulatory bodies, banks, brokers, etc. – it is hard to get everyone to move in harmony. The coordination problem is huge, say Berner and Judge.

“Effectively implementing data standardization requires groups who think about issues in slightly different ways (business leaders, risk managers, legal officers, data scientists, and financial economists) to understand how others see the world. They must find vocabulary and figure out collectively how to devise standards that can serve related but not identical aims,” says the study.

The authors argue that more urgent work is needed to fill the various gaps in understanding financial markets, particularly given the speed of innovation.

“As the pace of finance continues to reach ever more dizzying speeds, the value of high-quality information and the threats posed by information gaps continue to grow,” say Berner and Judge. “Given the myriad frictions that stand in the way of optimal policy, leadership, creativity, and a willingness to look to the future, and work across firm, industry, and national bounds is critical to success.”


(Henry Engler is a North American Regulatory Intelligence Editor for Thomson Reuters Regulatory Intelligence.)

This article was produced by Thomson Reuters Regulatory Intelligence and initially posted on Feb. 15. Regulatory Intelligence provides a single source for regulatory news, analysis, rules and developments, with global coverage of more than 400 regulators and exchanges. Follow Regulatory Intelligence compliance news on Twitter: @thomsonreuters