Adler on Data Governance
From archive: February 2009
DataGovernor
I've been watching this Madoff scandal unfold with incredulous amusement. Ponzi schemes aren't new, but I think the sheer size and scale of this one perfectly epitomizes the gap between financial innovation and regulatory oversight capacity.
Mr. Madoff was certainly a financial innovator. He figured out a way to get thousands of people to part with their money on the promise of consistently good returns, all without ever executing any investment transactions.
Am I the only one wondering how it is possible that an investment fund can take investor dollars, send out an investment prospectus, and never actually execute any trades, without a single regulatory authority connecting the dots between investment claims and results?
I think the answer is that many regulatory authorities have mandates to monitor these gaps, but none of them has the information infrastructure necessary to connect corporate actions, financial disclosures, regulatory filings, and exchange transactions.
While the Madoff scandal is stunning, one does wonder how many other, smaller scandals have not been discovered, or could have been discovered before all the investor money was lost. After Congress exults in a round of blame and recrimination, I do hope efforts are directed at enabling regulatory authorities to build a 21st Century Information-Driven Regulatory Infrastructure.
Madoff demonstrates the need; now the government needs to build it.
DataGovernor
The XBRL Risk Taxonomy work we've begun has the potential to reshape the dynamics of financial regulation. But XBRL is just a tool, not a solution. The solution that an XBRL Risk Taxonomy can enable is standardized loss reporting from financial institutions to regulatory authorities and back again.
What do I mean by this?
Every regulated financial institution would provide loss event and tail data to regulatory authorities in XBRL via the Risk Taxonomy. With thousands of institutions reporting losses on a regular basis, the repository would grow large quickly, though meaningful trend data would still take at minimum a few years to accumulate. Over time, this repository would not only provide regulatory authorities with a risk pulse on the financial system, but would also enable financial institutions to compare their own loss reporting to industry aggregates to improve trending and forecasting.

The key is to link this loss trending information to management decision cycles so that every decision point can be compared to past experiences and future forecasts. This is not to make already risk-averse executives hide beneath their desks, but rather to enlighten human decision-making with risk probability information at the point of action, record decisions and results, and constantly learn from mistakes and improve over time.
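To make that institution-versus-industry comparison concrete, here is a minimal sketch in Python of what benchmarking against a pooled loss repository might look like. The record fields, event types, and statistics are my own illustrative assumptions, not part of any draft taxonomy:

```python
from dataclasses import dataclass
from datetime import date
from collections import defaultdict
from statistics import mean

# A hypothetical loss event record. Field names are illustrative and
# are not drawn from the actual XBRL Risk Taxonomy work.
@dataclass
class LossEvent:
    institution: str   # reporting institution identifier
    event_type: str    # e.g. "internal_fraud", "system_failure"
    gross_loss: float  # loss amount in USD
    occurred: date     # date the loss event occurred

def mean_severity_by_type(events):
    """Mean gross loss per event type across the given reports."""
    buckets = defaultdict(list)
    for e in events:
        buckets[e.event_type].append(e.gross_loss)
    return {t: mean(losses) for t, losses in buckets.items()}

def compare_to_industry(events, institution):
    """Ratio of one institution's mean severity to the industry mean,
    per event type; a ratio above 1.0 means worse than the aggregate."""
    industry = mean_severity_by_type(events)
    own = mean_severity_by_type(e for e in events if e.institution == institution)
    return {t: own[t] / industry[t] for t in own}
```

A real repository would key events to the taxonomy's controlled vocabularies and carry far richer detail, but even this toy shape shows how pooled reporting turns one firm's losses into a benchmarkable signal.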
We humans do this on an intuitive level every day, but our best decisions are dependent on human memory and painful reminders of past individual failures. To combat systemic risk resulting from incremental action, human experience needs to be captured, profiled, and broadcast to more humans who may have an interest in regarding what others disregard. This creates autonomic opportunities as well as governance checks and balances.
The Insurance Services Office (ISO) performs this kind of loss data aggregation for a variety of insurance lines, and many insurance companies purchase this data via subscriptions to calculate insurance premiums and reserves. Sadly, despite the rapid rise of insurance-like hedging strategies on Wall Street, such as credit default swaps and portfolio insurance, no one is using traditional insurance products to cover operational exposures, and without loss history, insurance carriers can't price coverage or buy reinsurance.
With several years of loss data accumulated, it should be possible to create an open insurance exchange to underwrite the losses with insurance coverage. This would allow banks to transfer operational risks (which are most similar to professional liability exposures) off their balance sheets to insurance vehicles. The banks would pay a premium for the coverage, the market would price risk on a near-real-time basis, and regulators like the SEC could govern premiums and fair trade mechanisms. In some ways this would function like credit default swaps, but the trades would be on an open market, and rising risk in financial institutions would result in higher premiums, which in turn could be correlated with equity and bond markets to create additional incentive and penalty mechanisms for risk management.
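To illustrate the pricing mechanics, here is a toy frequency-severity calculation of the kind such an exchange could rerun continuously as new loss reports arrive. The loading factor and the observation window are illustrative assumptions on my part, not a proposal for actual rate-making:

```python
from statistics import mean

def indicated_annual_premium(loss_amounts, years_observed, loading=1.25):
    """Toy frequency-severity pricing: expected annual loss times a
    loading factor for expenses and risk margin. Illustrative only."""
    if not loss_amounts or years_observed <= 0:
        raise ValueError("need a loss history and a positive observation window")
    frequency = len(loss_amounts) / years_observed  # events per year
    severity = mean(loss_amounts)                   # mean loss per event
    return frequency * severity * loading

# Example: twelve operational losses (in USD) reported over three years.
losses = [2.1e6, 0.4e6, 8.7e6, 1.2e6, 0.9e6, 3.3e6,
          0.2e6, 5.5e6, 1.8e6, 0.6e6, 2.9e6, 4.1e6]
print(f"indicated premium: ${indicated_annual_premium(losses, 3):,.0f}")
```

The point is not the arithmetic, which any actuary knows, but that a standardized loss feed makes the recalculation continuous and market-visible instead of annual and private.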
I think this idea has enormous benefits for many market participants. Self-insuring risk without deep loss history is inherently inefficient capital allocation. In the insurance market, self-insurance is most practical when commercial coverage is unavailable or too expensive. In the banking world, banks have self-insured their own losses for decades without empirical risk measurement programs.
Today, of course, the taxpayer is providing catastrophic insurance coverage for banking failures, and that is the most inefficient coverage in the world!
A better model would price future risks based on past losses and make banks pay premiums for loss-producing behavior. The XBRL Risk Taxonomy can provide a data model to facilitate loss history aggregation, generating enough data for accurate underwriting. And when that information can be placed on an open market, banks would have a financial incentive to report losses, because the market would transfer those losses to insurance coverage and banks would have more capital for investments. At a time of capital constraints, this solution has something for everyone: market mechanisms, regulatory reform, and better capital allocation.
What's hard about pricing operational risk coverage is the long tail of losses. Traditional insurance policies, with their fixed durations, deductibles, and per-incident and aggregate coverage limits, won't scale to the volume of loss events and the severity tail. An exchange, however, can price large volumes of loss events and tail growth in near-real-time, providing both incentives and penalties for poor risk management in firms that transfer risk via the exchange. That in turn will transform loss reporting from the cat-and-mouse game it is within firms today into a business necessity, because every unreported loss is a balance-sheet deduction in capital allocation that the market will penalize severely when it is reported late.
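A flat average like the one in the premium sketch above is exactly what the tail breaks. One standard way to watch the tail, sketched below under my own illustrative assumptions, is to track the empirical exceedance probability and mean excess over a threshold; a mean excess that keeps growing as the threshold rises is the signature of a heavy severity tail that fixed-limit policies can't absorb:

```python
from statistics import mean

def tail_metrics(loss_amounts, threshold):
    """Empirical tail statistics above a threshold: how often losses
    exceed it, and by how much on average when they do."""
    exceedances = [x - threshold for x in loss_amounts if x > threshold]
    if not exceedances:
        return {"exceedance_prob": 0.0, "mean_excess": 0.0}
    return {
        "exceedance_prob": len(exceedances) / len(loss_amounts),
        "mean_excess": mean(exceedances),
    }

# Recomputing these at several thresholds as new reports arrive gives an
# exchange a live picture of tail growth that fixed policy limits hide.
```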
Turning this solution into reality will require a new Risk Information Management infrastructure in financial institutions, regulatory authorities, and market exchange mechanisms. It will depend on a common data model, standard risk measurement reporting processes and technologies, and cultural changes on Wall Street. This is why we are starting with an XBRL Risk Taxonomy to standardize loss reporting.
We can't create solutions like these overnight, but by starting with common reporting standards we can inspire a 21st Century infrastructure that regulators can build upon to enable risk analysis and oversight at nearly the same speed at which market participants create risk.
I'm hosting a meeting on these topics on February 26-27 at the Levin Institute in NY. More information about this meeting can be found here: