Adler on Data Governance
On September 14, David Bogoslaw published an article in BusinessWeek entitled "How Banks Should Manage Risk." Rick Bookstaber and I are quoted in the article; we first spoke with David following the XBRL Risk Taxonomy Meeting I hosted at the Levin Institute in New York on May 13, and we had follow-up interviews two weeks ago. As is the case with any press interview, some of what you say gets printed and a lot doesn't. In this case, I think much of the substance of what I told David was out of scope for the BusinessWeek audience and the goals of his article.
In terms of a banking audience, David gets it all right, and I agree with Rick Bookstaber's comments too. But what the article omits is the fact that from 1999 to July of 2008 the US Congress, the White House, FHA, the SEC, and the US Federal Reserve all participated in an industry-backed weakening of the financial regulatory framework that was built in the 1930s. In 1999, the Financial Services Modernization Act (named Gramm-Leach-Bliley, or GLBA for short, after its authors) removed 70-year-old restrictions on bank, investment bank, and insurance cross-ownership. At the same time, derivative market oversight was specifically excluded from GLBA, and financial markets were allowed to create and trade complex derivative instruments without regulatory reporting or control.
In 2001, President Bush exhorted Americans to "go shopping" to support the US economy following 9/11, and the Federal Reserve obliged by cutting interest rates down to 1% to pump liquidity into the US market. In 2004, Congress lobbied Fannie Mae and Freddie Mac to relax underwriting guidelines on home loans to allow sub-prime borrowers to participate in "The American Dream" of owning a home, and FHA provided loan subsidies to make it easier. In 2006, Congress pressured the same GSEs to relax underwriting on Alt-A mortgages, allowing self-employed individuals to declare their income with a signed affidavit instead of documenting their income through tax filings. As I've written in past blogs, that change gave license to mortgage fraud across the country, as Alt-A borrowers could make wild income declarations without validation, and it pumped tens of thousands of fraudulent mortgages into the global financial system. The change wasn't reversed until July 2008, when the Federal Reserve finally changed Alt-A underwriting guidelines. The long tail of the bad mortgages underwritten from 2006 to 2008 means we will suffer significant foreclosure rates well into 2011, extending the depth and breadth of this recession.
2006 proved to be the top of the Housing Market in terms of house valuations and bank fees generated from loan securitization and derivative markup. The pile-on legislation and market encouragement from Congress, the White House, and the Federal Reserve came from industry pressure to keep the party going as long as possible.
Yes, banks took on too much risk from 2001 to 2007. But the US Government encouraged and enabled excessive risk taking during that period, and both need to be monitored to prevent future crises. There is an inherent conflict of interest in expecting the government that enabled the current credit crisis to participate in the forecasting and prevention of the next one.
There is a history of financial de-regulation followed by marked innovation and crash that goes back 100 years in the US. The innovation generates enormous wealth on Wall Street and new tax revenues for Federal, State, and Local Governments. The relationship between government enablement and financial innovation was omitted from David's account and needs closer scrutiny, because policy-makers and the public will need new information management tools to understand the long-term impact of incremental policy decisions on financial market performance if they are to regulate wisely in the future.
In the article, I recommended that the government create a new Regulatory Information Architecture, modeled on the Information Sharing Councils created by the Bush Administration for terrorism intelligence gathering following the 9/11 Commission Report and the Intelligence Reform and Terrorism Prevention Act (IRTPA) of 2004. But more is needed.
A year ago, I believed that new information technology and data collection would enable the US Government to better analyze the performance of financial markets and forecast potential bubbles and crises. I'm sure that enhanced information sharing in the US Government will enable better regulatory enforcement, but it's not enough to prevent future crises. The public needs to play a role in the oversight process because the Government has its own interests, which are not always perfectly aligned with those of the public. Administrations change, and with those changes come new philosophies of governing and regulation, and in a democracy like ours you always want to enable some to regard and report the information that others disregard or deny.
Therefore, what's needed is more information transparency about market holdings and the actions of market participants so that anyone in any firm, university, or industry watchdog can analyze nearly the same macro and micro economic data that federal regulators observe and make their own forecasts and predictions.
Without public access to better market data, we are just enabling government to encourage risk taking more efficiently in the future.
You can read the BusinessWeek article here: http://www.businessweek.com/print/investor/content/sep2009/pi20090914_336015.htm
Last week I hosted a Data Governance Executive Breakfast for 20 CIOs in Warsaw, Poland. It was my first trip to a former Iron Curtain capital, and I expected a concrete grid of grim apartment complexes and monumental communist office architecture. Instead, I found a lovely city still working hard, and succeeding, to erase the scars of Nazi occupation and annihilation and 50 years of communist oppression. Warsaw today is a gem of a city, with warm and friendly people, beautiful architecture, an eager business atmosphere, and a deep, historically rich intellectual tradition.
My one day in Warsaw was graced with gorgeous weather and a terrific morning event that combined both Data Governance content and XBRL. My partner in the Breakfast presentation was Michal Pienchofsky from Business Reporting AG, a Data Governance Council Member specializing in XBRL consulting who is based in Warsaw. Michal gave a terrific presentation linking Data Governance goals and structures to XBRL taxonomies, regulatory compliance, and business optimization.
After the event, I met an old family friend who lives in Warsaw. Stacy is the father of my brother-in-law, and in the summer of 1944, at the age of 16, Stacy joined the Warsaw Uprising and fought against the Nazis. It was a valiant and tragic effort that for two months engaged German units in a bloody campaign to win back the Polish capital. The effort was largely unassisted by both the Americans and the Soviets - who were actually sitting outside the city, some 11 miles away, and waited for the Germans to mop up the resistance before themselves liberating what was left of Warsaw: rubble.
This summer marks the 65th anniversary of the Warsaw Uprising, and Stacy took me on an uprising tour of Warsaw, showing me the manhole cover where he entered the sewer to cross the city underground and evade Nazi patrols, the intersection where his Gozdawa Battalion set up a barricade, the churches where Nazi tanks hid in waiting, and the many walls where bullet holes and plaques still mark the spots where thousands of Polish civilians were executed by the Nazis in reprisal for the uprising.
We visited the Uprising Museum, which is a fascinating and well done museum documenting the events of the uprising. They have the B-25 that the Polish Government in exile used to send supplies to the resistance fighters, replicas of the sewer pipes that you can walk and crawl through to get an idea of what it was like - without the sewage - and many photos detailing the grim battle and the utter destruction of Warsaw afterwards. The Nazis leveled the city after the uprising was crushed as an example to any other nation that wanted to rise up against their tyrannical rule. Not one building, not one facade even, was left standing in the city.
The lovely inner city that one sees in Warsaw today was completely rebuilt by the Communists after the war. I've been to Prague many times, where it is often remarked that the old city was preserved after the war because the Communists didn't have the money to put up new buildings. I think Warsaw demonstrates the lie of that assumption. Communists obviously love good architecture and cultural heritage as much as Capitalists do, because they did a marvelous job restoring Warsaw to some of its pre-war splendor. There are still many sites outside the inner city where scars from WWII are visible. I haven't seen that in other WWII-stricken cities, like Hamburg, which was 80% destroyed by Allied bombs. Just across the street from the Hilton Hotel where I stayed there were empty lots and the war ruins of buildings, which is quite amazing in the 21st Century.
But in the 20 years since the Iron Curtain came down, there have already been many modern changes to Warsaw, and I can well imagine that this city, with its great people, hunger for innovation, and rich traditions, will regain its former glory as a great city in the 21st Century.
I posted some photos I took while in Warsaw on Picasaweb. Have a look if you are interested:
It was a great trip business-wise and it certainly demonstrated the resilience of the human spirit even under the most barbaric forms of oppression.
Frameworks freeze you in the past, by forcing you to interpret the present based on rigid formulas, interpretations, and even misconstructions. In 2007, the IBM Data Governance Council finished its Data Governance Maturity Model. Looking at all its imitations in the market, one could conclude that it has been remarkably successful.
However, as a benchmark of relative organizational maturity - and not just data management processes - I think its time has passed, and I'm working on new ideas.
Data=Information=Knowledge. Or so we would like to say. In theory, data is unorganized information, and knowledge is information put to use by human beings. But theories are for academics. And this theory is super convenient if semantic consistency is important. There are Data Architects who only think about data in databases, Information and Content Architects who only work with unstructured repositories, and even Knowledge Architects who I suppose work with information taken out of human brains and put into... structured or unstructured repositories on computers...
In real life, in real companies, these are artificial distinctions. Organizations want to control data/information supply chains because they are full of quality control problems, security vulnerabilities, compliance challenges, and operational exposures. Those risks imperil decision-making, increase operational costs, and reduce revenue opportunities. Quality control and risk mitigation are challenges for every data type.
Five years ago, "Data Governance" seemed like a great name for a new discipline to help transform organizational behavior from vertical to horizontal; because information is transformational. What we meant then and mean now is not just about "Data" in the purest structured sense. We mean Data in the most plural and unlimited sense. People want to govern other people's use of all kinds of information in every form.
No data stovepipes please! We need Data Governance Solutions for all human uses of information regardless of their form or structure, use or abuse.
Anyone who tells you different is just so 20th Century...
On October 7-9, I will be hosting a conference on The Future of Data Governance at the Mohonk Mountain House (www.mohonk.com) in New Paltz, NY. This event has been designed to explore the challenges and solutions of Data Governance organizations constantly ask about:
1. How do I transform data into an asset? Data isn't an asset until you make it one, and it's not an asset like gold, stocks, or oil. Those assets have commodity values based on their scarcity and demand. Data is an asset with infinite availability, so its value can't be based on the amount you own or the amount someone wants. The value of data is purely perceptual, unless there is a market for that data. iTunes, DVDs, newspapers, and cable TV are all examples of data with values based on market demand through external sales channels.

Many organizations in the Data Governance Council have been successful in creating information assets, protecting them from risks, and organizing x-functional participation in Data Governance Councils. And they have achieved some stunning results.

But internally, we have no market for data sales. So the best we can do within an enterprise is increase the perceptual value of data as an asset. Data has perceptual value to the Business when IT can demonstrate incremental revenue obtained through data consolidation, aggregation, cleansing, business intelligence, and new sales.
Five years ago, Mohonk was the venue where I hosted our very first Data Governance event. Back then we organized three tracks to focus on Policy, Content, and Infrastructure questions. We had a lot of questions and ran each track as an interactive forum to frame common issues, understand the dimensions of Data Governance, and identify convergent areas our customers wanted to explore. We had long discussions about data supply chains, policies and rules, metadata and data classification, security and risk. The dialog was extremely interactive, and coming out of that meeting there were many who wanted to continue. That was the genesis of the IBM Data Governance Council.
We knew then that Data Governance would become an important field. Some early visionaries like Robert Garigue from Bell Canada, Christa Menke-Suedbeck from Deutsche Bank, Charlie Miller from Merrill Lynch, Ed Keck from Key Bank, and Richard Livesley from Bank of Montreal helped us all to see the dimensions of the emergent market. And it was those leaders who helped to shape the Data Governance Council Maturity Model, which in turn helped define the elements of the Data Governance marketplace.
Of course, what we couldn't see then is how failures in Data Governance would threaten the world economy itself. The Credit Crisis was caused by incremental policy failures in almost every stage of the mortgage data supply chain. Loose credit led to bad home loan underwriting decisions, which were masked by rising home values. Huge fees in MBS and CDO trading led to inside deals between credit rating agencies and banks, and vast amounts of poorly documented mortgages came to be regarded as Tier 1 assets on balance sheets around the world. These instruments were insured by complex derivatives traded without clearinghouses, creating interconnected obligations among the largest banks with huge exposures should any one of them fail.
The media has focused on the wide segment of the funnel, the derivative market failure. Credit Default Swaps in this market had a notional exposure exceeding $100 trillion. But the failure occurred within a supply chain: poor underwriting standards in loan origination from 2005 to 2008 continue to pollute banks with Toxic Assets, and the long tail of mortgage foreclosure haunts our economy. Our mortgage market remains heavily discredited around the world, and new Data Governance solutions are needed to restore investor confidence in the US Mortgage Market.
I've been working with a range of policy-makers and thought leaders on providing concrete solutions to those challenges, and I will host a round-table discussion on US Housing Data as a use case example on the value of data, the terrible risks that can still plague our economy from data pollution in that supply chain, and the concrete steps that can be taken now to address these issues.
I think this conference will be thought-provoking and practical. The market is looking for Data Governance solutions. Not just know-how and not just software. But know-how and software and examples of how to apply them. That's what we'll do, and I hope you can join us. I think it will be the best Data Governance Conference ever. The venue is fantastic, the room rate unbelievable, and the conference fee is a true bargain.
This agenda will continue to evolve, so come back often for updates.
Directions to Mohonk
Tim Geithner is on Capitol Hill today asking Congress to provide regulators with new powers to control the derivative markets. He claims that derivatives blindsided the Administration and nearly destroyed the world's economy. Congress, by all accounts, seems willing to provide these new powers to both the SEC and the CFTC, which will include the power to collect positional data from key broker-dealers and enforce positional limits on trading to constrain bubble formation. These are good ideas. But these powers alone won't fix the current problems in our economy or prevent future financial catastrophes. They are at best addressing a symptom of the credit crisis, not the source problem.
The source problem was bad home loan mortgage underwriting, and those bad loans continue to produce 12% foreclosure rates that have not abated. That problem was the result of misguided policy mistakes by Congress and FHA in 2004-2006 that were not corrected by the Federal Reserve until July of 2008. The tail of those bad loans still haunts our mortgage market. The non-GSE mortgage market today is effectively dead. Anything without a federal insurance program isn't being underwritten, and none of the policies of the Obama Administration have changed the nasty state of foreclosure nationwide.
Rising unemployment across the country is adding to the delinquency and foreclosure rates. Tinkering with the derivatives market is a nice sideshow. But until the Obama Administration gets serious about mortgage reform, it is only addressing the symptoms of our problems, not the core.
Last week, I became a victim of toxic content. It can happen so fast, without warning. My sister, a trusted source, forwarded two photos that purported to show the Air France flight breaking in half before it fell from the sky into the Atlantic off the coast of Brazil. A caption said the photos had been taken by a passenger, and that while the camera had been destroyed in the crash, the memory stick was recovered. Even the photographer's name had supposedly been discovered by tracing the serial number of the camera. One photo showed passengers with air masks on, a gaping hole in the midsection of the plane, and the tail section falling away. The second photo showed a man being sucked out through the open hole.
They were immediately shocking photos, all the more so to me because two of my students from my Data Governance course at the Bucerius Law School died on that flight. Alexander Crolow and Julia Schmidt were two bright young students from Germany and Brazil who had traveled to Brazil to tell Julia's parents of their plan to marry and were returning to Germany that night to tell Alex's parents. An event like the Air France crash is transformative when you know someone who was on board.
But alas, the photos were fake. They were taken from the TV show Lost and sent around the world in an email. Bolivian TV even showed them on the air before discovering the fakery. But by then the damage had been done. For so many people around the world wondering how their loved ones perished on that plane, the photos offered a chilling illustration. We should have recognized the forgery at the outset, since the plane crashed at night and the photos showed bright daylight through the hole. But critical thinking disappears quickly when you are emotionally involved. And of course on the internet any trusted source can inadvertently become a conduit for toxic content. Thus knowing the source of your content is not enough to establish trusted information. You need to verify by corroborating the content with another source to establish veracity.
In the 21st Century everyone has to be a journalist.
While academics contort over the rise of successful bank lobbying on Capitol Hill, Senator Jack Reed has introduced the Rating Accountability and Transparency Enhancement (RATE) Act of 2009, which "would provide new oversight and transparency rules for Credit Rating Agencies." This is a serious bill with excellent ideas that will do more to correct one area of abuse in the credit crisis than many other current proposals. Credit rating should be transparent so that market participants can validate rating methods and the SEC can provide oversight and audit over problems and failures.
RATE includes further strengthening of existing regulatory structures, with new authorities provided to the SEC. But the important component here is the new rating disclosure requirements, which would make the methods credit rating agencies use to rate bonds, MBS, CDOs, and other derivatives transparent and auditable. I also like the proposal for a new independent Compliance Officer, a power long overdue in ALL corporations.
SUMMARY: The Rating Accountability and Transparency Enhancement (RATE) Act of 2009 (http://reed.senate.gov/newsroom/details.cfm?id=313172)
The bill strengthens the Securities and Exchange Commission’s (SEC) oversight of Nationally Recognized Statistical Rating Organizations (NRSROs) through enhanced disclosure and improved oversight of conflicts of interest, and makes credit rating firms more accountable through greater legal liability.
Accountability of NRSROs
• Holds NRSROs liable when it can be proved that they knowingly failed to review factual elements for determining a rating based on their methodology or failed to reasonably verify that factual information.
• Requires the SEC to explore alternative means of NRSRO compensation, and requires a Government Accountability Office study on payment methods, in order to create incentives for greater accuracy.
• Establishes an office in the SEC to coordinate activities for regulating NRSROs.
• Directs the SEC to ensure that NRSRO methodologies follow internal NRSRO guidelines and requirements for accuracy and freedom from conflicts of interest.
Due Diligence Certification
• Requires certification if due diligence services are used to ensure that appropriate and comprehensive information was received by the NRSRO for an accurate rating.
• Requires NRSROs to notify users when model or methodology changes occur that could impact the rating, and to apply the changes to the rating promptly.
• Requires the SEC to establish a form for NRSROs to provide disclosures on ratings, including methodological assumptions, fees collected from the issuer, and factors that could change the rating.
• Requires NRSROs to provide rating performance information, such as information on the frequency of rating changes over time.
Conflicts of Interests
• Requires NRSROs to have an independent compliance officer to manage conflicts of interest and independently review policies and procedures governing ratings so they are free from conflicts.
• Requires the SEC to regularly review NRSRO conflict of interest guidelines.
• Creates a look-back provision requiring that if an NRSRO employee later becomes employed by an issuer, the NRSRO must review any ratings that the employee participated in over the previous year to identify and remedy any conflicts of interest; and provides for SEC reviews of NRSRO look-back policies and their implementation.
I see this bill as another indication that financial regulatory reform will fix underlaps and gaps in existing authority rather than build a new systemic risk regulatory institution.
Agriculture is not the first word that comes to mind when contemplating systemic risk regulation, but the Senate Agriculture, Nutrition, and Forestry Committee was the gladiatorial arena for systemic risk regulation of derivatives last week. Agricultural commodities are traded on the Chicago Mercantile Exchange, the Commodity Futures Trading Commission (CFTC) regulates commodities trading, and the Senate Agriculture Committee oversees the CFTC. A week ago, the Senate completed nomination hearings for Gary Gensler, the new CFTC Chairman. Gary's nomination was approved unanimously by the committee, and his participation in last week's hearings on "Regulatory Reform and the Derivatives Market" came on his 8th day on the job. But judging by his testimony, it is easy to see why both Democrats and Republicans love him. He's smooth, diplomatic, and combines left and right positions in the same sentence. Other expert testimony came from:
Ms. Lynn Stout
UCLA School of Law
Los Angeles, CA
Mr. Mark Lenczowski
J.P. Morgan Chase & Co.
Dr. Richard Bookstaber
New York, NY
Mr. David Dines
Cargill Risk Management
Mr. Michael Masters
Masters Capital Management, LLC
St. Croix, USVI
Mr. Daniel A. Driscoll
Executive Vice President and Chief Operating Officer
National Futures Association
Lynn Stout and Michael Masters presented populist, anti-establishment arguments for regulatory reform. Mr. Masters has impressed me in the past with his presentations on derivative markets, and in his testimony he pushed hard for notional derivative clearing and exchange trading. Mark Lenczowski and David Dines toed the bank party line on the need for choice in derivative markets, the complexity of the OTC market, and the extra costs that standardization of derivatives would add to transactions. Rick Bookstaber made some reasoned and logical remarks about how easy it would be to standardize derivative trading and why it would be desirable to put it on an exchange. He said that the opacity of derivatives makes them the weapon of choice for gaming the regulatory system, and that banks use them to achieve investment goals that hide leverage, skirt taxes, and obfuscate investor advantage.
The key battle positions now are:
Conservative: Leave things as they are, with greater capital and margin requirements and some transactional reporting. The banks contend that exchange trading is an option in today's market but that customers should decide whether they want to buy derivatives on exchanges or via OTC. Banks already face capital and margin requirements on derivative trading, so new limits would largely impact non-bank derivative market players. An enhanced status quo seems unlikely, and I think the banks know this and thus are taking this position as a negotiating tactic to limit the Moderate choice.
Moderate: Force derivative trading into clearing houses, require capital and margin requirements, set new position limits on holdings, and use TRACE to track market transactions. This is the essence of the Geithner proposal, and Mr. Gensler espoused this position eloquently. I also believe that the banks are comfortable with this solution, because they created the clearing houses and have enormous influence there. The new capital and margin requirements would mainly benefit the 14 primary broker-dealers, and if the banks are going to give up some opacity through clearing houses, they want at least to ensure a cartel status for derivative dealing. Because Gensler and Geithner are already on board with this, and bank lobbyists are behind their support, I see the moderate option as the most likely.
Liberal: Force derivative trading onto an open exchange in which all transactional volume, price discovery, bid/ask, etc. is fully transparent. This option creates the greatest market efficiencies and allows any dealer of any size to participate in a very liquid and open derivative market. In the beginning, there would be some semantic challenges in packaging bespoke derivatives into mass-customized and standardized products. But the data models and technology exist to perform these data gymnastics, and the industry would, over time, become adept at providing customized derivative products in standard offerings. On an exchange, it is harder for banks to game the system, and the benefits of derivative trading are more widely shared. Thus, banks want to avoid this. Unless Obama comes out in favor of exchanges, I see the Liberal option falling to the bank cartel.
The challenge with any of these scenarios is enforcing positional limits. CFTC, and the Senators, want the regulatory power to impose position limits. This would entail positional reporting and some kind of kick-back function at the clearing house or exchange to limit registered broker/dealer transactions. But the technical solution has some complexities not obvious to the untrained senatorial eye...
A derivative position is not the same as an equity position. When I own two shares of IBM Stock, they are two units of the same instance. When I own two XYZ currency swaps with the same maturity date, they are two instances of the same unit, and they may also have other characteristics that make them different. It is not possible to add up all the derivative units at the end of the day and compare them in the same way as you might with equities. You have to record each transaction and tally up the common elements, and then you need to analyze all the composite positions to determine what they mean.
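To make that concrete, here is a minimal sketch of end-of-day tallying. The trade records and field names are entirely illustrative, not a real reporting schema; the point is only that derivative positions have to be grouped by their common terms before anything can be summed or compared:

```python
from collections import defaultdict

# Hypothetical end-of-day trade records; the field names are illustrative only.
trades = [
    {"type": "currency_swap", "pair": "USD/EUR", "maturity": "2012-06-30",
     "direction": "pay_fixed", "notional": 10_000_000},
    {"type": "currency_swap", "pair": "USD/EUR", "maturity": "2012-06-30",
     "direction": "pay_fixed", "notional": 25_000_000},
    {"type": "currency_swap", "pair": "USD/EUR", "maturity": "2012-06-30",
     "direction": "receive_fixed", "notional": 5_000_000},
]

# Unlike equity shares, these contracts cannot simply be counted as units:
# each transaction is recorded, then tallied by the terms that make contracts
# comparable, and the resulting composite positions still need interpretation.
positions = defaultdict(int)
for trade in trades:
    key = (trade["type"], trade["pair"], trade["maturity"], trade["direction"])
    positions[key] += trade["notional"]

for key, notional in sorted(positions.items()):
    print(key, notional)
```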
One important thing that all the panelists missed is the fact that it is not possible to standardize derivative products, per se. It is the components and their semantic definitions that can and must be standardized. That is, a Chevy and a Ford are both cars, but they are different types of cars. Yet both have standardized components (often made by the same parts suppliers) that make them subject to classification and their functions interchangeable. We need the same kind of classification of derivative components, so that every buyer and seller can set the features they want for the financial goals they have.
By standardizing derivative components, and plugging them into a configuration engine, it will be possible for an exchange to offer customizable derivative products to any buyer and seller in the same way banks do today via the OTC market. The conditions may vary, but the components will be interchangeable. This is the dirty little secret banks don't want anyone to know. Because when exchanges can offer mass-customized derivative products, the huge transactional fees that banks derive from the opacity of risk will evaporate...
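As a thought experiment, here is what a toy configuration engine might look like. The component catalog and product features below are invented for illustration; the idea is simply that a bespoke product becomes a particular combination of standardized, semantically defined parts that an exchange could validate, compare, and price:

```python
# Illustrative only: a toy catalog of standardized derivative components.
COMPONENT_CATALOG = {
    "underlying": {"USD/EUR", "USD/JPY", "WTI_CRUDE"},
    "payoff":     {"swap", "forward", "vanilla_option"},
    "day_count":  {"ACT/360", "30/360", "ACT/365"},
    "settlement": {"physical", "cash"},
}

def configure_product(**features):
    """Assemble a 'bespoke' product from standardized components only."""
    for name, value in features.items():
        allowed = COMPONENT_CATALOG.get(name)
        if allowed is None or value not in allowed:
            raise ValueError(f"non-standard component: {name}={value!r}")
    return dict(features)

# Two counterparties (or an exchange) can compare and price this product
# consistently because every feature maps to a shared, standard definition.
product = configure_product(underlying="USD/EUR", payoff="swap",
                            day_count="ACT/360", settlement="cash")
print(product)
```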
A few months ago, the big talk in DC, NY, and among academic circles was that the CFTC would get merged into the SEC, and that the Fed would assume responsibility as the systemic risk regulator. I think that talk is now dead.
Last week, Mr. Harkin, Chairman of the Committee, and Mr. Chambliss, the ranking Republican, repeatedly asked Mr. Gensler about his resource requirements for regulating derivatives at the CFTC. Mr. Gensler mentioned that the CFTC is woefully underfunded, with only 570 people on staff, and that the commission would have to at least double in size to manage the complex derivative market. Harkin and Chambliss made it quite clear that Mr. Gensler would be getting new authorities and new funding, signaling to Treasury that CFTC will remain independent and overseen by Harkin and Chambliss in Senate Agriculture, thank you very much.
Power being what it is, the deck chairs in Washington will not be changed. Systemic Risk will be regulated in parts and pieces. I predict we have Systemic Risk Governance Councils in our future and that all the major regulators will get new authorities, new funding, and oversight from the same crusty old men and women in Congress who failed to oversee and fund them correctly prior to the crisis...
ComplianceWeek covered the XBRL Risk Taxonomy Forum Meeting in NY last week with an excellent article enclosed here.
It is a longer article, but this is from the front page:
Using XBRL to Attack Systemic Risk
By Todd Neff — April 7, 2009
Already hard at work making Securities and Exchange Commission filings interactive, XBRL technology now finds itself at the heart of plans to save the U.S. financial system from future calamity.
A group of risk-management leaders in the financial industry has begun studying how XBRL might bring clarity and transparency to the murky world of financial risks, much the same way Corporate America has just begun using XBRL to bring more clarity to financial statements.
While any such system is a long way off, proponents say the technology is tailor-made to help regulators (and investors) root out hidden threats to corporate balance sheets before they, well, break the bank. XBRL could, for example, let a regulator peer through a bad debt line item and see the individual loans feeding it; that task would take hours of spreadsheet diving today.
But XBRL could also do much more. Steven Adler, director of IBM Data Governance Solutions, says the computer language provides a standard vehicle for regulators to track not only weeks-old summary data, but also financial positions accruing across many banks and market segments. That would shed more light on systemic risks—which, left unchecked, can bring financial calamity of the sort we’re witnessing today.
Any potent XBRL-based scheme to report risks, however, would require the reporting of daily financial positions, a major shift in how trading firms, hedge funds, and investment banks do business. To that end, Adler’s IBM Data Governance Council is spearheading a movement that would change how investment banks and hedge funds interact with regulators.
“At this point, everybody is aware change is coming,” Adler says. “And parties would rather be in the room together talking about common solutions.”
A speech Federal Reserve Chairman Ben Bernanke delivered last month shows him to be in agreement. Bernanke advocated taking a “macro-prudential” approach to risks that are “cross-cutting,” affecting many firms and markets or concentrating in unhealthy ways. It would involve “monitoring large or rapidly increasing exposures—such as to sub-prime mortgages—across firms and markets.”
You can read the full article here.
On February 26-27, I hosted an XBRL Risk Taxonomy Forum in NY at The Levin Institute in which we explored the concepts of operational, market, and credit risk. Through interactive discussions, we looked at how those concepts could be articulated in an XBRL Taxonomy and what benefits regulatory authorities and market participants could derive from new key risk indicator monitoring. We looked at the ORX example of Operational Risk loss event reporting and saw how 50+ existing banks are sharing operational loss data to better trend individual losses and learn x-industry loss patterns.
And on the last day, we explored positional reporting as a key risk indicator of market crowding and bubble formation. One outcome of the meeting was a call for a follow-up meeting to review the ORX example in greater depth and explore both existing risk reports and sources of positional data.
On April 23, we will meet again at the Levin Institute to focus more deeply on the ORX data model, an examination of existing regulatory reporting, and positional reporting options from Swift and DTCC.
The work will be done in English – no XML – to make it easy for everyone to participate. Our goal is to answer some fundamental questions:
1. Is the ORX data model sufficient for Operational Risk reporting on a national level?
2. What is the right business model for Operational Risk reporting and who should maintain the taxonomy?
3. What kinds of key risk indicator data are already collected by financial regulators that are either not used on a systemic basis or not shared across the government?
4. What is the most efficient method for collecting end of day/week positional data?
- from market participants directly?
- via clearing and settlement firms?
5. What should be the role of a semantic repository in the construction of risk reporting taxonomies?
6. How should the regulatory authorities build and maintain regulatory taxonomies?
7. How should the world maintain semantic consistency between many regulatory taxonomies?
8. What should a 21st Century Regulatory Information Architecture look like?
We can't possibly answer all of these questions in one day, but we can begin an informed dialog and encourage global participation - No one else is addressing these issues and I think we can make a difference doing so.
I look forward to seeing you on April 23rd.
I've written in the past about the loan origination underwriting failures that are at the heart of the current credit crisis. Market failures in Mortgage Backed Securities, Collateralized Debt Obligations, and Credit Default Swaps can all trace their lineage to the high default and foreclosure rates resulting from those underwriting failures. In a piece I wrote in early 2008, I argued that simple changes in underwriting standards could have prevented the market meltdown.
I've also written about the relative efficiency of the Danish Mortgage Model, and yesterday I heard an in-depth comparative presentation on that model that I have to relate, because it totally changed my point of view. Until now, I had seen the Danish Model as a business platform for mortgage processing. What I saw yesterday is a consumer solution with enormous political appeal.
The meeting was at the American Enterprise Institute in Washington, DC and the speaker was Alan Boyce, CEO of Absalon, the organization that exported the Danish Mortgage Model to Mexico. Alan presented the Danish Model in the context of what the Danes call "The Principle of Balance."
The Principle of Balance enables borrowers to refinance their mortgages when housing prices go up AND buy back the bonds that fund their mortgages at current market prices when housing prices go down, preserving their equity. In the United States, borrowers can refinance when rates decline and housing prices rise, but they have to suffer negative equity when housing prices decline. Housing prices often decline in a recession, and negative equity restrains labor mobility by nailing homeowners to their existing homes until prices rise and they can sell without a loss.
In Denmark, when recessions hit and housing prices fall, borrowers can buy back the bonds that fund their mortgages at their discounted market price in the secondary bond market and refinance the reduced balance at current rates. This flexibility protects consumers from negative equity and empowers workers with greater labor mobility.
From Alan's charts: how the current US system works, and what happens when interest rates decline (the charts themselves are not reproduced here).
The Danish model doesn't perfectly preserve home equity, as homeowners will suffer some loss when housing prices decline, but the loss is substantially mitigated and the system offers individual freedom and choice. It is actually far more market oriented than the current US model.
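To see the arithmetic, here is a back-of-the-envelope sketch. The numbers are invented for illustration (they are not Alan's figures) and the mechanics are simplified, but they show how the buy-back option cushions equity in a downturn:

```python
# Invented numbers, simplified mechanics: Principle of Balance vs. a US-style mortgage.
home_price    = 300_000
mortgage_face = 240_000            # principal owed at par; starting equity is 60,000

# Recession scenario: rates rise, the bond funding the mortgage trades below
# par, and the home loses 15% of its value.
bond_market_price = 0.85           # 85 cents on the dollar
home_price_after  = 255_000

# US-style outcome: the debt stays at face value, so equity falls to 15,000.
us_equity = home_price_after - mortgage_face

# Danish-style outcome: the borrower buys back the funding bond at its market
# price (204,000), retiring the full 240,000 debt at a discount, and refinances
# the smaller balance at the new market rate. Equity falls only to 51,000.
buyback_cost  = int(mortgage_face * bond_market_price)
danish_equity = home_price_after - buyback_cost

print(f"US equity after the downturn:     {us_equity:,}")
print(f"Danish equity after the downturn: {danish_equity:,}")
```

In both cases the homeowner loses something, but the equity loss under the Principle of Balance is a fraction of the US-style loss.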
In the US, we currently suffer 10% default and foreclosure rates, and there are an additional 15-20% who suffer negative equity in their homes but are not at risk of foreclosure. People in foreclosure can't take advantage of a new Principle of Balance Mortgage system, but the government can offer programs to restructure their mortgages at market value. Those with negative equity could be encouraged to migrate to a new Principle of Balance mortgage model.
This is an idea that has enormous benefits all around. It can help the Obama Administration reprice existing toxic assets. It can help provide more market-flexibility to home-owners. And it can repair confidence in the American mortgage market among investors world wide.
Who would have thought that market-oriented reforms would come from such a "socialistic" country like Denmark!?
I encourage everyone to read Alan Boyce's presentations and white papers. His proposal is one of the most intelligent and easily implemented regulatory reforms I have seen in many years.
His full presentation: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_1.pdf
His short white paper: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_3.pdf
In today's American Banker, Allan Mendelowitz and John Liechty wrote a viewpoint article calling for the creation of a National Institute of Finance to be a data collection point and archive of financial information. They write:
"As a solution, we propose the creation of a National Institute of Finance, which would serve as a national data archive and think tank for the financial regulatory community. The mission would be to provide regulators the data, software, computing power and analytic capacity they need to oversee and safeguard the health of the modern financial system.
A centerpiece of the NIF would be a Federal Finance Data Center. Participants in the U.S. financial and insurance markets would be required to report all positions to the center at regular intervals. Crucially, these reports would include both exchange-traded and over-the-counter contracts, complete with counterparty relationships. The center would let regulators assess systemwide contagion and concentration risks, perform stress tests, including the impact of the failure of a large institution, and hence make better-informed decisions in times of crisis.
Given the sensitivity of this data to financial institutions and, in broad terms, America, top-level data security would be essential. Financial institutions would have serious, legitimate concerns about third parties learning details about their positions. But the military and national security communities provide functioning models for handling such problems. More to the point, there is no alternative. Without detailed information on transactions, positions and counterparty relationships, any attempt to identify systemically important institutions is guesswork.
A second major role of the NIF would be to maintain a research center, including a National Risk Lab with skilled staff and facilities analogous to those used by the quants on Wall Street. In the spirit of the National Academy of Science, the NIF would act as a clearing house for ideas and problems between industry and a broader research community — identifying essential problems, sponsoring the sustained research efforts needed to solve these problems and integrating solutions into a rigorously tested, well-understood set of models."
If you read my last blog, you might think I would agree with this idea, but for some practical and philosophical reasons I do not.
First the practical. The government already collects quite a lot of financial data from many regulated institutions that it does not use well, share between agencies, or compare across companies. But the data is there, and building an effective information architecture to better leverage, analyze, share, and compare the data is cheaper, easier, and more effective than building a new academic think tank.
Second, the philosophical. In a democracy, a single aggregation point of information is a single point of control. That control can be abused like any other power, and the information can be restricted or changed based on the political beliefs of whoever governs the data center. And since our government is controlled by politicians, we don't want that.
What we want is multiple aggregation points and lots of access-restricted, anonymized data sharing across multiple agencies. We want information innovation to grow in different parts of the government, given different levels of relative maturity and interest, budget and skills. We want organizational freedom to invent new ways to collect and use information gathered through reporting, audits, hearings, and investigations.
We want our regulatory authorities to develop their own best practices and share them x-government, and that kind of innovative environment can only thrive when there are multiple sources of data collection, common entity registration, and empowering technology that makes sharing, analysis and interpretation easy.
Building this isn't difficult, and technology can overcome the organizational obstacles that prevent data sharing among agencies today. But aggregating data responsibilities in a new organizational structure will only create a new fiefdom for someone else to control.
I think we can do better.
In yesterday's Financial Times, Hank Paulson, the former Treasury Secretary, wrote an article entitled "Reform the Architecture of Regulation." In the article, Hank blames inadequate regulatory authority and overlapping jurisdictions for the failure to forecast and prevent the current credit crisis. He recommends an ideal regulatory infrastructure composed of three agencies: "one charged with maintaining market stability across the entire financial sector, one for supervising the soundness of those institutions with explicit government support, and one responsible for protecting consumers and investors."
Hank wants the Federal Reserve to have the systemic risk authority in the first case. He wants the Fed "to have access to information from a broader set of financial organizations, including hedge funds and systemically important payment systems. This authority should also have the power to intervene if it concluded that the financial system was at risk."
He goes on to say that both the Treasury and the Fed lacked appropriate powers to allow Lehman Brothers to wind down in an orderly way. That might be true, but the Lehman failure was not a failure of orderly bankruptcy powers. The failure was Hank Paulson's decision to let Lehman Brothers fail in the first place. AIG, Fannie Mae, Freddie Mac, Bear Stearns, and Merrill Lynch were all too big to fail, but Lehman wasn't. When the government picks winners and losers in a time of national crisis, the public is the ultimate loser.
It must be nice to look back on past failures and propose future solutions, but Hank's analysis omits too much.
1. This crisis was preventable. The government had the data, and the Federal Reserve's own economists admitted as much in a report written in October 2007:
"Were problems in the subprime mortgage market apparent before the actual crisis showed signs in 2007? Our answer is yes, at least by the end of 2005. Using the data available only at the end of 2005, we show that the monotonic degradation of the subprime market was already apparent. Loan quality had been worsening for five consecutive years at that point. Rapid appreciation in housing prices masked the deterioration in the subprime mortgage market and thus the true riskiness of subprime mortgage loans. When housing prices stopped climbing, the risk in the market became apparent."
2. The government is in fact awash with the right data to provide leading indicators about many aspects of the economy. Individual agencies collect that data and either do not understand it, do not compare it across companies or geographies, or do not disseminate it in an intelligent manner.
3. In some cases, the government in fact lacks important data that can yield indicators of systemic risk, but initiatives like the XBRL Risk Taxonomy can remedy these deficiencies quickly.
Changing the organizational structure of financial regulation will be disruptive and will not necessarily produce better results. Existing structures with new authorities can produce the same or better results.
But the government does need a Business Intelligence strategy to make better use of the information it already collects, and integrate new sources effectively. This isn't just about reporting standards. Every day, auditors request information from financial firms. Regulatory auditors are often camped in regulated firms for extended periods. They collect all kinds of important data about business practices, assets, liabilities, and losses. Where does this information go? Is anyone collecting it using standard practices and integrating the structured and unstructured content into comparable repositories? Can anyone compare practices from firm to firm without having participated in site audits?
I suspect the answer to these questions is no, and we are all witnessing the results of the government's failure to use its own information intelligently.
In any administration, pro- or anti-regulation, it will always be in someone's interest to disregard information that doesn't fit their philosophy or needs. The only safeguard in a democracy is information proliferation to many diverse interests. This enables some to regard information that others disregard, and to glean meaning that others miss.
Hank Paulson is wrong. The government doesn't need a new regulatory architecture. The government needs a new Regulatory Information Architecture.
On February 26 and 27, I hosted the 19th Meeting of the IBM Data Governance Council. We met at the fantastic Levin Institute at 116 East 55th Street, and we were 43 people from very diverse backgrounds and institutions - Banks, Insurers, Energy Concerns, Regulatory Authorities, Auditors, Authors, Economists, and IT Vendors. For me personally, this was the most interesting and significant meeting I have hosted in the past five years.
The enunciated goal of the meeting was to explore the hypothesis that systemic reporting of operational loss information from financial institutions to regulatory authorities in a common language could improve the monitoring of systemic risk in financial markets. We explored this idea in the first half-day of discussions, hearing different perspectives on that point of view from the ORX Consortium - who already share operational loss data on a self-regulatory basis among 51 contributing banks - Christian Menegatti, an economist from RGE Monitor, a Risk Manager from a major bank, a retired Operations Manager from another major bank, and a representative from a regulatory authority.
The discussions moved back and forth between economic interpretation, technological implementation, and banking operations. The ORX example, I think, demonstrated the viability of operational loss reporting. The ORX speaker had copious statistical evidence to share about what has been learned over the past three years from the sharing of more than 100,000 operational risk loss events, and I think it was clear to everyone that the sharing of normalized data via common interfaces could provide regulatory authorities with new insights on systemic risk.
After lunch, we dove into discussions on the technical capabilities of XBRL. This XML dialect provides a highly structured method to report business information with metadata tags. In many companies, financial analysis is done via spreadsheets, and each spreadsheet can hold the same information in different structures. Cell contents can be in different places and have different header names, and each difference makes reconciliation or normalization difficult, if not impossible.
When Lehman Brothers collapsed in September 2008, financial firms had to quickly consult their spreadsheets to trace back how many positions they had with Lehman and what the exposure was in each position. That all of this was done in spreadsheets without common composition meant that armies of human beings had to spend countless hours analyzing this information. The time required to do this meant that precise knowledge of institutional exposure at the moment of market collapse was unknowable for senior management in the banks, and certainly among regulatory authorities.
However, had all of that positional data been written to spreadsheets or databases with XBRL tags, the location of each field in the spreadsheet or its text label would have been irrelevant. The metadata tags would have identified the relevant information to computerized analytical tools, and the financial institutions could have summarized their exact exposure at the press of a button.
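A tiny sketch of the idea, with stand-in tags rather than real XBRL taxonomy elements: two desks report the same kind of fact in documents with completely different layouts, and a few lines of code can still total the exposure, because the tags rather than the cell positions identify the data.

```python
import xml.etree.ElementTree as ET

# Two hypothetical reports with different layouts; the element names below are
# stand-ins for illustration, not actual XBRL taxonomy elements.
report_a = """
<report desk="rates">
  <counterpartyExposure counterparty="LEH" currency="USD">125000000</counterpartyExposure>
  <counterpartyExposure counterparty="AIG" currency="USD">40000000</counterpartyExposure>
</report>"""

report_b = """
<report desk="credit">
  <note>fields can appear anywhere, under any heading</note>
  <counterpartyExposure counterparty="LEH" currency="USD">310000000</counterpartyExposure>
</report>"""

def exposure_to(counterparty, *reports):
    """Sum tagged exposure facts regardless of where they sit in each document."""
    total = 0
    for doc in reports:
        for fact in ET.fromstring(doc).iter("counterpartyExposure"):
            if fact.get("counterparty") == counterparty:
                total += int(fact.text)
    return total

print(exposure_to("LEH", report_a, report_b))   # 435000000
```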
It was this use case and discussion that crystallized the value proposition of XBRL and standardized reporting interfaces for everyone in the room. One speaker talked about the need for a barcode system of identification for financial products that XBRL could facilitate, and from this we moved to a discussion on the need for common semantic definitions of each "code" to enable understanding among all financial institutions. Michael Atkins, Chairman of The EDM Council, has been a leader in this space, hard at work on building a comprehensive semantic repository of data definitions for 3000 financial components.
Near the end of the day, a regulatory representative spoke about findings from the G20 summit on the six contributing factors that led to the market failures and credit crisis:
1. Lack of oversight of systemic risk
2. Weak performance of credit rating agencies
3. Unregulated credit pools
4. Shortcomings in risk management practices
5. Financial innovation outpacing regulation
6. Weaknesses in financial disclosure and reporting
These six challenges provided a litmus test for our earlier hypothesis, and I asked the audience whether systemic reporting of operational loss information would answer these requirements. The group was quick to answer yes, but our ORX colleague pointed out that the total operational risk losses so far accumulated by the 51 member banks in ORX amounted to only $34 billion, while the credit and market losses since July 2007 already exceeded $1 trillion.
On the second day, we came back to this topic. Our speaker was Rick Bookstaber, author of "A Demon of Our Own Design," and Rick described the concept of market crowding, which makes common methods of risk measurement and calculation - Value at Risk, or Loss Forecasting - impossible to implement in the strained conditions of market failure. His solution was to require financial institutions to report end-of-day positional data to regulatory authorities. Only reports from the top 50 banks, broker-dealers, and hedge funds would be required to measure aggregate exposure in the market at any given time.
The positional data would give authorities a commensurate understanding of asset ownership across the markets, and would balance operational loss reporting with credit and market exposure. Not all trades would need to be reported, just the end-of-day positions. Perhaps if Treasury had had this information in September 2008, they could have benchmarked the impact of the Lehman bankruptcy on the market. Perhaps, if they had had it in July of 2007, they could have foreseen the rapid deterioration of credit quality in the mortgage marketplace. Perhaps, perhaps.
But one thing became clear to all of us in the room as we discussed this idea - it was dependent on a precise definition of "what a position is." In any other industry, products have codes, and supply chains that depend on the precise use of those codes for the movement of goods and services. In the financial services industry, many products have complex definitions that defy coding. In fact, financial innovation is often directed to develop products that defy classification because they combine the properties of insurance, taxation, indexes, and bonds and create new ways to leverage, risk, and earn revenue "outside the box." Many of these products are not traded on open exchanges, have no contracts between counter-parties, and are extremely difficult to trace. And that's the goal...
We all agreed that daily reporting of financial positions would help identify systemic risk, but that the reporting structure will depend on a new taxonomy to describe financial products. Without precise definitions of what things are called, the data will not be easy to report or comparable across firms. The taxonomy will need to be flexible enough to enable financial innovation, so it can't be a glossary of every conceivable element, yet rigid enough to enable precise reporting. And regulatory authorities will need to create new rules stating that financial products cannot be traded between any parties unless they are defined using this new taxonomic vocabulary.
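Here is a small sketch of what that "no taxonomy entry, no trade" rule could look like in practice. Everything below is hypothetical - the product names, the report fields, the vocabulary itself - but it illustrates the balance between rigidity (reports are rejected unless every product maps to a defined term) and flexibility (innovators extend the vocabulary first, then report):

```python
# Hypothetical controlled vocabulary of product classifications.
PRODUCT_TAXONOMY = {"equity", "corporate_bond", "interest_rate_swap",
                    "credit_default_swap"}

def register_product(name):
    """Financial innovation: a new product type is defined before it trades."""
    PRODUCT_TAXONOMY.add(name)

def accept_position_report(report):
    """Rigidity: reject any end-of-day position whose product is undefined."""
    if report["product"] not in PRODUCT_TAXONOMY:
        raise ValueError(f"undefined product type: {report['product']!r}")
    return report

accept_position_report({"firm": "Bank A", "product": "credit_default_swap",
                        "notional": 50_000_000, "as_of": "2009-02-27"})

register_product("constant_maturity_swap")   # extend the taxonomy first...
accept_position_report({"firm": "Bank B", "product": "constant_maturity_swap",
                        "notional": 10_000_000, "as_of": "2009-02-27"})  # ...then report
```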
Rick Bookstaber has already written about this in his blog here.
Standards are needed in Financial Services. Not just Risk Management Standards. Product, and Research and Development Standards are needed. Without standards on what things are called and how we name them, it isn't possible to report comparable information and discern systemic risk. And if we don't get this right today, we can expect more trouble in the future.
This is what I will call Smart Financial Regulation, and our meeting in NY was so successful in uncovering these issues that I will host another meeting of the IBM Data Governance Council to focus on the topic of Smart Regulation on April 26 at the Levin Institute.
We made enormous progress last week, but much more work remains to be done. Crowding doesn't just happen in financial markets. It also happens in the debate over ideas. The way to counteract idea crowding is to ensure diversity of participation in the discussion. By bringing together diverse interests from financial institutions, regulatory authorities, economists, thought leaders, and IT vendors, the IBM Data Governance Council can facilitate progress on these topics, and I intend to host regular meetings among principal representatives of these groups to develop new reporting standards that enable Smart Financial Regulation.
Stay tuned for an agenda, and get ready to roll up your sleeves.