Adler on Data Governance
From archive: December 2008
DataGovernor 2,379 Views
In the past 18 months, I've written extensively on the need for risk measurement standards, but this IBM Blog interface doesn't make it easy to find all the articles. So I've collected the salient points into a chronology to illustrate the pieces and how they fit together:
February 7, 2008: Risk Data for Macroeconomic Governance
On September 18, 2007, the US Federal Reserve cut the Federal Funds rate by half a percent in response to the looming sub-prime loan scandal. The markets had lost confidence, and Banks were holding debt they could not sell. Write-offs ensued, and the market forecast looked questionable at best.
At the time, this rate cut was seen as a dramatic response to worsening market conditions and proof that the Fed would act aggressively to protect the economy from the housing bubble. Over the following three months, the Fed intervened again to cut rates .25% in October and .25% again in December. Each rate cut was seen as a prudent response to market conditions.
In January 2008, just a few weeks after the last rate cut, the Fed had to intervene again with a very sudden 1.25% cumulative rate cut to stem an Asian-driven equity market sell-off following more sub-prime write-offs and loss disclosures. In just five months, the Federal Reserve had to intervene five times with a combined interest rate cut of 2.25%, following 17 consecutive quarter-point rate increases over the preceding two years.
This was an incredible see-saw of macro-economic policy - gradual rate increases were followed immediately by sudden rate cuts.
In hindsight, the half-point cut in September 2007 was not very dramatic in comparison to the 1.75% in cuts that followed in the next four months. No one then could have foreseen the volatility in the markets that was to come - or could they?
Why is it that the US Federal Reserve rate policy was reactive to market volatility? Why didn't their monetary policy, which had run up rates from 1% in June 2003 to 5.25% in June 2006, anticipate the looming housing bubble and the bank losses that would surely ensue? Hadn't Alan Greenspan warned of this outcome in 2005? Didn't we all know the housing joyride would end at some point?
Today, we can see banking and financial market data that shows the risk trends in our rear-view mirror. Unfortunately, no one has a mirror that forecasts the future - but they could, if capitalized risk data were collected on a systemic basis by banks and shared with the Federal Reserve. The Federal Reserve does an excellent job of studying catastrophic risks and running sophisticated macroeconomic loss models on everything from terrorist attacks to coastal hurricanes. The Fed uses this catastrophic loss data to capitalize insurance-style loss reserves for the US economy - i.e., they print more money when very bad things happen.
The insurance reserves got tapped after 9/11 and Hurricane Katrina, when the Fed injected huge amounts of liquidity into the economy to stabilize markets and restore confidence. Of course, the timing of catastrophic events can't be forecasted, but the monetary response can be estimated based on a variety of risk factors. The Fed constantly analyzes and wargames these risk factors, and the success of the Fed's liquidity and monetary responses to 9/11 and Katrina attests to the diligence of their planning and the value of risk-based forecasting models.
What does this have to do with the sub-prime loan meltdown, you ask?
Well, if the Fed had non-catastrophic risk-data forecasting models, they could possibly pre-empt loss events with macroeconomic policy tools that could even out some of the worst aspects of the business cycle. Unfortunately, that kind of non-catastrophic risk data has to come from banks, who until recently were totally incapable of providing that kind of data, let alone using it themselves for their own risk-based policy-making.
That's changing. In the last two years, banks around the world have been working to assess and capitalize market, credit, and operational risks as part of the Basel II compliance process. That data isn't normalized across banks, and there are wide disparities in how risks are assessed, calculated, and capitalized from bank to bank and country to country. But the raw data, and the beginnings of the know-how, are there for the first time in history. And that data and know-how can be leveraged to provide new macroeconomic tools for Central Bank policymakers around the world.
What's needed are standards in risk assessment, classification, calculation, and the reporting of capitalized risk data from US banks to the Federal Reserve. This may take some years yet to accomplish, but the time is right to begin discussing these issues. As US Banks reach Basel II compliance, they will be in a position to leverage risk data for their own self-insurance against non-catastrophic losses, and if they are willing to share their capitalized risk data, they can help the Federal Reserve to reduce market volatility and improve macroeconomic performance for everyone.
Here's a case where regulatory compliance really can improve business performance.
April 11, 2008: Subprime is a Data Governance Challenge
The IMF put out the Global Financial Stability Report last week and it contains a very accurate and sobering description of the systemic failures involved in the Subprime Financial Crisis. It has an institutional focus, and makes some solid observations and recommendations.
The entire report is worth a read, but the Executive Summary contains most of the key points if you just want the meat of the matter.
I will summarize the findings and recommendations that have Data Governance implications:
"The events of the past six months have demonstrated the fragility of the global financial system and raised fundamental questions about the effectiveness of the response by private and public sector institutions. While events are still unfolding, the April 2008 Global Financial Stability Report (GFSR) assesses the vulnerabilities that the system is facing and offers tentative conclusions and policy lessons.
Some key themes that emerge from this analysis include:
• There was a collective failure to appreciate the extent of leverage taken on by a wide range of institutions—banks, monoline insurers, government-sponsored entities, hedge funds—and the associated risks of a disorderly unwinding.
• Private sector risk management, disclosure, financial sector supervision, and regulation all lagged behind the rapid innovation and shifts in business models, leaving scope for excessive risk-taking, weak underwriting, maturity mismatches, and asset price inflation.
What follows are a number of short- and medium-term recommendations relevant to the current episode. Several other groups and fora—such as the Financial Stability Forum, the Joint Forum, the Basel Committee on Banking Supervision—are concurrently developing their own detailed standards and guidance, much of which is likely to address practical issues at a deeper level than the recommendations proposed below.
In the short term...
The immediate challenge is to reduce the duration and severity of the crisis. Actions that focus on reducing uncertainty and strengthening confidence in mature market financial systems should be the first priority. Some steps can be accomplished by the private sector without the need for formal regulation. Others, where the public-good nature of the problem precludes a purely private solution, will require official sector involvement.
Areas in which the private sector could usefully contribute are:
• Disclosure. Providing timely and consistent reporting of exposures and valuation methods to the public, particularly for structured credit products and other illiquid assets, will help alleviate uncertainties about regulated financial institutions’ positions.
• Overall risk management. Institutions could usefully disclose broad strategies that aim to correct the risk management failings that may have contributed to losses and liquidity difficulties. Governance structures and the integration of the management of different types of risk across the institution need to be improved. Counterparty risk management has also resurfaced as an issue to address. A re-examination of the progress made over the last decade is needed, and the gaps that are still present (perhaps inadequate information or risk management structures) will need to be closed.
• Consistency of treatment. Along with auditors, supervisors can encourage transparency and ensure the consistency of approach for difficult-to-value securities so that accounting and valuation discrepancies across global financial institutions are minimized. Supervisors should be able to evaluate the robustness of the models used by regulated entities to value securities. Some latitude in the strict application of fair value accounting during stressful events may need to be more formally recognized.
• More intense supervision. Supervisors will need to better assess capital adequacy related to risks that may not be covered in Pillar 1 of the Basel II framework. More attention could be paid to ensuring that banks have an appropriate risk management system (including for market and liquidity risks) and a strong internal governance structure. When supervisors are not satisfied that risk is being appropriately managed or that adequate contingency plans are in place, they should be able to insist on greater capital and liquidity buffers.
In the medium term...
More fundamental changes are needed over the medium term. Policymakers should avoid a “rush to regulate,” especially in ways that unduly stifle innovation or that could exacerbate the effects of the current credit squeeze. Moreover, the Basel II capital accord, if implemented rigorously, already provides scope for improvements in the banking area. Nonetheless, there are areas that need further scrutiny, especially as regards structured products and treatment of off-balance-sheet entities, and thus further adjustments to frameworks are needed.
The private sector could usefully move in the following directions:
• Standardization of some components of structured finance products. This could help increase market participants’ understanding of risks, facilitate the development of a secondary market with more liquidity, and help the comparability of valuation. Standardization could also facilitate the development of a clearinghouse that would mutualize counterparty risks associated with these types of over-the-counter products.
• Transparency at origination and subsequently. Investors will be better able to assess the risk of securitized products if they receive more timely, comprehensible, and adequate information about the underlying assets and the sensitivity of valuation to various assumptions.
• Reform of rating systems. A differentiated rating scale for structured credit products was recommended in the April 2006 GFSR. Also, additional information on the vulnerability of structured credit products to downgrades would need to accompany the new scale for it to be meaningful. This step may require a reassessment of the regulatory and supervisory treatment of rated securities.
• Transparency and disclosure. Originators should disclose to their investors relevant aggregate information on key risks in off-balance-sheet entities on a timely and regular basis. These should include the reliance by institutions on credit risk mitigation instruments such as insurance, and the degree to which the risks reside with the sponsor, particularly in cases of distress. More generally, convergence of disclosure practices (e.g., timing and content) internationally should be considered by standard setters and regulators.
• Tighten oversight of mortgage originators. In the United States, broadening 2006 and 2007 bank guidance notes on good lending practices to cover nonbank mortgage originators should be considered. The efficiency of coordination across banking regulators would also be enhanced if the fragmentation across the various regulatory bodies were addressed. Consideration could be given to devising mechanisms that would leave originators with a financial stake in the loans they originate."
New standards and banking practices will clearly be needed moving forward. But we already have most of the regulations we need to mitigate most risks identified in the report. Indeed, one of the great ironies of the crisis is how little Banks used their own fraud and risk management systems to catch underwriting errors and omissions in Loan Origination applications, House Assessments, risk capitalization, etc.
I suspect that the IMF's warning on regulation will not be heeded in Washington, though I do hope regulators will listen to the seasoned advice of some Data Governance veterans because this is a crisis with so many Data Governance challenges.
May 17, 2008: Nordic banks step in to back Iceland
The Financial Times reported today:
"Three Nordic central banks unveiled an unprecedented €1.5bn emergency funding package on Friday to support Iceland’s troubled currency and stabilise its banking system as the tiny north Atlantic nation tries to fend off the effects of the global credit crisis.
The plan allows Iceland’s central bank to acquire up to €500m ($775m, £400m) each from the central banks of Sweden, Denmark and Norway in the case of an emergency, the first time the region’s central banks have joined forces to help a troubled neighbour."
This story illustrates the downstream impact of polluted data in the global economy. But of course, for the rest of us not living in Iceland, the global credit crunch has impacted our lives in other, indirect ways.
Since September 2007, when the US Federal Reserve started cutting interest rates in a drastic program that shaved 3.25% off the Discount Rate in 7 months, the price of oil (valued in depreciated dollars) has increased nearly 60%, from $80 to $127 per barrel. Food costs have skyrocketed, and countries around the world are challenged to find credit for government bonds. Inflation, thanks to Subprime, is a growing threat to the world economy and to the lives of poor people living at the edge of subsistence.
But how is this related to Toxic Content and Data Governance, you ask?
Well, of course the public Subprime narrative states that Banks invested in fancy hybrid home loans extended to subprime borrowers and created inherent risk in the market that was compounded through exotic derivatives that no one understood. This is partially true, and many banks have since admitted that they had poor internal risk governance.
But there is another part of the story that doesn't attract as much publicity. In 2005, at the peak of the Housing Bubble in the US, Alan Greenspan went before Congress to declare that the US housing market was "frothing." At about the same time, US Regulators decided to relax underwriting guidelines on new mortgage applications for a key segment of the marketplace - self-employed individuals.
Self-employed individuals face a moral hazard when they apply for a mortgage.
This hazard is well known in the residential mortgage marketplace. It occurs when a self-employed individual has to demonstrate their income to obtain a loan. People who are employed by big companies get direct deposit pay checks and have income tax statements which closely match their real income. Self-employed individuals don't get regular pay checks and have tax statements that, shall we say, may frequently differ from real income.
This is especially true for the segment of the population that is paid in cash. Producing documentation of "real" income for these people is a challenge that typically caused the loan underwriting process to take longer for self-employed individuals than for the employed.
And in 2005, as housing prices peaked and interest rates rose .25% at each Fed meeting, mortgage volume started to decline, and for some reason US regulators chose to remove income documentation standards for the self-employed. From that time forward, they only had to make an income declaration.
Case in point. I have a friend who is a mortgage broker. He had a customer who owns a Pizza Parlor and wanted to buy a house. This customer had a good credit score and was a prime buyer. His Loan-to-Value Ratio was good. As a self-employed individual he was paid in cash, and he declared his income to be $10K a month.
But when my friend input the numbers into the super-fast online loan application, it turned out that his debt ratio was too high. He had some car loans and credit card debt that put the ratio above 41%, and the loan could not get through. So my friend simply changed his declared income to $12K per month, and the loan got approved.
In 2007, what I described above was a compliant business process for a self-employed mortgage loan application. Income only had to be declared, not verified.
In fact, by this time in the marketplace most banks had automated underwriting applications that turned out a rate quote in 40 seconds for conforming-rate mortgages. But what was obviously dangerous about this process is that the Pizza Parlor owner made an income declaration without documentation. $10K might have been his best income in his best month of the year. $12K per month might have been his fantasy income. Maybe his real income was closer to $8,500 a month.
But now he owns a home with an adjustable rate mortgage that he can barely afford at the current rate he's paying and certainly can't afford when the rate adjusts up.
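To see how thin that automated control really was, here is a minimal sketch of a stated-income debt-to-income check of the kind described above. The 41% ceiling comes from the story; the function name, field names, and debt figure are hypothetical illustrations, not any bank's actual underwriting logic.

```python
# Minimal sketch of a stated-income debt-to-income (DTI) check, assuming
# the 41% ceiling mentioned above. All names and figures are hypothetical.

def dti_approves(monthly_debt: float, declared_monthly_income: float,
                 max_ratio: float = 0.41) -> bool:
    """Approve if monthly obligations fit under the DTI ceiling.

    The income figure is *declared*, not verified - the entire control
    rests on an input nobody checks.
    """
    return monthly_debt / declared_monthly_income <= max_ratio

monthly_debt = 4_300  # mortgage + car loans + credit cards (invented figure)

print(dti_approves(monthly_debt, 10_000))  # False: 43% ratio, loan rejected
print(dti_approves(monthly_debt, 12_000))  # True: ~36% ratio, loan approved
```

The only variable that changed between rejection and approval is the one no process verified.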
This is a story that was repeated thousands of times in 2005-2007, which is one reason why delinquency and foreclosure rates on those vintages of prime AND subprime loans are at 12-16%.
The fateful regulatory decision in 2005 to relax documentation standards in loan underwriting allowed vast amounts of Toxic Loan Content (poisonously polluted data) to enter the banking system through automated underwriting systems that got their business rules from the regulators. That created systemic risk that was entirely opaque to the MBS issuers, Rating Agencies, CDO issuers, and the marketplace. And unless the credit risk is transparent to investors, the market can't price risk correctly and default is an inevitable outcome.
By 2005, banks were already aware of rising risk from documented subprime loans and were raising interest rates to capitalize their risk. They just weren't aware of the undocumented risks, which left their reserves deficient to cover their exposures.
But it didn't have to end this way. If regulators in 2005 had just left underwriting documentation regulations in place, or even strengthened them, the Housing market would have seen a soft landing and the credit crisis would not have happened.
I see quite a few important lessons here for the future:
1. Regulations are not Holy Scripture. Automating Compliance in IT can just as easily automate exposure as it can value.
2. We must learn to measure data quality and validate before we trust it. Data can pollute our businesses, our societies, and our lives and we must invest in methods and technologies to certify its quality on a continual basis to enhance and protect the value of our businesses.
3. The marketplace needs new tools to measure and price business risk. Regulators should not measure risk in businesses and force process changes. This is a reactive and inefficient method.
4. Transparency creates its own rules. Businesses should be required to report and capitalize (self-insure) their risks regularly to the marketplace so that self-regulating market economics can arbitrate between good business stewardship and folly. That arbitration will be reflected in stock prices, which is far more efficient than regulatory sanctions.
We will need many new Data Governance Solutions to help banking institutions across the world adjust to the increased scrutiny the post-Subprime world will bring. But most of all, we will need international forums, like the Data Governance Council, to discuss these issues and bring different perspectives forward, because this crisis was eminently avoidable. And it is only through communication that we can develop more mature practices to prevent it from happening again.
July 14, 2008: The US Federal Reserve Needs Data Governance
The US Federal Reserve announced new mortgage lending standards today that are designed to address so-called deceptive business practices among lenders.
Those measures include:
• Bar lenders from making loans without proof of a borrower's income.
• Require lenders to make sure risky borrowers set aside money to pay for taxes and insurance.
• Restrict lenders from penalizing risky borrowers who pay loans off early. Such "prepayment" penalties are banned if the payment can change during the initial four years of the mortgage. In other cases, a penalty cannot be imposed in the first two years of the mortgage.
• Prohibit lenders from making a loan without considering a borrower's ability to repay a home loan from sources other than the home's value.
The borrower does not have to prove that the lender engaged in a “pattern or practice” for this to be deemed a violation. That marks a change — sought by consumer advocates — from the Fed’s initial proposal and should make it easier for borrowers to lodge a complaint.
“Rates of mortgage delinquencies and foreclosures have been increasing rapidly lately, imposing large costs on borrowers, their communities and the national economy,” Mr. Ben Bernanke, the Federal Reserve Chairman, said.
“Although the high rate of delinquency has a number of causes, it seems clear that unfair or deceptive acts and practices by lenders resulted in the extension of many loans, particularly high-cost loans, that were inappropriate for or misled the borrower,” he added.
Excellent. Markets around the world can feel confident again that the US Federal Reserve has rooted out the major mortgage lending problems confronting the US Economy and has the entire situation under control.
It is beyond shocking that deceptive lending practices like this even exist in the most "efficient mortgage market in the world" (according to a 2006 IMF Mortgage Market Survey). What's more shocking is that the Fed knew about these practices, had data attesting to their impact on rising rates of mortgage fraud going back to 2005, and did nothing about it until today.
And how do I know that, you ask?
Well, the Fed's own economists put out an insightful summary of what went wrong in the current credit crisis and you can read it here:
The first draft of this report was published in October 2007. And in perfect hindsight, the economists concluded,
"Were problems in the subprime mortgage market apparent before the actual crisis showed signs in 2007? Our answer is yes, at least by the end of 2005. Using the data available only at the end of 2005, we show that the monotonic degradation of the subprime market was already apparent. Loan quality had been worsening for five consecutive years at that point. Rapid appreciation in housing prices masked the deterioration in the subprime mortgage market and thus the true riskiness of subprime mortgage loans. When housing prices stopped climbing, the risk in the market became apparent."
Now, the US Federal Reserve would not be the first organization in history to have better hindsight than foresight. But shouldn't we expect them at least to move faster with policy controls when the global credit market is facing the second worst crisis in history? And if it takes the Federal Reserve two years to study market data, more than three months after the crisis hit to publish a telling report, and eight more months to digest it and issue lending guidelines to restrict fraud in the mortgage marketplace, how long exactly will it take them to react when Hank Paulson consolidates all financial regulation in their hands?
To me this story offers some important lessons that I do hope Congress recognizes:
1. Regulatory Consolidation is not a panacea. Consolidated bureaucracies do not historically produce operational efficiency. Witness the Department of Homeland Security and the performance of FEMA during Hurricane Katrina.
2. Data is useless without people empowered to act. The Fed had ample data to control mortgage lending fraud and prevent the worst aspects of the current credit crisis, and it either chose not to act or its internal governance is so poor that there was no mechanism in place to forecast non-monetary economic risks and make micro-policy adjustments.
3. More important than regulatory consolidation, Congress should review the operational procedures and data governance practices at the Fed itself. A GAO Audit of Fed operational procedures and internal, below-the-Board decision-making would be a great start!
4. The Subprime Credit Crisis was preventable! The Fed had the data, and they had the economic skills to use it. They proved today that they even had the regulatory mandate to effect the changes in lending guidelines necessary.
Congress, the US Public, and the world at large have a right to know what took them so long to use their own data, before we entrust that organization with even more regulatory responsibility.
From where I am sitting, the Fed really needs Data Governance.
September 18, 2008: Enterprise Risk Management is a Myth
The stunning market downturn this week has revealed the breathtaking lack of Enterprise Risk Management in Banks, Brokers, and Washington.
Let me be clear; Risk Taking is not the same as Risk Management.
We've had 7 years of extraordinary risk taking in our mortgage markets, financial markets, tax policies, and war policies.
No one, in Banking or Government, can say with any confidence what their risk profile is, calculate probability of loss, or forecast future exposures.
This makes Enterprise Risk Management one of the most stellar myths in modern business. Anyone telling you they are doing it is either lying or a rare breed indeed.
I do hope that someone is recording the history of losses at the institutions failing and surviving every day, because the historical study of that record of loss - and new regulations mandating real Enterprise Risk Management and loss capitalization - is the only thing that will prevent these failures from happening again.
November 20, 2008: Three Things Basel Forgot
Today, The Basel Committee on Banking Supervision announced a new strategy to address shortcomings in its own global regulatory structure. The proposal creates new capital requirements, leverage ratios, and risk measurements designed to more carefully regulate banking practices across the globe.
The proposal includes the following elements:
• strengthening the risk capture of the Basel II framework (in particular for trading book and off-balance-sheet exposures);
• enhancing the quality of Tier 1 capital;
• building additional shock absorbers into the capital framework that can be drawn upon during periods of stress and dampen procyclicality;
• evaluating the need to supplement risk-based measures with simple gross measures of exposure in both prudential and risk management frameworks to help contain leverage in the banking system;
• strengthening supervisory frameworks to assess funding liquidity at cross-border banks;
• leveraging Basel II to strengthen risk management and governance practices at banks;
• strengthening counterparty credit risk capital, risk management and disclosure at banks; and
• promoting globally coordinated supervisory follow-up exercises to ensure implementation of supervisory and industry sound principles.
Strengthening liquidity and solvency requirements seems like a regretful afterthought during a time of historically low liquidity and high insolvency, but better late than never. One does wish that the Basel Committee had applied these measures as forethoughts rather than afterthoughts, but that's human nature.
There are three elements missing that I hope to see emerge in 2009:
1. A Global Loss History DB of anonymous credit, market, and operational incidents, events, and losses from every Basel-conforming institution. Individual institutions do not have enough loss history to compare their past exposures and "claims" to trend and forecast. Industry and geographic loss information is needed to better inform decision-making at banking institutions. 3rd-party loss data is available to every insurance company for all lines of business. Only the banking community could conceive of risk measurement programs without 3rd-party institutional validation.
The Operational Risk Exchange (ORX) has been aggregating banking loss data for operational risk for three years among the 41 banks that participate in that consortium. That model is valid, but the sample size is too small, even for ORX. I hope the Basel Committee sees ORX as a valid archetype that should be replicated worldwide, with each Central Bank collecting anonymized loss data from each member institution and sharing that loss data worldwide, so that all financial institutions can compare their own loss trends to global trends and forecast future exposures more accurately.
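To make the consortium idea concrete, here is a minimal sketch, under my own assumptions, of what anonymized loss pooling might look like: members submit loss events stripped of identity, and everyone can compare in-house losses to the pooled totals. The schema and categories are illustrative, not the actual ORX data standard.

```python
# Minimal sketch of consortium-style loss pooling, in the spirit of the ORX
# model described above. The schema and categories are illustrative
# assumptions, not the actual ORX data standard.
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class LossEvent:
    member_id: str     # replaced with an anonymous token before pooling
    category: str      # e.g. "external fraud", "execution error"
    gross_loss: float  # amount in a common reporting currency

def anonymize(events: List[LossEvent]) -> List[LossEvent]:
    """Strip institutional identity so pooled data can be shared back."""
    return [LossEvent("anon", e.category, e.gross_loss) for e in events]

def pooled_totals(events: List[LossEvent]) -> Dict[str, float]:
    """Aggregate gross losses by category across all members."""
    totals: Dict[str, float] = defaultdict(float)
    for e in events:
        totals[e.category] += e.gross_loss
    return dict(totals)

pool = anonymize([
    LossEvent("bank_a", "external fraud", 2.5e6),
    LossEvent("bank_b", "external fraud", 7.0e6),
    LossEvent("bank_b", "execution error", 1.2e6),
])
print(pooled_totals(pool))
# {'external fraud': 9500000.0, 'execution error': 1200000.0}
```

Each member's own losses are a small sample; the pooled totals are the larger statistical base against which to trend and forecast.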
2. An XBRL for Risk Reporting Taxonomy. Banks can't report loss events without a global taxonomy so that everyone can agree on what to call things and what things mean when they are reported. Even within banks, the word Risk has many different meanings to different people. For business people, Risk is an omnipresent feature of life, an attribute to calculate potential returns or losses in investments. Many business careers are made by taking risks. For an IT person, Risk is something to be avoided at all costs, the result of flaws in architecture that lead to vulnerabilities and loss. Many IT careers are lost by taking risks.
Business and IT can sit at the same table and have exhaustive conversations about Risk, each thinking they understand the other, and walk away having fundamentally different idiomatic understandings of what was discussed.
That misunderstanding is often a source of new risk.
XBRL (Extensible Business Reporting Language) is an XML language for describing business terms, and the relationship of terms, in a report. It enables semantic clarity of terminology, and that clarity is absolutely essential for the accurate recording and reporting of credit, market, and operational incidents, loss events, and losses.
A Risk Taxonomy is like an alphabet - the letters alone convey no meaning, but they are the foundational elements that allow humans to understand each other. We desperately need a new alphabet to describe Risk - incidents, events, losses, claims, exposures, forecasts, reserves - so that firms everywhere can aggregate loss information, analyze it with standard actuarial methods, compare past exposures to present conditions and opportunities, and forecast potential outcomes to illuminate options.
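As a toy illustration of what such an alphabet might look like on the wire, here is a sketch that uses Python's standard XML tools to tag a single loss event XBRL-style. The rsk namespace and every element name are invented for this example; defining the real terms is precisely what a standards process has to do.

```python
# Toy sketch of an XBRL-style tagged loss event. The "rsk" namespace and all
# element names are invented for illustration; the real taxonomy is what a
# standards process must define.
import xml.etree.ElementTree as ET

NS = "http://example.org/risk-taxonomy-draft"  # hypothetical namespace
ET.register_namespace("rsk", NS)

event = ET.Element(f"{{{NS}}}lossEvent")
ET.SubElement(event, f"{{{NS}}}riskClass").text = "operational"
ET.SubElement(event, f"{{{NS}}}eventType").text = "externalFraud"
ET.SubElement(event, f"{{{NS}}}grossLoss", unit="USD").text = "2500000"
ET.SubElement(event, f"{{{NS}}}discoveryDate").text = "2008-11-20"

print(ET.tostring(event, encoding="unicode"))
```

Because every reporter uses the same tagged terms, a consumer can aggregate grossLoss by riskClass across thousands of filings without human interpretation.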
A year ago, I wrote on this page about the need for new macro-economic tools to enable Central Banks to measure aggregate risk taking in the financial world. An XBRL Taxonomy of Risk is a fundamental building block to enable interoperability and standard practices in the measuring and reporting of risk.
Those standards in turn will enable Central Banks to manage vast databases of loss history and trend analysis that will inform policymakers and member banks to make better decisions that produce better returns. We will still need new information management software and governance models to make sure the right information gets to the right people at the right time, but none of that is possible without a standard alphabet and vocabulary to describe what's being recorded and read.
Recently, I announced an IBM Data Governance Council initiative to develop an XBRL Taxonomy for Risk. We are inviting all interested parties - banks, broker/dealers, hedge funds, consortia, think tanks, and regulators - to participate in this initiative. We will be working closely with XBRL International and XBRL.US to share ideas in an open and transparent process to bring forward a standards proposal quickly. If you are interested in participating, please drop me a line.
3. Let's bring back Glass-Steagall. Gee, what a great idea. No leverage ratios, because investment banks can't leverage with bank deposits at all. Banks, Brokerages, Hedge Funds, and Insurance Companies all need to have their activities segregated. It isn't enough to insist on new solvency, liquidity, and risk measures. We need to separate temptation from action. And when all three of these things are done - new solvency requirements to shore up assets on the balance sheet, risk taxonomies and loss history data to forecast future exposures, and Glass-Steagall V2 - we'll have risk tied up in a knot... until it's not.
IBM Data Governance Council Leads XBRL Initiative to Create New Reporting Standards for Risk Measurement
DataGovernor 2,637 Views
ARMONK, NY - 15 Dec 2008: In a move to provide businesses worldwide with consistent tools for measuring aggregate risk in the financial world and provide a real-time view of market exposure, the IBM (NYSE: IBM) Data Governance Council is seeking input from banks and financial institutions, corporations, vendors and regulators to create a standards-based approach to risk reporting.
Today, organizations have inconsistent methods and vague language for disclosing operational, market, and credit risk. These inconsistencies make regulatory oversight extremely difficult and complex. The first step to enabling new transparency of risk and exposures in the financial services industry is semantic clarity -- a precise method for consistently describing and reporting risk across all organizations. Such transparency could provide a new macro-economic tool and greater fiscal accountability for regulators, investors and Central Banks worldwide, making it easier to identify toxic assets on the books, mitigate fraud, help prevent wide-scale fiscal crises and rebuild confidence in financial systems.
The IBM Data Governance Council is exploring the application of Extensible Business Reporting Language (XBRL), a software language for describing business terms in financial reports, to risk reporting. XBRL could provide a non-proprietary way of reporting risk that could potentially be applied worldwide. It is already widely used for financial reporting throughout Europe, Australia and Japan. The widespread use of this standard ensures adequate skills and understanding among firms and regulators.
"Creating a risk taxonomy using XBRL will provide a vocabulary and a common language allowing everyone to understand what risk means, and that's the first step in making it easier to calculate and report," said Steve Adler, chairman of the IBM Data Governance Council. "When we have semantic clarity around the way organizations describe risk, incidents, events, losses, claims, exposures, forecasts and reserves, it gets easier to aggregate loss information, analyze it with standard actuarial methods, compare past exposures to present conditions and opportunities, and forecast potential outcomes."
According to the Council, an XBRL Taxonomy of Risk could serve as a fundamental building block to enable interoperability and standard practices in measuring risk worldwide. Such standards could potentially enable Central Banks to manage vast databases of loss history and trend analyses that could better inform policymakers and member banks, helping to minimize risk and produce better returns.
"XBRL is gaining widespread adoption among global capital markets, banking and securities regulators, and plays an important role in market reforms by contributing to transparency and process enhancements," said Anthony T. Fragnito, chief executive officer, XBRL International, Inc. "XBRL International is pleased to be a part of this important initiative by the IBM Data Governance Council."
The Council is immediately seeking proposals and discussion on this topic to help drive a year-long effort to create a proposed specification for XBRL for risk reporting. Initial discussions about this specification will take place February 26-27, 2009 in New York City at a meeting to be attended by the Enterprise Data Management Council, the Financial Services Technology Consortium, XBRL International, XBRL.US, and U.S. Securities and Exchange Commission staff.
"This is an opportunity for both improving the effectiveness of the risk management function and the quality of reports," said Dan Schutzer, executive director of Financial Services Technology Consortium. "XBRL for risk reporting also holds the potential for cost-reduction through the development of consistent, clear and comprehensive reporting standards."
The IBM Data Governance Council is a group of 50 global companies, including Abbott Labs, American Express, Bank of America, Bank of Tokyo-Mitsubishi UFJ, Ltd, Bank of Montreal, Bell Canada, Citibank, Deutsche Bank, Discover Financial, Kasikornbank, MasterCard, Nordea Bank, Wachovia, and the World Bank, among others, that have pioneered best practices around risk assessment and data governance to help the business world take a more disciplined approach to how companies handle data.
Data governance helps organizations govern appropriate use of and access to critical information such as customer information, financial details and unstructured content, measuring and reporting information quality and risk to enhance value and mitigate exposures. IBM's work in this area supports and furthers the company's Information on Demand strategy, which has delivered results through consistent earnings growth, hundreds of new customer wins, strategic acquisitions and industry-first software offerings.
For more information on the Data Governance Council, visit http://www-306.ibm.com/software/tivoli/governance/servicemanagement/data-governance.html
For more information on IBM, visit http://www.ibm.com/think
Holli Haswell
IBM Media Relations
512
email@example.com
In the last five days, a lot of people have asked many great questions that I thought I'd answer on this page to provide a better accounting of what this is all about and what we hope will result.
Q: What is XBRL?
A: XBRL (Extensible Business Reporting Language) is an XML language for describing business terms, and the relationship of terms, in a report. It enables semantic clarity of terminology by standardizing a data model - the field names and their relationships - for reporting purposes.
Q: Why do we need a Risk Taxonomy in XBRL?
A: Because Risk measurement, calculation, and reporting are mysterious, arcane, and underutilized business processes in banking and financial markets, and reporting standards can demystify, simplify, and commoditize risk calculation as a more ubiquitous part of business decision-making.
In the insurance world, risk measurement, calculation, and forecasting are THE BUSINESS. But insurance companies don't tell you what formulas they use to calculate your premium, how they determine their own reserves, or what protocols and methods they use to pay out claims. Actuaries study for years to learn these methods, and very few business professionals - and virtually no IT professionals - have any idea how risk is measured, calculated, and reported.
Q: But what do you mean by Risk Measurement? Don't we need Risk Management?
A: Sure. Risk Management is important. But only human beings can manage risk, and before we get there we need to measure past losses, compare them to current events, and forecast potential outcomes. Making a business decision without this analysis is risky. Making a business decision with this analysis is also risky, but when the inputs and decisions are recorded, we have the opportunity to learn from our mistakes and improve over time. We will never eliminate risk, but we can use scientific decision-making techniques to improve our odds.
Today, most people focus on Risk Management. They use qualitative risk assessments to imagine what kinds of vulnerabilities, loss events, and losses may be incurred from business activities. This is a valid method for forecasting and preventing potential losses. But the methods and results vary with the qualitative insight and skill of the practitioner, and they are dependent on disciplined application. Over time, it is very difficult to compare quantitative loss results to qualitative risk assessments.
We can leverage standards in risk measurement reporting to apply quantitative risk assessment to the practices of risk measurement and management so that inputs and outputs have a mathematical foundation. That foundation allows automation, and automation enables ubiquity of application. And that's the purpose of a standard - to enable widespread application and value - so that everyone can measure, calculate, and report risk, without an actuarial degree.
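As a taste of the mathematical foundation this enables: once losses are recorded against standard terms, even the textbook frequency-times-severity estimate of expected loss becomes a mechanical calculation anyone can run. The figures below are invented for illustration.

```python
# Textbook frequency x severity estimate of expected annual loss - the kind
# of mechanical calculation that standardized loss records make possible.
# All figures are invented for illustration.

loss_history = [120_000, 45_000, 310_000, 80_000, 95_000]  # one firm, 5 years
years_observed = 5

frequency = len(loss_history) / years_observed    # events per year
severity = sum(loss_history) / len(loss_history)  # mean loss per event
expected_annual_loss = frequency * severity

print(f"frequency: {frequency:.1f} events/year")              # 1.0
print(f"mean severity: ${severity:,.0f}")                     # $130,000
print(f"expected annual loss: ${expected_annual_loss:,.0f}")  # $130,000
```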
Q: Why do we need risk standards?
A: One of the things we've seen in the current Credit Crisis is the ambiguity and confusion about risk. Regardless of whether you are a trader paid to take risks or an IT professional paid to avoid risk, it is nearly impossible to understand the incremental impact of your decisions on your department, your division, your company, your industry, your market, your economy, or your nation. There is just too much data today, and our regulators haven't tooled up to take advantage of the information companies could produce to help regulators and markets operate more transparently.
We know now in dramatic hindsight that incremental risks have systemic impact. People can only understand that impact when they can aggregate the incremental losses in the past, compare them to current circumstances, and make forecasts about the future.
To aggregate and compare risk data, we need standards, and XBRL seems to us to be the most logical and effective tool to create those standards.
Q: How could the XBRL Risk Taxonomy be used?
A: These standards will enable more effective risk measurement and reporting within firms, new macro-economic tools for regulators and policy-makers, transparency for financial markets, and a more ubiquitous use of risk calculation in decision-making across innumerable disciplines.
Let me give you an example:
The insurance industry does risk calculation all the time. If you are a doctor, lawyer, accountant, or financial advisor, chances are you buy professional liability insurance. When you apply for the coverage, you tell your insurance company about yourself, your business activities, past losses, claims, and insurance coverage. The insurance company will compare your application to their own database of insureds, losses, and rates.
The insurance company will also compare your loss profile to claims data it purchases from the Insurance Services Office (ISO). ISO aggregates loss data from insurance companies across the US and provides anonymous records back to the same companies. Insurance companies need that 3rd-party verification of loss data for loss rating and trending. No matter how large an insurance company is, and no matter how many years it has been doing business and collecting loss history, everyone compares in-house data to aggregate industry data. It's a larger statistical sample size, and it helps everyone set aside the right amount of premium from each insured for reserves to pay out future losses.
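A rough sketch of why that larger sample matters, using the classic credibility-weighting idea from actuarial practice: a firm's own loss rate is blended with the industry rate, weighted by how much in-house data it has. The figures, and the 1,082-claim full-credibility standard (a common actuarial rule of thumb), are for illustration only.

```python
# Sketch of classical credibility weighting: blend a firm's own loss rate
# with the industry rate from pooled data. All figures are illustrative.
import math

own_claims = 60             # claims observed in-house
own_loss_rate = 0.031       # losses per unit of exposure, in-house data
industry_loss_rate = 0.024  # from the pooled 3rd-party sample

FULL_CREDIBILITY_CLAIMS = 1_082  # a common actuarial rule of thumb

# Credibility weight grows with the square root of the in-house sample size.
z = min(1.0, math.sqrt(own_claims / FULL_CREDIBILITY_CLAIMS))

blended_rate = z * own_loss_rate + (1 - z) * industry_loss_rate
print(f"credibility z = {z:.2f}, blended loss rate = {blended_rate:.4f}")
# credibility z = 0.24, blended loss rate = 0.0256
```

The less history a firm has, the more the pooled industry data anchors its rate; that is exactly the service a 3rd-party aggregator provides.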
We need the same kind of system in the financial markets. It is partially there today. Under the Basel II accord, banks are required to report the amount of gross income they set aside to self-insure against forecasted losses. But they only report that in the aggregate. No one is reporting the underlying data from which the risk reserves are calculated, and data reporting on that level could have huge benefits.
One benefit is that regulators could compare reported loss information across national and international economies. This would provide enormous new insight into macro-economic trends that could help reduce business cycle volatility.
Another benefit is that banks and financial firms could compare their own loss information to very large samples of industry losses. This would make their own forecasting models far more efficient and that would help everyone manage risks more effectively and reduce paid losses over time.
A final benefit is that markets and rating agencies would gain new insights into underlying exposures in financial instruments and that would enable far more accurate and timely forms of risk rating, making markets more transparent and efficient.
Q: Why is the Data Governance Council leading this standards initiative?
A: Because Risk measurement, calculation, and reporting within and between enterprises is not possible without semantic clarity around how we classify, describe, and document incidents, losses, events, formulas, and a host of other terminology. This is a very complex topic, and it is easy to be confused and confounded by the terminology. Before we can all talk about this topic intelligently, we need a common vocabulary. That vocabulary will enable efficient communication and transferable methods and skills.
And this is very much a Data Governance challenge. The Data Governance Council has been studying these issues for four years and - together with our partners in the FSTC, EDM Council, OCEG, and other organizations - we think we can make a difference with this standard.
Q: Why would organizations want to apply XBRL to risk?
A: We can see clearly from the subprime credit crisis that there are still some non-standard methods for appraising risk. We don’t have semantic interoperability to allow us to take an aggregate look at risk across multiple organizations. This makes it hard for companies and regulators to agree on what risk exists, and difficult to consistently report the risk companies are taking. XBRL can be a tool to help organizations use common standards for the way risk is described.
Q: What benefit would XBRL for risk reporting provide companies and regulators?
A: Translating risk reporting into a consistent software language will enable organizations to more easily perform advanced analysis, conduct meaningful research, and compare risk and loss history among multiple organizations. It could be used for internal or external reporting purposes. Regulators could potentially use it to create a global loss history database of anonymous credit, market and operational incidents, events, and losses from every institution, much like the one the insurance industry relies upon. XBRL could make risk reporting simpler and more powerful, and that should create broad market benefits.
Q: What are the primary obstacles to the adoption of XBRL for risk reporting?
A: The real challenge is not in creating a risk taxonomy using XBRL. The challenge is getting agreement upon it and ensuring there is willingness worldwide to use it. That is why the Data Governance Council is seeking input from organizations and regulators worldwide.
Q: Who is supporting this initiative?
A: In addition to more than 50 IBM Data Governance Council members, the Securities and Exchange Commission, the Enterprise Data Management Council, the Financial Services Technology Consortium, the Open Compliance and Ethics Group, XBRL International and XBRL.US are all contributing to the process.
Q: How far along are you in the process today?
A: We have a starter taxonomy that we will begin socializing at an XBRL for Risk Forum on February 26-27 at the Levin Institute in New York. The Data Governance Council’s role is that of a facilitator, seeking proposals and comments to begin defining a taxonomy for risk that can be agreed upon by many organizations worldwide. This work will continue through the first half of next year, with a final recommendation expected by the end of the year.