The IMF put out the Global Financial Stability Report last week and it contains a very accurate and sobering description of the systemic failures involved in the Subprime Financial Crisis. It has an institutional focus, and makes some solid observations and recommendations.
The entire report is worth a read, but the Executive Summary contains most of the key points if you just want the meat of the matter.
I will summarize the findings and recommendations that have Data Governance implications:
"The events of the past six months have demonstrated the fragility of the global financial system and raised fundamental questions about the effectiveness of the response by private and public sector institutions. While events are still unfolding, the April 2008 Global Financial Stability Report (GFSR) assesses the vulnerabilities that the system is facing and offers tentative conclusions and policy lessons.
Some key themes that emerge from this analysis include:
• There was a collective failure to appreciate the extent of leverage taken on by a wide range of institutions—banks, monoline insurers, government-sponsored entities, hedge funds—and the associated risks of a disorderly unwinding.
• Private sector risk management, disclosure, financial sector supervision, and regulation all lagged behind the rapid innovation and shifts in business models, leaving scope for excessive risk-taking, weak underwriting, maturity mismatches, and asset price inflation."
What follows are a number of short- and medium-term recommendations relevant to the current episode. Several other groups and fora—such as the Financial Stability Forum, the Joint Forum, the Basel Committee on Banking Supervision—are concurrently developing their own detailed standards and guidance, much of which is likely to address practical issues at a deeper level than the recommendations proposed below.
In the short term...
The immediate challenge is to reduce the duration and severity of the crisis. Actions that focus on reducing uncertainty and strengthening confidence in mature market financial systems should be the first priority. Some steps can be accomplished by the private sector without the need for formal regulation. Others, where the public-good nature of the problem precludes a purely private solution, will require official sector involvement.
Areas in which the private sector could usefully contribute are:
• Disclosure. Providing timely and consistent reporting of exposures and valuation methods to the public, particularly for structured credit products and other illiquid assets, will help alleviate uncertainties about regulated financial institutions’ positions.
• Overall risk management. Institutions could usefully disclose broad strategies that aim to correct the risk management failings that may have contributed to losses and liquidity difficulties. Governance structures and the integration of the management of different types of risk across the institution need to be improved. Counterparty risk management has also resurfaced as an issue to address. A re-examination of the progress made over the last decade is needed, and the gaps that are still present (perhaps inadequate information or risk management structures) will need to be closed.
• Consistency of treatment. Along with auditors, supervisors can encourage transparency and ensure the consistency of approach for difficult-to-value securities so that accounting and valuation discrepancies across global financial institutions are minimized. Supervisors should be able to evaluate the robustness of the models used by regulated entities to value securities. Some latitude in the strict application of fair value accounting during stressful events may need to be more formally recognized.
• More intense supervision. Supervisors will need to better assess capital adequacy related to risks that may not be covered in Pillar 1 of the Basel II framework. More attention could be paid to ensuring that banks have an appropriate risk management system (including for market and liquidity risks) and a strong internal governance structure. When supervisors are not satisfied that risk is being appropriately managed or that adequate contingency plans are in place, they should be able to insist on greater capital and liquidity buffers.
In the medium term...
More fundamental changes are needed over the medium term. Policymakers should avoid a “rush to regulate,” especially in ways that unduly stifle innovation or that could exacerbate the effects of the current credit squeeze. Moreover, the Basel II capital accord, if implemented rigorously, already provides scope for improvements in the banking area. Nonetheless, there are areas that need further scrutiny, especially as regards structured products and treatment of off-balance-sheet entities, and thus further adjustments to frameworks are needed.
The private sector could usefully move in the following directions:
• Standardization of some components of structured finance products. This could help increase market participants’ understanding of risks, facilitate the development of a secondary market with more liquidity, and help the comparability of valuation. Standardization could also facilitate the development of a clearinghouse that would mutualize counterparty risks associated with these types of over-the-counter products.
• Transparency at origination and subsequently. Investors will be better able to assess the risk of securitized products if they receive more timely, comprehensible, and adequate information about the underlying assets and the sensitivity of valuation to various assumptions.
• Reform of rating systems. A differentiated rating scale for structured credit products was recommended in the April 2006 GFSR. Also, additional information on the vulnerability of structured credit products to downgrades would need to accompany the new scale for it to be meaningful. This step may require a reassessment of the regulatory and supervisory treatment of rated securities.
• Transparency and disclosure. Originators should disclose to their investors relevant aggregate information on key risks in off-balance-sheet entities on a timely and regular basis. These should include the reliance by institutions on credit risk mitigation instruments such as insurance, and the degree to which the risks reside with the sponsor, particularly in cases of distress. More generally, convergence of disclosure practices (e.g., timing and content) internationally should be considered by standard setters and regulators.
• Tighten oversight of mortgage originators. In the United States, broadening 2006 and 2007 bank guidance notes on good lending practices to cover nonbank mortgage originators should be considered. The efficiency of coordination across banking regulators would also be enhanced if the fragmentation across the various regulatory bodies were addressed. Consideration could be given to devising mechanisms that would leave originators with a financial stake in the loans they originate."
New standards and banking practices will clearly be needed moving forward. But we already have most of the regulations we need to mitigate most risks identified in the report. Indeed, one of the great ironies of the crisis is how little Banks used their own fraud and risk management systems to catch underwriting errors and omissions in Loan Origination applications, House Assessments, risk capitalization, etc.
I suspect that the IMF's warning on regulation will not be heeded in Washington, though I do hope regulators will listen to the seasoned advice of some Data Governance veterans, because this is a crisis with so many Data Governance challenges.
This morning, EU Regulators announced that they propose to create a Risk Board to monitor financial market performance and systemic risk indicators among the 27 member nations in the European Union. I've advocated a Council approach to risk-based decision-making since the beginning of this year, and I think the EU proposal is a good idea in concept. Unfortunately, in Europe it seems decision-making takes a large number of people, because the European proposal would have 63 people participating on the Risk Board. A deliberative body with 63 people is not a "Board" - it is a legislature. To complicate matters, "only" 32 members of this board would have voting rights. Unfortunately, the only power they can vote on is a warning to member states that some part of their market performance contains systemic risk. How they plan to determine that threat and get everyone to agree on what it means in any reasonable amount of time is not clear. My guess is that this is a proposal to set up an intra-governmental think tank that will study issues, write economic reports that no one reads, and only threaten to issue warnings, because a vote on a warning will never happen.
Note to Obama Administration: If you want to create a Systemic Risk Regulatory Structure that is guaranteed to fail due to political indecision and lack of authority, copy the EU model.
Boston is America's most European City. Sorry San Francisco. Hills and Fog are not a substitute for Culture and History. The scale of Boston, with its rivers, sea, beaches just beyond the harbor, and easy access to fields, forests, and farms on the periphery, makes it feel like Hamburg or Stockholm. I've been to Boston a dozen times, but most often for just a day. I know Logan very well. Last week, I discovered Boston for more than a day.
A great city! Who knew nice people lived in a civilized city North of New York?! <mock sarcasm>
On Tuesday, we drove up from NY and discovered that Boston is a six-hour drive up during rush hour and a four-hour drive back before it. On Wednesday, I was a speaker at the MIT Information Quality Industry Symposium in an afternoon session lasting 40 minutes. I arrived at lunch and found the event just getting started in a square building on Amherst Street on the Cambridge campus. About a hundred people filled a large tiered classroom. Many familiar faces and some old friends. We traded "what's up" stories in the lobby on low black sofas while chowing on salty sandwiches and chips. The keynote speaker was from the US Army, and some army chaps were in the lobby talking about Army things. Data Governance Aficionados were comparing the US Army to the British Army, who had just won a Data Governance Best Practice Award at the Wilshire Conference in San Diego last month.
Funny coincidence...all ideas are derivative...
My presentation was about www.infogovcommunity.com and The Six Easy Steps to Smart Governance, and it was very well received. I like to present. It doesn't matter what mood I'm in before I stand up; I always step up about three steps higher when I start talking. It's the audience that feeds me. Not the adoration or being the center of attention. I need the feedback. Every time I present I learn something new from the audience, and it's that interaction that makes presentations so much fun for me.
On Wednesday, my audience gave fantastic feedback and it took me all weekend to process what I learned.
Information is a Tool.
Wow. I can't tell you how many "Information is an Asset" presentations I've sat through where some IT Architect is trying to persuade the audience and herself that Information is an Asset with a value that can automagically be calculated. Someone out there is working on fantastic formulas that will produce THE ULTIMATE INFORMATION ROI CALCULATION and win a Nobel Prize.
Ain't gonna happen. Here's why:
1. Value is dependent on price. Information has a value when there is a pricing mechanism and a market in which buyers and sellers can interact. Movies, Music, News, and Software are all examples of INFORMATION that is sold with prices in markets. Economists have already developed pricing formulas for consumer behavior in markets; Cobb-Douglas utility theory captures these interactions nicely (a worked form of the formula appears just after this list). In a market, both buyer and seller benefit, so outcomes are equal.
2. Corporations have no internal markets. IT professionals are mostly eager to assign value to Information because Applications and Information are the primary work products of their lives and they want their life work to have meaning beyond their jobs and paycheck. But without internal markets for buyers and sellers to establish pricing mechanisms, Corporations can't assign anything but abstract values to information.
3. IT uses Unit Cost of Labor (Thank You Karl Marx) to assign the value of IT work products. The Unit Cost of Labor identifies the human contribution to value creation.
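For the curious, here is the standard textbook form of the Cobb-Douglas utility function mentioned in point 1. This is my illustration of the formula, not anything from the IMF report or an IT valuation method:

```latex
% Cobb-Douglas utility over two goods x and y, with 0 < \alpha < 1
U(x, y) = x^{\alpha} \, y^{1-\alpha}

% A consumer with income m facing prices p_x and p_y who maximizes U
% subject to p_x x + p_y y = m ends up with the familiar demands
x^* = \frac{\alpha m}{p_x}, \qquad y^* = \frac{(1-\alpha)\, m}{p_y}
```

The point is that formulas like this only produce a value because prices and a market exist. Strip out the market, as you do inside a corporation, and the math has nothing to work with.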
Information is an Asynchronous Asset and it doesn't have to be right to be valuable.
IT professionals are so hopelessly enamored with "The Single Source of Truth." It is a belief system, but that doesn't mean that verified information is always valuable.
Fact is, quite often lies are just as valuable. Two examples:
1. In the old days, the Department of Labor compiled monthly unemployment data based on the percentage of the workforce that wasn't working. That made sense. Unemployment means "people who want to work but can't find work." But in the 1990s the standard was changed to include only the people filing for unemployment benefits each month. This rate excludes members of the workforce who are working less than 20 hours a week, people who have stopped filing their weekly claim for unemployment benefits, the elderly, and those who have dropped out of the workforce entirely. So naturally, the new number is much lower than the old number. How low? The current rate of unemployed is 9.5%. However, if you include those working less than part-time, those who recently stopped filing for unemployment benefits, and those who dropped out of the workforce entirely, the real rate of unemployment is 22% (a rough sketch of this arithmetic follows these two examples).
What's the difference? 9.5% is a recession. 22% is a depression. Information is a tool used by policymakers to achieve a goal and the outcome is not equal.
2. In May 2003, eBay restated its earnings from 2000 and 2001 but didn't tell anyone. It appears that someone in the accounting department "discovered" a $127 million loss in both years and retroactively restated earnings. They hid the restatement in their SEC filings. From a "Single Source of Truth" perspective, one could argue that the restatement demonstrates the value of trusted information. But I don't think that's the truth. I think the reporting of lower losses was a GOAL of eBay, and the chart shows that the under-reporting had the effect of protecting the stock from significant declines during a recession. The truthful reporting of the losses during the bull market of 2003 had no negative impact on the stock. So it looks like eBay hid the truth when it benefited them and revealed the truth when it couldn't hurt them.
And who could blame them... after all, using Information as a Tool to achieve policy goals is the whole point of Governance. And this is where I say to my IT friends that you won't be successful with Data Governance if you don't give up the hopelessly naive belief that a single source of the truth is the goal of Data Governance.
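To make the arithmetic in the unemployment example concrete, here is a minimal sketch. The labor-force counts are invented placeholders chosen only so the two rates land near the 9.5% and 22% figures above; they are not official statistics:

```python
# Hypothetical labor-force counts (illustrative only, not official figures).
labor_force = 150_000_000          # people who want to work
filing_for_benefits = 14_250_000   # currently filing unemployment claims
underemployed = 9_000_000          # working < 20 hours/week but wanting full-time work
stopped_filing = 6_000_000         # exhausted or abandoned their claims
dropped_out = 3_750_000            # gave up looking entirely

# Headline rate: only people filing claims count as "unemployed."
headline_rate = filing_for_benefits / labor_force

# Broad rate: add back everyone the headline definition excludes.
broad_unemployed = filing_for_benefits + underemployed + stopped_filing + dropped_out
broad_rate = broad_unemployed / labor_force

print(f"Headline rate: {headline_rate:.1%}")  # 9.5%
print(f"Broad rate:    {broad_rate:.1%}")     # 22.0%
```

Same population, same month, two very different "truths" depending on who the definition counts.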
Data Governance is a Business Process
The Goal of Data Governance is to achieve business goals - cutting costs, improving revenue, reducing risk. As we've seen above, the information doesn't always have to be "right" to achieve these goals. That's why Data Governance is a business process and not an IT process.
Try to make Data Governance into an IT process like some sort of application development lifecycle and you will fail. Not because the process is wrong. Because the assumptions are wrong.
Human Nature is at the Heart of it
This week, my wife and I visited the Bank. It's amazing how defensive retail bankers are these days when talking with their customers. And they should be! Money is free and these guys are charging nearly 5% for mortgages for the best-rated buyers. But beyond the mortgage discussion, our friendly banker brought a good idea to us - Debit Cards for our teenage sons. It teaches them responsibility with money, he said, how to budget with what they have. And of course it gives the bank two new debit cards that earn small fees with every purchase, not to mention ATM fees at other banks... But of course, Debit Cards are a fact of modern life, and as much as we'd like to keep our kids kids and not indulge them in the consumer culture of America, we need to be modern parents too.
So we brought the idea home during the Saturday BBQ dinner in the backyard. "We went to the bank today..." the conversation began. "And we are thinking about getting you both Debit Cards..." At the banking bit, my kids started paying more attention to their burgers than to us. Banks are boring. But as soon as the Debit Card idea surfaced, WOW! My kids know what Debit Cards are - it's a BENEFIT. As soon as the conversation turned to a BENEFIT for them, they were alert, animated, inquisitive. They wanted to know how it would work, when they would get money, how much, how often, what happens when the money runs out, where can they spend it, how do they get it?
How much was the big topic. Kids, all kids, are smart. They began negotiating from the get-go. My wife and I hadn't talked about how much, and they knew it. They wanted to hear what we were thinking. How much? Ben, the oldest, wanted to set the floor for negotiation. "Just what are you thinking?"
Net: When benefits are at stake in any discussion, negotiations are competitive and you have to arbitrate between self-interest and the common good. Because you can't afford to give the other party everything it WANTS, and WANTS are infinite.
That's where Governance comes in. You compare the situational needs of each party to sustainable goals of the program and you make a decision. Based on the goals. In a business process. With Six Steps.
Information is a Tool. When you use it, it's an asset for YOU. Not always for the other guy.
Thank you MIT.
Letting the bottom fall out of the economy is a worse idea, but quickly passing a colossal $700B+ economic bailout package as currently drafted is also a big mistake.
A few quick reasons why:
1. No one knows exactly how large the bailout package will ultimately be. $700B? Don't believe anyone who tells you they know what it will cost. And this administration has historic problems with math. They originally calculated the cost of the Iraq War at $50B, and we know now it's closer to $1T. Whatever they predict now, make sure to multiply it by 2 or 3 to be safe.
2. This is the administration that relaxed regulations and got us into this mess, and now they want emergency powers to assume $700B - $2T of toxic debt and put it on the US Government balance sheet. The last time they wanted emergency powers, we got The Patriot Act.
3. The current proposal from Hank Paulson concentrates all power over this extraordinary bailout into the sole hands of the Secretary of the Treasury, with minimal oversight from Congress. It doesn't require even a report to Congress on the status of the bad debt and program function until three months after the start of the program - read, post-election.
Now I don't think Congress knows much about how the US economy functions, but they have good staff and the GAO reports to Congress. Giving this much power to a power-hungry and secretive Administration just seems to me to be a bad idea.
I'd recommend a few Data Governance additions to the current legislative proposal:
1. Audit & Reporting - The assumption of however many trillion dollars of toxic mortgage debt should be done with full market transparency. The administration should be required by law to publish each mortgage transaction to a public trust website where everyone can see all the details of each transaction. The GAO should be required to audit the process on a quarterly basis, and Treasury should report on progress to Congress twice a year.
2. Governance - A bi-partisan commission of Senate and House members should provide policy oversight on the program, with the Secretary of the Treasury acting as Chairman of the Board.
3. Stewardship - A special office within Treasury should be set up to manage the operation of the program, accepting policy leadership from the Governance Committee.
4. Data Quality and Provenance - These toxic mortgages are extremely complex to understand en masse, but individually each has a story to tell about underwriting incompetence, business negligence, homeowner fraud or ignorance. Those stories should be told, because the unraveling of what went wrong could yield the largest database of credit losses in the world. Each case should be researched and the debt narratives preserved.
5. Risk Calculation - The aggregate losses here should be profiled in a database and preserved as a historic repository of credit risk and loss - available to every lending institution in America to benchmark and calculate their own risks (a rough sketch of such a loss record follows this list).
6. Value Creation - Each MBS and CDO purchased at below-market rates should be monitored to determine how much ROI it returns when market conditions return to normal. The American people have a right to learn that information, and Congress has a right to determine how those proceeds should be used - hopefully to pay down the US National Debt, which will soar well above $11T thanks to this bailout!
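As a rough illustration of what items 4 and 5 could look like in practice, here is a minimal sketch of a single loss record for such a repository. Every field name here is my own invention; nothing comes from the draft legislation or any Treasury system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class MortgageLossRecord:
    """One entry in a hypothetical public repository of credit losses.

    Field names are illustrative only, not drawn from any actual
    bailout legislation or Treasury system.
    """
    loan_id: str              # anonymized identifier for the underlying loan
    originator: str           # institution that underwrote the mortgage
    origination_date: date
    original_balance: float
    purchase_price: float     # what Treasury paid for the position
    estimated_loss: float     # current mark against the original balance
    failure_narrative: str    # the "story": underwriting, appraisal, fraud, distress

    def loss_severity(self) -> float:
        """Loss as a fraction of the original balance, for benchmarking."""
        return self.estimated_loss / self.original_balance
```

A public repository of records like this is what would let every lender in America benchmark its own book against the losses the taxpayer just absorbed.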
These Six Things are not in the current legislation Hank Paulson sent to Congress today, but I do hope someone from Chris Dodd's or Barney Frank's committees is reading my blog and takes note.
If we don't take this opportunity to make this bailout process transparent, we will have another mess to clean up later. Human beings are like that - they don't behave correctly when there is no accountability. That's exactly the lesson learned from this crisis - so why would Treasury repeat that mistake in this bad bailout legislation?!
Answer: because people never think it will happen to them.
Agriculture is not the first word that comes to mind when contemplating systemic risk regulation, but the Senate Agriculture, Nutrition, and Forestry Committee was the gladiatorial arena for systemic risk regulation of derivatives last week. Agricultural commodities are traded on the Chicago Mercantile Exchange, the Commodity Futures Trading Commission (CFTC) regulates commodities trading, and the Senate Agriculture Committee oversees the CFTC. A week ago, the Senate completed nomination hearings for Gary Gensler, the new CFTC Chairman. Gary's nomination was approved unanimously by the committee, and his participation in last week's hearings on "Regulatory Reform and the Derivatives Market" came on his 8th day on the job. But judging by his testimony, it is easy to see why both Democrats and Republicans love him. He's smooth, diplomatic, and combines left and right positions in the same sentence. Other expert testimony came from:
Ms. Lynn Stout
UCLA School of Law
Los Angeles, CA
Mr. Mark Lenczowski
J.P. Morgan Chase & Co.
Dr. Richard Bookstaber
New York, NY
Mr. David Dines
Cargill Risk Management
Mr. Michael Masters
Masters Capital Management, LLC
St. Croix, USVI
Mr. Daniel A. Driscoll
Executive Vice President and Chief Operating Officer
National Futures Association
Lynn Stout and Michael Masters presented populist, anti-establishment arguments for regulatory reform. Mr. Masters has impressed me in the past with his presentations on derivative markets, and in his testimony he pushed hard for notional derivative clearing and exchange trading. Mark Lenczowski and David Dines toed the bank party line on the need for choice in derivative markets, the complexity of the OTC market, and the extra costs standardization of derivatives would add to transactions. Rick Bookstaber made some reasoned and logical remarks about how easy it would be to standardize derivative trading and why it would be desirable to put it onto an exchange. He said that the opacity of derivatives makes them the weapon of choice for gaming the regulatory system, and that banks use them to achieve investment goals that hide leverage, skirt taxes, and obfuscate investor advantage.
The key battle positions now are:
Conservative: Leave things as they are with greater capital and margin requirements, some transactional reporting. The banks contend that exchange trading is an option in today's market but that customers should decide whether they want to buy derivatives on exchanges or via OTC. Banks already face Capital and margin requirements on derivative trading, so new limits would largely impact non-bank derivative market players. An enhanced status quo seems unlikely, and I think the banks know this and thus are taking this position as a negotiating tactic to limit the Moderate choice.
Moderate: Force derivative trading into clearing houses, require capital and margin requirements, set new position limits on holdings, and use TRACE to track market transactions. This is the essence of the Geithner proposal, and Mr. Gensler espoused this position eloquently. I also believe that the banks are comfortable with this solution, because they created the clearing houses and have enormous influence there. The new capital and margin requirements would benefit the 14 primary broker-dealers, and if the banks are going to give up some opacity through clearing houses they want at least to ensure a cartel status for derivative dealing. Because Gensler and Geithner are already on board with this, and bank lobbyists are behind their support, I see the Moderate option as the most likely.
Liberal: Force derivative trading onto an open exchange in which all transactional volume, price discovery, bid/ask, etc., are fully transparent. This option creates the greatest market efficiencies and allows any dealer of any size to participate in a very liquid and open derivative market. In the beginning, there would be some semantic challenges packaging bespoke derivatives into mass-customized and standardized products. But the data models and technology exist to perform these data gymnastics, and the industry would, over time, become adept at providing customized derivative products in standard offerings. On an exchange, it is harder for banks to game the system, and the benefits of derivative trading are more widely shared. Thus, banks want to avoid this. Unless Obama comes out in favor of exchanges, I see the Liberal option falling to the bank cartel.
The challenge with any of these scenarios is enforcing positional limits. CFTC, and the Senators, want the regulatory power to impose position limits. This would entail positional reporting and some kind of kick-back function at the clearing house or exchange to limit registered broker/dealer transactions. But the technical solution has some complexities not obvious to the untrained senatorial eye...
A derivative position is not the same as an equity position. When I own two shares of IBM Stock, they are two units of the same instance. When I own two XYZ currency swaps with the same maturity date, they are two instances of the same unit, and they may also have other characteristics that make them different. It is not possible to add up all the derivative units at the end of the day and compare them in the same way as you might with equities. You have to record each transaction and tally up the common elements, and then you need to analyze all the composite positions to determine what they mean.
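A small sketch of the difference, with hypothetical trades and attribute names chosen purely for illustration:

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date

# Equity positions net trivially: 2 shares of IBM plus 3 shares of IBM is 5 shares of IBM.
equity_book = {"IBM": 2 + 3}

# Derivative positions do not. Each contract is its own instance, and two "identical"
# swaps can still differ in counterparty, notional, maturity, or other terms.
@dataclass(frozen=True)
class CurrencySwap:
    underlying: str       # hypothetical "XYZ" currency pair
    counterparty: str
    maturity: date
    notional: float
    pay_fixed: bool

trades = [
    CurrencySwap("XYZ", "Bank A", date(2010, 6, 30), 10_000_000, True),
    CurrencySwap("XYZ", "Bank B", date(2010, 6, 30), 10_000_000, True),
]

# To report a "position" you first have to decide which attributes define one,
# then tally the common elements within each bucket. Change the bucketing and
# the reported position changes with it.
positions = defaultdict(float)
for t in trades:
    key = (t.underlying, t.maturity, t.pay_fixed)
    positions[key] += t.notional

print(dict(positions))
# {('XYZ', datetime.date(2010, 6, 30), True): 20000000.0}
# The bucketed total hides the fact that the two trades carry different counterparty risk.
```

That is why positional limits on derivatives require per-transaction recording and analysis of the composite positions, not simple end-of-day share counts.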
One important thing that all the panelists missed is the fact that it is not possible to standardize derivative products, per se. It is the components and their semantic definitions that can and must be standardized. That is, a Chevy and a Ford are both cars, but they are different types of cars. Yet both have standardized components (often made by the same parts suppliers) that make them subject to classification and their functions interchangeable. We need the same kind of classification of derivative components, so that every buyer and seller can set the features they want for the financial goals they have.
By standardizing derivative components and plugging them into a configuration engine, it will be possible for an exchange to offer customizable derivative products to any buyer and seller in the same way banks do today via the OTC market. The conditions may vary, but the components will be interchangeable. This is the dirty little secret banks don't want anyone to know. Because when exchanges can offer mass-customized derivative products, the huge transactional fees that banks derive from the opacity of risk will evaporate...
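Here is a very small sketch of what "standardized components plugged into a configuration engine" could mean in data terms. The component names, fields, and checks are my own illustration, not an industry standard:

```python
from dataclasses import dataclass
from datetime import date

# Standardized, interchangeable components -- the "parts" of a derivative product.
@dataclass
class FixedLeg:
    notional: float
    rate: float           # annual fixed rate
    currency: str

@dataclass
class FloatingLeg:
    notional: float
    index: str            # name of the reference rate
    spread: float
    currency: str

@dataclass
class SwapProduct:
    """A mass-customized product assembled from standard components."""
    trade_date: date
    maturity: date
    pay_leg: FixedLeg
    receive_leg: FloatingLeg

def configure_swap(trade_date: date, maturity: date,
                   pay_leg: FixedLeg, receive_leg: FloatingLeg) -> SwapProduct:
    """A toy 'configuration engine': check that the chosen components fit together
    before the assembled product is listed for trading."""
    assert maturity > trade_date, "maturity must follow the trade date"
    assert pay_leg.currency == receive_leg.currency, "legs must share a currency"
    assert pay_leg.notional == receive_leg.notional, "legs must share a notional"
    return SwapProduct(trade_date, maturity, pay_leg, receive_leg)

# Buyers and sellers pick the features they want; the components stay interchangeable.
swap = configure_swap(date(2009, 7, 1), date(2012, 7, 1),
                      FixedLeg(5_000_000, 0.035, "USD"),
                      FloatingLeg(5_000_000, "3M-REF", 0.002, "USD"))
```

The bespoke feel of an OTC trade survives in the configuration; what disappears is the opacity.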
A few months ago, the big talk in DC, NY, and among academic circles was that the CFTC would get merged into the SEC, and that the Fed would assume responsibility as the systemic risk regulator. I think that talk is now dead.
Last week, Mr. Harkin, Chairman of the Committee, and Mr. Chambliss, the ranking Republican, repeatedly pressed Mr. Gensler on his resource requirements for regulating derivatives at the CFTC. Mr. Gensler mentioned that the CFTC is woefully underfunded, with only 570 people on staff, and that the commission would have to at least double in size to manage the complex derivative market. Harkin and Chambliss made it quite clear that Mr. Gensler would be getting new authorities and new funding, signaling to Treasury that the CFTC will remain independent and overseen by Harkin and Chambliss in Senate Agriculture, thank you very much.
Power being what it is, the deck chairs in Washington will not be changed. Systemic Risk will be regulated in parts and pieces. I predict we have Systemic Risk Governance Councils in our future and that all the major regulators will get new authorities, new funding, and oversight from the same crusty old men and women in Congress who failed to oversee and fund them correctly prior to the crisis...
If you have a Data Governance program today, you already know it's easier to start one than to run one. Real governing is not like a Hollywood movie. It's hard to know what's wrong, why it's wrong, how to fix it, and how to get people to care or follow the fixes. And you have to do this every day, while all the gurus tell you to get metrics and KPIs, build a framework, and follow their process. But those gurus don't live your life, they don't work in your space, and they don't have to make tons of messy compromises to get things done.
But you do, and you know that Governance is tough stuff.
In the Data Governance Council, we know that too and we want to help. We helped build the market with the landmark work we did on the Maturity Model. That gave you a way of knowing that what you already know isn't enough. You could use it to help others realize it wasn't enough too. And that gave you a place to start your program.
Well, now that you are in the thick of it, we think there's a way to communicate how your organization really works - to simulate your environment so you can help folks learn what's going on, how stuff gets done, and what would happen if you made some changes. We know you do that anyway, all the time. But we want to help you do it in a safe test environment before you put your ideas into production.
We call this Predictive Governance - the SCIENCE of describing the world as it is to run simulations on how we'd like it to be. Normally, most folks do it the other way around... They simulate the way they think the world works so they can describe how they want it to be...
Now I could tell you all about how this new way of working is going to look, how it's going to help you, and what it's going to do. But it's more powerful if you see it for yourself. What I'm sharing with you today is an early preview of the Predictive Governance Simulation we are building. It's not pretty or polished, but it works and you can play with it now.
Have a look and let us know what you think.
If you'd like to join the IBM Data Governance Council and help us do more with this, drop me a line.
In 2004, when I hosted the first Data Governance Forum at Mohonk Mountain House, I had three teams of IBMers developing the narrative discussions for three tracks on a common use case. The tracks were called "Infrastructure," "Policy," and "Content." The use case was "Data Supply Chain." The Forum had two days of meetings stretched across three, starting in the afternoon of the first day and going until lunch on the last. On the only full day, we hosted the three breakout meetings, and each team worked to integrate their track discussions around the use case. The use case came from some business process definitions a software group had developed for business component models, something to do with insurance claims processing. As it turns out, we had only one or two insurance companies at the event, and we spent more time focusing on the track headings and business process model than on the idea of a Data Supply Chain. A conference is always the product of the people and the ideas in a room, regardless of what one puts on the agenda. And at this first event, when most of us had only the vaguest understanding of what "Data Governance" was or could be, business processes were familiar and Data Supply Chains were distant.
Three weeks ago, I hosted another Data Governance Forum at Mohonk Mountain House. It was again two days of content stretched across three, and again a very diverse group of people came together to produce discussions that were engaging, powerful, and divergent from what was planned on the agenda. In three breakouts on "Data," "Risks," and "Governance," the panelists and audience exchanged ideas, and I ran back and forth between the breakout rooms to listen, learn, and occasionally drive the conversations. What I heard among talks about Data as an Asset, Risk Taxonomies, Governance models, and Security & Privacy was the loud echo of Data Supply Chains reverberating off the walls. It was like an archetype of the first meeting, a temporary suspension of historical time, as if in all these years of Data Governance we had lost the original truth, like a spring that disappeared under the ground, rediscovered at the source.
Every company does Data Governance today, for ill or good, with intent or without it. Every company also has at least one, but often many more, supply chains. These are real supply chains that may stretch across only one or two towns or across six continents. Supply chains link producers, distributors, and consumers. They enable outsourcing and resourcing. And they have been a fixture of modern business since business became modern in the mid-1970s. And with disciplines like Six Sigma, large multi-national supply chains enable massive economies of scale with quality control that previously was available only to the largest organizations with fixed multi-year labor contracts.
Today every organization also has large distributed Data Supply Chains. Some parts may be automated, others batch, and still others quite labor intensive. The variety of form and function is often the ugly mess the CIO would not like the world to see. And with rare exception, they are not "governed" with anywhere near the same quality control and rigor as real supply chains. When an oil company puts down a new oil terminal, well-defined engineering processes are used to map out every step of production from wellhead to refinery. Where Data Supply Chains are intended to capture the same kinds of flows of information, the methods used are mostly ad hoc, one-off, dependent on the project leader, never to be repeated again. And the result today is that companies have tens, hundreds, and even thousands of ad hoc supply chains designed individually, some existing in their original state for decades. The disconnects create massive inefficiencies, quality control problems, and functional friction.
What every company should be doing is inventorying their existing Data Supply Chains and beginning to re-engineer them. There should be one Data Supply Chain engineering standard. And each new real-world supply chain should include a well-defined process to create a logical and efficient Data Supply Chain that monitors itself. This is not a small undertaking. But we can't create a Smarter Planet full of sensors and instruments to monitor the changes in our real world if we do not also monitor and instrument, standardize and re-purpose, the changes in our own enterprise.
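To make "a Data Supply Chain that monitors itself" slightly more concrete, here is a minimal sketch of instrumenting one hop in a chain with its own quality check. The step names, record shapes, and checks are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class SupplyChainStep:
    """One hop in a data supply chain, instrumented with its own quality check."""
    name: str
    transform: Callable[[List[dict]], List[dict]]
    quality_check: Callable[[List[dict]], bool]
    metrics: Dict[str, object] = field(default_factory=dict)

    def run(self, records: List[dict]) -> List[dict]:
        output = self.transform(records)
        # Record simple in/out counts and a pass/fail flag for later audit.
        self.metrics["records_in"] = len(records)
        self.metrics["records_out"] = len(output)
        self.metrics["passed_check"] = self.quality_check(output)
        return output

# Example: one step that drops records missing a customer id and reports on itself.
clean = SupplyChainStep(
    name="drop-missing-ids",
    transform=lambda recs: [r for r in recs if r.get("customer_id") is not None],
    quality_check=lambda recs: all("customer_id" in r for r in recs),
)
records = [{"customer_id": 1}, {"customer_id": None}, {"customer_id": 2}]
records = clean.run(records)
print(clean.metrics)   # {'records_in': 3, 'records_out': 2, 'passed_check': True}
```

String enough instrumented steps together and the chain can report on itself end to end, which is the kind of quality-control discipline the oil-terminal analogy points at.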
Every time I speak to an IT audience, I ask "What is Data Governance?" Of course the audience has come to hear me tell them the answer if they do not already know. But I'm more interested in what my audience thinks. Invariably, the answer has words like "Policy Enforcement," "Control," and "Compliance" in it. And to me what this reflects is a desire among IT professionals to expunge chaos and confusion from their world and create order, stability, and simplicity. Perhaps this is very human, but I think our desire to transform complexity into simplicity focuses far too much energy and attention on the world "To Be," or perhaps even on the world "Never-To-Be."
I think we need to spend more time focusing on the world "As Is," the one with dirty, grimy, confusing, and complex Data Supply Chains that are not yet instrumented, monitored, or in any way Smart. It is this world that needs the bright white light of assessment, discussion, policy, implementation, audit, and dynamic steering. This dark and dishonorable world "As Is" is the past most of our Data Governance programs struggle to change in the present with business plan funding for the future. It needs new methods that monitor the information flowing through its electronic veins, real-time auditing of the tools that are used to change it, and brand new business intelligence solutions that analyze past performance, compare them to current conditions, and predict blockages and failures.
In 2009, at the Mohonk Mountain House, back in the place where it all began, surrounded by 52 Data Governance Thought Leaders, I saw again the source that we mistook all these years - Data Governance is a quality control discipline for the Data Supply Chain.
Fix the world that is.
Since the 18th Century, Freedom of Expression has become enshrined in constitutions around the world as a Basic Human Right. It defines Democracy in its defense and Dictatorships in its assault. People like to control and don't like to be controlled, and the tension between controlling and being controlled requires this Human Right to be defended and re-defined every year. Sometimes, like during the McCarthy Era in the United States, the tide turns against Freedom. Other times, like in the Middle East today, the Freedom to speak changes the course of history.
But there is another Freedom not yet defended as a universal Human Right that should be, and it is the Freedom of Information - the right to be informed, to learn. This right is implied by the Freedoms of Press and Speech, but it is not articulated explicitly as a constitutional right. Around the world, many nations have Freedom of Information Acts that require national and local governments to make information available to the public. Those acts were created when widespread access to information was rare. Libraries and archives were places where large amounts of information could be physically retrieved, and governmental disclosure was paper-based. Universities and Governments were the largest aggregations of information, and they were the places you visited to get information.
But today, with the Internet, human beings have potential access to information without physical limits, and it is that potential that must be enshrined in law as a basic human right. Every human being on the planet should have the right to access information freely and without threat of harm. Like Free Speech, that right should be defended even when the content of the information accessed is heinous and injurious to some. Any society or nation without the Freedom of Information as a basic human right is a place that can be controlled and manipulated.
According to Human Rights Watch, there are 40 nations around the world that restrict access to the Internet or Social Networks. Many of these nations also block satellite TV and other forms of communication. But even in Western Democracies, Information Access is controlled by cost, technology barriers, labor protections, and secrecy laws. Even the most advanced nations have huge regions without access to the Internet. And some nations now seek to tax content flowing over the Internet as a means to restrict trade and favor local providers.
This is not a question of commercial competition. This is a question of human progress. Where there are people unable to access information freely there are opportunities for oppression and abuse. Democracy and Freedom will not thrive or survive without the Freedom of Information. To be ill-informed and speak freely is a condition of intellectual slavery.
I believe that we must work to assert the Freedom of Information as a basic Human Right. It must be a 21st Century Goal to connect every human being on the planet to high quality trusted information. There should be no technical, political, cultural, or economic barriers to Information.
It should be as easy as air and as cheap as water, taken for granted and governed by statute in every nation around the world.
On October 7-9, I will be hosting a conference on The Future of Data Governance at the Mohonk Mountain House (www.mohonk.com) in New Paltz, NY. This event has been designed to explore the challenges and solutions that Data Governance organizations constantly ask about:
1. How do I transform data into an asset? Data isn't an asset until you make it one, and it's not an asset like gold, stocks, or oil. Those assets have commodity values based on their scarcity and demand. Data is an asset with infinite availability, so its value can't be based on the amount you own or the amount someone wants. The value of data is purely perceptional, unless there is a market for that data. iTunes, DVDs, Newspapers, and cable TV are all examples of data with values based on market demand through external sales channels. Many organizations in the Data Governance Council have been successful in creating information assets, protecting them from risks, and organizing cross-functional participation in Data Governance Councils. And they have achieved some stunning results. But internally, we have no market for data sales. So the best we can do within an enterprise is increase the perceptional value of data as an asset. It has a perceptional value to the Business when IT can demonstrate incremental revenue obtained through data consolidation, aggregation, cleansing, business intelligence, and new sales.
Five years ago, Mohonk was the venue where I hosted our very first Data Governance event. Back then we organized three tracks to focus on Policy, Content, and Infrastructure questions. We had a lot of questions and ran each track as an interactive forum to frame common issues, understand the dimension of Data Governance, and identify convergent areas our customers wanted to explore. We had long discussions about data supply chains, policies and rules, metadata and data classification, security and risk. The dialog was extremely interactive, and coming out of that meeting there were many who wanted to continue. That was the genesis for the IBM Data Governance Council.
We knew then that Data Governance would become an important field. Some early visionaries like Robert Garigue from Bell Canada, Christa Menke-Suedbeck from Deutsche Bank, Charlie Miller from Merrill Lynch, Ed Keck from Key Bank, and Richard Livesley from Bank of Montreal helped us all to see the dimensions of the emergent market. And it was those leaders who helped to shape the Data Governance Council Maturity Model, which in turn helped define the elements of the Data Governance marketplace.
Of course, what we couldn't see then is how failures in Data Governance would threaten the world economy itself. The Credit Crisis was caused by incremental policy failures at almost every stage of the mortgage data supply chain. Loose credit led to bad home loan underwriting decisions, which were masked by rising home values. Huge fees in MBS and CDO trading led to inside deals between credit rating agencies and banks, and vast amounts of poorly documented mortgages came to be regarded as Tier 1 assets on many balance sheets around the world. These instruments were insured by complex derivatives traded without clearinghouses, which created interconnected obligations among the largest banks with huge exposures should any one of them fail.
The media has focused on the wide segment of the funnel, the derivative market failure. Credit Default Swaps in this market had a notional market exposure exceeding $100 trillion. But the failure was within a supply chain: poor underwriting standards in loan origination from 2005 to 2008 continue to pollute banks with Toxic Assets, and the long tail of mortgage foreclosure haunts our economy. Our mortgage market remains heavily discredited around the world, and new Data Governance solutions are needed to restore investor confidence in the US Mortgage Market.
I've been working with a range of policy-makers and thought leaders on providing concrete solutions to those challenges, and I will host a round-table discussion on US Housing Data as a use case example on the value of data, the terrible risks that can still plague our economy from data pollution in that supply chain, and the concrete steps that can be taken now to address these issues.
I think this conference will be thought provoking and practical. The market is looking for Data Governance solutions. Not just know-how and not just software. But know-how and software and examples how to apply them. That's what we'll do and I hope you can join us. I think it will be the best Data Governance Conference ever. The venue is fantastic, the room rate unbelievable, and the conference fee is a true bargain.
This agenda will continue to evolve, so come back often for updates.
Directions to Mohonk
Temporary Suspension of Disbelief.
What I'm going to tell you now may seem like a fairy tale, but I want you to calm down and listen carefully. The world we know now in 2012 will seem a distant memory by 2022. The Big Data we have today will seem quite small and quaint in 2022. Because on the horizon today we can already see the emergence of a future in which all business decisions will be simulated before being committed to action.
Everything. Manufacturing, Pharmaceuticals, Government Programs. It will all be simulated in detail and richness that will be fun and immersive like the best 3D video games.
Spreadsheets will go the way of the Dodo bird - extinct. And everything we do in the real world will have a Data Model, and those models will form the basis of all value creation in the world. Let me give you a few examples of what I mean:
Today, manufacturing is a highly automated process that happens in discrete phases using raw materials, labor, machinery, large physical locations, logistics, and supply chains. It's possible to design a product in San Rafael, make a prototype in China, and go to mass production in Thailand, serving markets in North America in under six months. That's fast by historical standards for simple products like eReader binders, smartphone cases, and even decorative objects. But it's also a highly energy-intensive process. Product prototyping relies on remote manufacturing and postal services to transmit the prototype and design ideas, shipping to transport goods to market ports, and rail and trucks to deliver to stores and customers.
But in the next decade, that long supply chain will be transformed by 3D Printing, which will enable intricate product manufacturing in every home. A 3D Printer uses lasers to fuse layers of metal, polymer, and other materials into highly intricate products in hours. You can create products in 3D Printers that are impossible to create in any other way. They can have moving parts and can be printed disassembled or assembled. And this capability will turn every home into a factory.
Need a new part for your washing machine? Print it. Want a cup with four spigots? Print it. Develop a new gas/diesel hybrid motor with a hydrogen battery backup? Print it.
In the future, online retailers like Amazon will sell vast catalogs of digital product designs that individuals create and license under copyright for local 3D Printing. Open-Source networks will share designs under Community License terms. And designers will create their own catalogs of digital assets with astronomical value protected by patent. 3D Printers are available today for about $2000, and with one of those you can create polymer-based products in about 3-8 hours depending on the complexity. By 2022, we will all be able to purchase 3D printers that can create new products with hundreds of intricate moving parts, layer by layer, in under an hour. And a new supply chain of raw-materials shipping and recycling will replace the goods-delivery supply chain of today. Of course, you won't yet be able to print a car in your garage in a day, but you will be able to print a new steering wheel, brake pads, or chrome-plated exhaust for your 1949 Studebaker.
Of course, this assumes you can trust the data. Trusting sales data in a spreadsheet is one thing. But trusting design data in a brake pad that will or will not stop your car in an intersection is quite another form of trust. And 3D Printing won't just be used to print locally what we produce today. It will also be used to print things we can't produce today, things that are incredibly complex. And because in the beginning the cost of printing and risk of failure will be high, all designs will be simulated online before they are created.
The simulation environments will look like 3D First Person Shooters or MMORPG games like World of Warcraft, in which new product designs are created in data first and tested in simulation environments with live people acting in Avatar roles. These simulations will be created to mimic the complexities of the real world, and human beings will use them for everything. You won't buy any digital designs to print locally without a simulation certificate. And that brings us to our next area.
In 2022, we will look back at drug trial testing on rabbits, monkeys, and humans as being incredibly brutal and wasteful. Why would someone subject another living being to a painful experiment without first testing the drug or remedy in a bio-data simulation? In 2012, we can already see and describe our world at the atomic level. We can create memory arrays with just 12 atoms, and manipulate cells and molecules. By 2022, we will be able to simulate organic interactions at the sub-cell level. New drugs will be tested in 3D simulations of simple and complex organisms many times before they are ever tested on live beings. It will be far faster, cheaper, and easier to test new drug ideas on artificial environments with detail on par with real organisms than risking injury and death in live subjects.
Want to develop a new pesticide? Test it in a corn or wheat simulation, including the entire bio-ecosystem of a corn or wheat farm - insects, birds, people, wind, rain, earth and sky. All included. Test first in the computer, refine and retest, refine and retest. Then try outside. It's already possible to simulate some simple organisms, but this area will grow rapidly. And in the future, new drugs and chemicals will be tested and developed in weeks and months instead of years or decades. Drug companies will store huge libraries of simulated drugs, environmental simulations, and the outcomes for audit and review. Labs will constantly compare simulation results to real drug trials to refine the simulations and learn from mistakes.
A program is a policy. That's neither a product nor a chemical. It's a set of rules designed to influence or change human behavior. Cut a tax and you subsidize a program. Write a new law, and you create a new restriction. Declare war, and who knows the outcome. In 2022, each of these decisions will be simulated in large online simulations with millions of people, miles of territory, buildings, cars, planes, trains, and all. There will be online simulations with people interacting as avatars (like Second Life), and anyone with an idea will be able to test policy assumptions on that population, changing the size and scale of democratic participation in ways we can't fully imagine today. No one will ever think about testing out a policy on the population without first testing it in a simulation.
Want to build a battleship? Simulate it. Want to put a Naval Simulation in the Pentagon? Simulate the design of the design. Want to audit the Federal Reserve? Simulate it.
Where are we today with simulations? We have some rudimentary vocabulary for taking complex human and machine interactions and reducing the complexity to simple simulations everyone can understand. Like a lot of things, the simulation industry has become popular by dumbing itself down and making its simulations consumable.
Where do we need to go? We need to develop new methods and vocabulary to capture human knowledge of complex ecosystems and transform that knowledge into equally complex simulations that convey understanding of the most feature-rich and intricate environments.
In 2012, we have Big Data. In 2022, the world will be Data-Driven. All physical goods will have a Data artifact, and many data artifacts will have no physical comparison. We will make no decision without a simulation. The simulations will look, feel, and be almost real. We will wade into them and out of them like walking into another room.
And what we think of today as big will get MUCH BIGGER. In 2022 the Future of Everything will be Simulation. We will Predict the Future before we Live It, and we will use Predictive Governance to make sure we can trust what we simulate.
Join the IBM Data Governance Council as we Simulate and Explore the Future: http://dgcouncil.eventbrite.com