I've written in the past about the loan origination underwriting failures that are at the heart of the current credit crisis. Market failures in Mortgage Backed Securities, Collateralized Debt Obligations, and Credit Default Swaps can all trace their lineage to the high default and foreclosure rates resulting from those underwriting failures. In a piece I wrote in early 2008, I argued that simple changes in underwriting standards could have prevented the market meltdown.
I've also written about the relative efficiency of the Danish Mortgage Model, and yesterday I heard an in-depth comparative presentation on that Model that I have to relate because it completely changed my point of view. Up to now, I had seen the Danish Model as a business platform for mortgage processing. What I saw yesterday is a consumer solution with enormous political appeal.
The meeting was at the American Enterprise Institute in Washington, DC and the speaker was Alan Boyce, CEO of Absalon, the organization that exported the Danish Mortgage Model to Mexico. Alan presented the Danish Model in the context of what the Danes call "The Principle of Balance."
The Principle of Balance enables borrowers to refinance their mortgages when housing prices go up AND to buy back their mortgage bonds at current market prices when housing prices go down, preserving their equity. In the United States, borrowers can refinance when rates decline and housing prices rise, but they have to suffer negative equity when housing prices decline. Housing prices often decline in a recession, and negative equity restrains labor mobility by nailing homeowners to their existing homes until prices rise and they can sell without a loss.
In Denmark, when recessions hit and housing prices fall, borrowers can buy back their straight securitized bonds in a secondary bond market and refinance their mortgage in line with the current market value of their home. This flexibility protects consumers from negative equity and gives workers greater labor mobility.
From Alan's charts, here is how the current system in the US works:
If interest rates decline:
- Home prices go up
- The homeowner can prepay the existing mortgage by refinancing at the new, lower rate
- This allows for equity withdrawal
If interest rates go up:
- Home prices go down
- The value of the mortgage (in an MBS) drops for the holder of the mortgage
- Even though the value of the mortgage has dropped, the homeowner still owes "par" - the face value of the mortgage. He cannot prepay the existing mortgage at the price it is selling for in the market
- ~$5 trillion is currently owed by homeowners with non-agency mortgages. These mortgages are valued by the market at $3.5 trillion
- In some of the hardest-hit regions of the country, homeowners have lost their jobs and have negative equity in their homes, and they can't do anything about it
Using the Principle of Balance, here is how it would work:
If interest rates decline:
- The system operates the same
- Home prices increase and people can refinance and take equity out
If interest rates increase, I think Alan's chart summarizes it best:
- Home prices go down
- Assuming creditworthiness, a homeowner can prepay by purchasing back his or her mortgage at the current discounted price
- This maintains equity in the home
- The key is a new, standardized mortgage bond
This model doesn't perfectly preserve home equity, as homeowners will still suffer some loss when housing prices decline, but the loss is substantially mitigated, and the system offers individual freedom and choice. It is actually far more market-oriented than the current US model.
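To make the difference concrete, here is a minimal sketch in Python with purely hypothetical numbers (the purchase price, down payment, price decline, and bond discount are my assumptions, not figures from Alan's presentation), comparing the equity left to a homeowner under the two payoff rules after rates rise and prices fall:

# Hypothetical example of the two payoff rules described above.
purchase_price = 300_000
down_payment = 0.20 * purchase_price            # 60,000 of original equity
loan_balance = purchase_price - down_payment    # 240,000 owed at par

home_price_now = purchase_price * 0.85          # home price falls 15% to 255,000
bond_price_now = 0.85                           # mortgage bond trades at 85 cents on the dollar

# US model: the homeowner owes par no matter what the bond trades for.
us_equity = home_price_now - loan_balance                  # 15,000

# Principle of Balance: the homeowner buys the bond back at its market price.
buyback_cost = loan_balance * bond_price_now               # 204,000
danish_equity = home_price_now - buyback_cost              # 51,000

print(f"US-style equity:     ${us_equity:,.0f}")
print(f"Danish-style equity: ${danish_equity:,.0f}")

In both cases the homeowner loses something when prices fall, but under the buyback rule the decline in the bond's value accrues to the borrower instead of leaving the borrower owing par on an asset worth less.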
In the US, we currently suffer 10% default and foreclosure rates, and there are an additional 15-20% who suffer negative equity in their homes but are not at risk of foreclosure. People in foreclosure can't take advantage of a new Principle of Balance Mortgage system, but the government can offer programs to restructure their mortgages at market value. Those with negative equity could be encouraged to migrate to a new Principle of Balance mortgage model.
This is an idea that has enormous benefits all around. It can help the Obama Administration reprice existing toxic assets. It can help provide more market flexibility to homeowners. And it can repair confidence in the American mortgage market among investors worldwide.
Who would have thought that market-oriented reforms would come from such a "socialistic" country as Denmark!?
I encourage everyone to read Alan Boyce's presentations and white papers. This is one of the most intelligent and easy-to-implement regulatory reforms I have seen in many years.
His full presentation: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_1.pdf
His short white paper: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_3.pdf
This morning, EU regulators announced that they propose to create a Risk Board to monitor financial market performance and systemic risk indicators among the 27 member nations of the European Union. I've advocated a Council approach to risk-based decision-making since the beginning of this year, and I think the EU proposal is a good idea in concept. Unfortunately, in Europe it seems decision-making takes a large number of people, because the European proposal would have 63 people participating on the Risk Board. A deliberative body with 63 people is not a "Board" - it is a legislature. To complicate matters, "only" 32 members of this board would have voting rights. Unfortunately, the only power they can vote on is a warning to member states that some part of their market performance contains systemic risk. How they plan to determine that threat and get everyone to agree on what it means in any reasonable amount of time is not clear. My guess is that this is a proposal to set up an intra-governmental think tank that will study issues, write economic reports that no one reads, and only threaten to issue warnings, because a vote on a warning will never happen.
Note to Obama Administration: If you want to create a Systemic Risk Regulatory Structure that is guaranteed to fail due to political indecision and lack of authority, copy the EU model.
Last night, I was one of two panelists at a Global Association of Risk Professionals (GARP) symposium on Systemic Risk at Fordham Business School in New York. We were to be a moderator with three panelists, but one canceled at the last minute, presumably to stay home and watch the Yankees lose to the Phillies. The room was on the 12th floor of a mid-'60s squat tower, accessible from two elevators among a bank of six in the stone-cold, open, office-like lobby. Twelve is the top floor in the building, with a Rockefeller penthouse atmosphere: black marble floors, mahogany paneling, subdued sixties swank.
The symposium room was longer than it was wide, seated classroom-style for one hundred in three neat blocks. We panelists were paired at a white-clothed table with microphones we didn't need. The moderator introduced us both: the NYU Business School professor and the IBM Data Governance guy. The audience looked half-asleep, and the first question rolled out on the table, "What is Systemic Risk?" Our gracious moderator had prepared a raft of intelligent questions for us that evening, but we would only get through two in the brief hour we had.
What is Systemic Risk? The professor told us it was the result of exogenous market conditions that created upper-atmospheric bubbles in complex derivative instruments capable of devastating global economies. It could be measured in the up- and down-swings of aggregate equity performance and controlled through the central banks he currently advises. He saw Systemic Risk as a macro-economic phenomenon, the product of weak government regulation, greed on Wall Street, outrageous compensation packages, and unnecessary complexity in financial markets.
Before the event, I wasn't quite sure what I was going to talk about. It was a hectic Monday full of ten conference calls on twenty different topics. I left late, hit traffic on the Grand Central, got lost at Lincoln Center looking for parking, and there was no coffee when I arrived. I'm not an evening person un-caffeinated, and perhaps not the best morning person in the same condition. But droll media babble passed off as tenured professorial wisdom will rouse me on the sleepiest of days.
Systemic Risk is the probability of loss to a system. It is not actually a single thing that can be calculated. It is a series of things that result in a loss event, with causality and impact. Systemic Risk is not only about macro-economic catastrophe, because to say so is to say that we are not involved in Systemic Risk except as victims. And that ain't true. Insofar as all of us, The People, are members of communities, parties, religions, nations, and environments, we are part of a System. We are inter-related, inter-dependent, capable of causality, errors and omissions, losses and claims. Each incremental failure can cascade and result in systemic exposure.
The Credit Crisis is the result of a series of public policy mistakes from 1999 to 2006 that encouraged bad business practices at many different stages of the mortgage underwriting and securitization process. These were incremental failures that contributed to loss events that destroyed parts of the economic systems upon which markets rely. The lesson to humanity from this experience is that We The People are all members of SYSTEMS large and small that can fail as a result of incremental policy mistakes. Actuarial Science has for too long focused on the probabilities of contained loss events.
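A minimal sketch of that compounding argument, with made-up probabilities rather than data from the crisis: if each stage of a long chain can fail independently with a small probability, the chance that the system as a whole sees at least one loss event is anything but small.

# Illustrative only: the stage count and failure probability are assumptions.
stage_failure_prob = 0.05    # chance any one stage (origination, appraisal, rating, ...) fails in a year
stages = 12                  # number of incremental stages in the chain

p_no_failure = (1 - stage_failure_prob) ** stages
p_systemic_exposure = 1 - p_no_failure

print(f"Chance at least one stage fails in a year: {p_systemic_exposure:.0%}")
# Roughly 46% - individually tolerable errors become close to a coin flip at the system level.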
My body is a SYSTEM and Cancer is a systemic risk to me. It causes a chain of events which can result in organ failure and death. Your company is a system, and bankruptcy is a systemic loss event. If bees die, plants won't be pollinated, and that can be a cause of systemic risk to our ecoSYSTEM. The BBC reports (http://news.bbc.co.uk/2/hi/science/nature/8338880.stm) that record numbers of plants, mammals, and amphibians are under threat of extinction. This is a systemic risk. When entire species of frogs in remote places like Tanzania become extinct in the wild, humans take note - this incremental failure is closer to your role in the food chain than you may think.
Every System has risk. Every person in every system has a role.
If we accept the gossip-press gospel that the Credit Crisis is purely the result of greed on Wall Street, and can only be fixed by wise regulators in Washington, shame on all of us for missing the opportunity to internalize the economic externalities. It is not an academic exercise to study the risk in every system, large and small. Systemic Risk is a real-world imperative for all of us.
The Winter Solstice is the time for Data Governance Predictions. And here are mine for 2011:
1. Systemic Risk Councils will proliferate. The Dodd-Frank Bill established a Systemic Risk Council in the Federal Government to aggregate financial data from across the economy and detect patterns of exposure that can impact macro-economic policy. All regulated financial entities should follow the leader and do this themselves. Some, like JPMC and Goldman Sachs, already do. Everyone who is not doing it should get on the wagon and replicate it.
The Federal Government will take eons to gather all the data and make sense of it. And even when it does, there will be political considerations regarding how the data is used and disclosed. And forget about counter-cyclical policy-making. So if you want your firm to escape financial ruin in the next Sub-prime, Sovereign Debt, Greek, Irish, Portuguese, or Spanish Debt Crisis, go get a Risk Council and start sifting the data yourselves. Processors and storage are cheap and data is widely available; what you need is the organizational structure, the decision-making system, and a sound Data Governance program. Get it going now, because with all the debt the world has accumulated there will be many more crises to predict.
2. Health care will join the Information Revolution - Today, many doctors use the Internet to look up symptoms, anatomy, and, of course, pharmaceutical remedies. Yet across the industry there are few information resources that document the comparative performance of doctors and hospitals - how they treat patients and with what results. In 2011, thanks to US health care reform, this will start to change, and I foresee a nationwide movement to aggregate vast amounts of health care data to analyze and report on what works, what hurts, and to start building plans to make care more efficient and more effective so that people live longer. Data Governance will play a huge role in this effort, which will start next year and consume the next decade.
3. National Incident Detection - Like it or not, the days of the Internet Wild West are numbered. While the new Republican leadership in the House is opposed to the Net Neutrality Bill, it seems certain that some form of national security oversight over Internet incidents and threats is going to happen. The government has been trying to corral business into sharing incident information since 9/11, and I predict it will succeed at some point, because nation-sponsored cyber-warfare cannot be resisted by private enterprise alone. In some as-yet-undetermined form, new information-sharing regimes will need to be designed that aggregate threat information from businesses across the nation to develop early-warning systems and protect national Internet assets.
4. Self-Governing Commons - Human beings can, in fact, govern the use of common resources more efficiently than hierarchical or proprietary solutions. The Information Governance Community is a demonstration of this fact, and in 2011 similar demonstrations will proliferate around the world, and Social Networking itself will mature into online meeting places where people do more than talk - they will govern themselves to produce common work products. An aggregation of people without a deliverable is a media channel. Those same people collaborating on common ideas to produce work are self-ruling corporations, and this phenomenon will change how people are organized around the world. Any idea or project can be accomplished by self-organizing groups of people with common interests, a governance model, and an incentive structure designed to produce an outcome and effect change.
Five years ago, we formed a Data Governance Council to change organizational behavior and effect change. Achieving Semantic Consistency, Data Quality, Single Views of the Truth, Trusted Information, and Security & Privacy are all IT goals necessary to achieve any one of the above Predictions. Information is changing the world, and with information we can change ourselves. However, without Governance, all we have is Data Management, and none of what I described above is possible.
My aunt Helen had an opinion on everything. She was an information junkie long before the Internet, consuming at least three newspapers a day and watching untold hours of news television. If she didn't know about an issue directly, she had enough reference points to issue an authoritative opinion. I spent many weekends in her ancient Cheshire farmhouse, with the musket holes in the foundation to protect against Indian raids and the secret spot behind the fireplace where slaves hid in the 1850s on their way north to Canada along the Underground Railroad. Dusty newspapers from the 1960's clogged the front staircase that was never used. Every National Geographic since 1940 sat piled in closets and behind sofas. Photos and postcards sat in boxes everywhere. Nothing got thrown away. Even the dust had dust. Her home was a database, and her brain was the ultimate computational instrument, an informational repository without parallel in our family.
Helen's knowledge of the world seemed to extend way beyond the bounds of her 1730 home. When I was young, I sat in awe of her voluminous and expansive mind, never daring to question or challenge any of her positions. But as I grew into adolescence I began wondering if some of her statements weren't maybe a little made up, or at least extrapolations of things she knew into things she thought she knew or could know with just a little imagination. But woe to you if you challenged her without some backup, because she sure did know a lot, and her mind was so sharp you could be reduced to blabbering in a microsecond if you hadn't really done your homework and researched the topic.
But when I got to about 20, attending college - the place you went to get important information before Google put it on our smartphones in the subway - I started to learn that lots of what Helen said wasn't quite the way she said it. It wasn't that it was completely wrong; it's just that it wasn't really always black and white the way she presented it. There were lots of different ways you could see and interpret the information. And you could construct a perfectly valid and well-thought-out argument that tied her up in intellectual knots. And back at the farm that summer we had some great arguments. Fact is, Helen was often at least partially right and partially wrong about a lot of things. Not philosophically wrong, because that's a matter of belief.
Factually in error, but never in doubt.
Her conviction was the secret of her intellectual strength. We've all known people like Helen, and many of you who know me are probably already murmuring "ahh, that's where he got that..." But I didn't bring up this point to wax on about my family heritage or personality. I brought it up because this characteristic is one we find every day in our organizations, in the newspapers, on the web, in our governments. People develop points of view and stick to them, and getting people to see beyond their point of view is a real challenge. It isn't that the information is wrong; it's that people interpret it the way they see the world.
Information itself is a human creation. The computer didn't put it there. It isn't immutable, dirty until cleaned, chaste, pure, imperfect until perfected. It is a reflection of us, and since we created it, it's sometimes wrong, or the truth is at best a mixed result.
But what's to blame for that? Your Metadata? Your Business Glossary? Data Architecture? Security & Privacy? Audit? Your Organization?
YES! All of the above. Everyone who creates and uses information is involved in its interpretation and implementation. You don't have to be a data architect to influence the way information is used in an organization. Any iPhone or Android user has a role in information management today. Bloggers, vloggers, and photographers shape and shade their creations to effect a mood, sell a product, influence an outcome. Everyone with a data connection is a source and a target, and we all must accept responsibility for how we govern the use of OUR information.
Those consultants who tell you how to "govern the data" with all those tools are not helping anyone but themselves. Tools like Business Glossaries, Metadata workbenches, Master Data Management, Data Quality Profiling, and Audit help us understand when our information is outdated, inaccurate, partially true, or just plain balderdash. We use those tools to illuminate the dark corners where opinions and habits force difficult debates to unlock the truth, because we know that Information is the only tool we have to change behavior.
Want to succeed with Information Governance? Get aware. Know what's happening and share it. Use your Information Governance tools to build operational awareness.
People will change their opinions when confronted with a solid argument, and that's what you want - Change from Information.
Fact is, I learned a lot from my aunt Helen and I still hear her voice strong as ever. Sometimes wrong, never in doubt.
IBM has been at the forefront of the Information Governance movement since the formation of the IBM Data Governance Council in early 2005. For the past six years we've worked closely with industry-leading companies from around the world to tackle the biggest challenges associated with governance.
Around the world, our clients are at varying stages of recognizing the necessity of Information Governance and implementing guidelines, standards, and policies. If you or others at your company have started conversations on this topic, then this event is for you!
We would like to invite you, and two of your colleagues who are information stakeholders in your company, to participate in a workshop that will help you build an effective Information Governance program:
- Define your needs
- Benchmark your organizational maturity
- Define your organizational structures, methodologies, and tools
- Develop new insights and build a system for Information Governance
In this hands-on workshop, participants will be taken through four of the Information Governance capabilities and asked to rank their organizations according to the maturity levels defined in the Information Governance Maturity Model. All rankings are confidential, and you can take home what you start and complete it later with your colleagues at your convenience.
Who should attend:
- CIOs and senior IT Executives
- Business Analysts and subject matter experts
- Executives involved in compliance and data protection
- Data or Information Stewards, Directors of Data Governance, and Data Architects
- Consultants and IBM Business Partners
The goal of this workshop is to educate and improve. Participants will meet other practitioners and gain valuable insights through comparative discussions of common challenges. New insights will be shared with the global Information Governance Community, inspiring new ideas and topics.
I hope to see you there. IBM Information Governance Workshop
Manila is an ancient Spanish colonial city with American influences and a culture all its own on the rim of Asia. It takes several visits to appreciate that despite appearances and a host of American shops, businesses, and call centers, Manila is not a larger Honolulu, and the Philippine people are not just nicer Hawaiians. The culture, like the heat, is soft and pervasive and gently unique. The foreign influences, like the rain during the early June rainy season, hide behind clouds.
Two weeks ago I made my third trip to Manila and hosted a Data Governance Council Maturity Model workshop in a modern hotel conference room for 25 customers spread across 10 round tables. In my 8-hour presentation, I integrated the Maturity Model into the Six Steps to Smart Governance using both OpenOffice and the IBM Application Roadmap Tool (ART). Customers used laptops running the ART tool to score their respective levels of maturity, and I explained how the Maturity Model provides benchmarks to assess current and desired states of maturity, from which the Six Steps can be used to govern the use of data in a more scientific and repeatable way.
I've given these two presentations often, mostly in shorter conference formats, at least 12 times a year if not more. I constantly update my material with current examples and anecdotes to keep it fresh, but also to keep myself fresh and avoid the self-boredom of redundancy. To each new audience the material is new, and I'm always amazed at how the Maturity Model transforms conversations from abstract theory to relevant practice.
I present five to seven charts, then go to the ART tool and we run through three to six sub-categories of the model: Organizational Structures/Summary, Data Quality/Processes, Stewardship/Accountability, Risk Management/Accountability. During these phases I read the content for each level of maturity and simulate current and desired states by moving the slider bars. Most of the audience hears my words and ignores my gestures. They are engulfed in a personal assessment of their own Data Governance maturity. Huddled over the laptops, they discuss their perceptions of the model levels, argue about what the terms mean, and relate the observed behaviors of 50 companies in North America and Europe to their own habits.
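For readers who have never seen the exercise, here is a minimal sketch in Python of the kind of current-versus-desired scoring it walks through. It is not the ART tool; the category names follow the workshop, but the 1-5 scale values and the gap logic are my own illustrative assumptions:

# Hypothetical scores on a 1-5 maturity scale: (current, desired) per category.
assessment = {
    "Organizational Structures": (2, 4),
    "Data Quality":              (1, 3),
    "Stewardship":               (2, 4),
    "Risk Management":           (3, 4),
}

# Sort by the largest gap so the biggest improvement targets surface first.
for category, (current, desired) in sorted(
        assessment.items(), key=lambda kv: kv[1][1] - kv[1][0], reverse=True):
    print(f"{category:26s} current={current} desired={desired} gap={desired - current}")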
It is fascinating to watch! They don't want to move on to new categories, as each level brings back painful memories of immature practices and long-festering problems needing change, along with the re-awakening that they too are immature and can change with the help of an external assessment.
Four years after its creation by a group of 50 visionary Data Governance Council members, the Maturity Model still inspires and provides fresh evidence of its value and relevance. It excites audiences all across the world, and as a benchmarking tool there is no comparison. Every time I do this I wonder to myself how this material can excite as it does. But it is the common awareness of ad-hoc, episodic IT adventures, crises, and budget-constrained fixes over decades that motivates people to realize that their situations are not unique and that only systemic solutions will work.
After all these years, Data Governance is a real global market and the real work to make it a success just now begins.
Thank you Manila.
In 2004, when I hosted the first Data Governance Forum at Mohonk Mountain House, I had three teams of IBMers developing the narrative discussions for three tracks on a common use case. The tracks were called "Infrastructure," "Policy," and "Content." The use case was "Data Supply Chain." The Forum had two days of meetings stretched across three, starting in the afternoon of the first and going until lunch on the last. On the only full day, we hosted the three breakout meetings, and each team worked to integrate their track discussions around the use case. The use case came from some business process definitions the software group had developed for business component models, something to do with insurance claims processing. As it turns out, we had only one or two insurance companies at the event, and we spent more time focusing on the track headings and business process model than on the idea of a Data Supply Chain. A conference is always the product of the people and the ideas in a room, regardless of what one puts on the agenda. And at this first event, when most of us had only the vaguest understanding of what "Data Governance" was or could be, business processes were familiar and Data Supply Chains were distant.
Three weeks ago, I hosted another Data Governance Forum at Mohonk Mountain House. It was again two days of content stretched across three, and again a very diverse group of people came together to produce discussions that were engaging, powerful, and divergent from what was planned on the agenda. In three breakouts on "Data," "Risks," and "Governance," the panelists and audience exchanged ideas, and I ran back and forth between the breakout rooms to listen, learn, and occasionally drive the conversations. What I heard among talks about Data as an Asset, Risk Taxonomies, Governance models, and Security & Privacy was the loud echo of Data Supply Chains reverberating off the walls. It was like an archetype of the first meeting, a temporary suspension of historical time, as if in all these years of Data Governance we had lost the original truth, like a spring that disappeared underground, rediscovered at the source.
Every company does Data Governance today, for ill or good, with intent or by default. Every company also has at least one, and often many more, supply chains. These are real supply chains that may stretch across only one or two towns or across six continents. Supply chains link producers, distributors, and consumers. They enable outsourcing and resourcing. And they have been a fixture of modern business since business became modern in the mid-1970's. With disciplines like Six Sigma, large multi-national supply chains enable massive economies of scale with quality control that previously was available only to the largest organizations with fixed multi-year labor contracts.
Today every organization also has large distributed Data Supply Chains. Some parts may be automated, others batch, and still others quite labor-intensive. The variety and function are often the ugly mess the CIO would not like the world to see. And with rare exception, they are not "governed" with anywhere near the same quality control and rigor as real supply chains. When an oil company puts down a new oil terminal, well-defined engineering processes are used to map out every step of production from well head to refinery. Where Data Supply Chains are intended to capture the same kinds of flows of information, the methods used are mostly ad hoc, one-off, dependent upon the project leader, never to be repeated again. And the result today is that companies have tens, hundreds, and even thousands of ad-hoc supply chains designed individually, some existing in their original state for decades. The disconnects create massive inefficiencies, quality control problems, and functional friction.
What every company should be doing is inventorying its existing Data Supply Chains and beginning to re-engineer them. There should be one Data Supply Chain engineering standard. And each new real-world supply chain should include a well-defined process to create a logical and efficient Data Supply Chain that monitors itself. This is not a small undertaking. But we can't create a Smarter Planet full of sensors and instruments to monitor the changes in our real world if we do not also monitor and instrument, standardize and re-purpose, the changes in our own enterprise.
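As a minimal sketch of what that inventory could look like in practice, here is a hypothetical example in Python: each hop in a Data Supply Chain is recorded as a link between systems, and the links with no quality monitoring are flagged. The system names and the monitored flags are invented for illustration, not drawn from any real inventory:

# Each tuple is one hop in the chain: (producing system, consuming system, monitored?).
supply_chain = [
    ("branch CRM",     "customer hub",    True),
    ("customer hub",   "risk warehouse",  False),
    ("risk warehouse", "regulatory feed", False),
    ("pricing engine", "risk warehouse",  True),
]

# Flag every link that has no quality control instrumented on it.
unmonitored = [(src, dst) for src, dst, monitored in supply_chain if not monitored]

print(f"{len(unmonitored)} of {len(supply_chain)} links have no quality control:")
for src, dst in unmonitored:
    print(f"  {src} -> {dst}")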
Every time I speak to an IT audience, I ask, "What is Data Governance?" Of course the audience has come to hear me tell them the answer if they do not already know. But I'm more interested in what my audience thinks. Invariably, the answer has words like "Policy Enforcement," "Control," and "Compliance" in it. To me this reflects a desire among IT professionals to expunge chaos and confusion from their world and create order, stability, and simplicity. Perhaps this is very human, but I think our desire to transform complexity into simplicity focuses far too much energy and attention on the world "To Be," or perhaps even on the world "Never-To-Be."
I think we need to spend more time focusing on the world "As Is," the one with dirty, grimy, confusing, and complex Data Supply Chains that are not yet instrumented, monitored, or in any way Smart. It is this world that needs the bright white light of assessment, discussion, policy, implementation, audit, and dynamic steering. This dark and dishonorable world "As Is" is the past that most of our Data Governance programs struggle to change in the present with business-plan funding for the future. It needs new methods that monitor the information flowing through its electronic veins, real-time auditing of the tools that are used to change it, and brand-new business intelligence solutions that analyze past performance, compare it to current conditions, and predict blockages and failures.
In 2009, at the Mohonk Mountain House, back in the place where it all began, surrounded by 52 Data Governance Thought Leaders, I saw again the source that we mistook all these years - Data Governance is a quality control discipline for the Data Supply Chain.
Fix the world that is.
Does the European Union "promise to be true in good times and in bad, in sickness and in health?" Will the Union survive the current Debt Crisis and become more integrated, or will it break apart under the pressure and allow insolvent states to exit the common currency?
Can the United States maintain its high standard of living and reduce its debt burden at the same time?
You may read these questions in the press every day and never believe they have anything to do with Data Governance, but they very much do. Governments make tactical decisions every day to increase debt by small fractions, thinking that their incremental spending is nothing in comparison to what others have done in the past - failing to see the correlations between current consumption and long-term systemic risk.
With 7 billion people on the planet Earth, our societies have become so complex that it is impossible with past methods of governance to foresee how policies impact even the smallest ecosystems. So we rely on blunt cause-and-effect relationships to over-simplify our options and fit our ideas into media soundbites. And the result is non-correlated policies that are anything but smart or predictive.
We seek to change this. We know that without new tools and techniques to see beyond the next effect, every cause will yield policies that fail. We are the IBM Data Governance Council, and we see that Data is the raw material of the Information Age and that effective Governance relies on conceptual thinking, integrated approaches, correlated analysis, and a relentless search for truth.
We call this Predictive Governance, and this meeting will explore what it means, how it works, and how we as a Community can create predictive models that:
1. See the relationships between Data Quality and Security & Privacy and Data Architecture and ILM and Metadata and Audit and Reporting and Stewardship and Policy and Organizational Awareness and Business Outcomes - the forest and the trees in our Information Ecosystems.
2. Model and simulate how new integrated policies, people, and technologies are available to Govern in these complex Ecosystems.
3. Understand and articulate these relationships to laymen who only see the problems at hand and have no patience for larger integrated solutions.
Please join us for this important two-day event. Participation is open only to members of the IBM Data Governance Council. Organizations wishing to join the Council may sign up for this event and execute a Council Agreement in New York at the meeting.
I am a relative newcomer to System Dynamics. I first learned about systems thinking from Helmut Willke, a German professor who wrote a book called Smart Governance, which talked about systems of governance and their influences on society. I met Professor Willke in Cologne in 2007 and was so impressed with his ideas that I used his book in a course I was teaching with Christa Menke-Suedbeck at the Bucerius Law School in Hamburg, Germany.
A few years later, a colleague introduced me to some work IBM did with the City of Portland to build a very large SD simulation enabling urban planners to understand how even the smallest policy changes had ripple effects across many municipal departments, neighborhoods, families, and individuals. We created that simulation using VenSim and Forio, and I was immediately captivated by the potential to model and simulate the impact of policy on complex environments.
IBM Smarter Cities SD Demo
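For readers who have never looked inside an SD model, here is a minimal stock-and-flow sketch in Python in the spirit of that work. The real Portland model was vastly larger and built in Vensim and Forio; the stocks, rates, and feedback below are invented purely for illustration:

# Two stocks (population and jobs) integrated over time with simple feedback flows.
population = 500_000.0
jobs = 250_000.0
dt, years = 0.25, 20

for _ in range(int(years / dt)):
    job_creation = 0.02 * jobs * dt                     # jobs grow with the existing job base
    migration = 0.05 * (jobs - 0.5 * population) * dt   # people move toward available jobs
    net_births = 0.005 * population * dt

    jobs += job_creation
    population += net_births + migration

print(f"After {years} years: population ~ {population:,.0f}, jobs ~ {jobs:,.0f}")

Tiny changes to any one rate ripple through both stocks over the full horizon, which is exactly the kind of policy sensitivity the urban planners wanted to see.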
For over 15 years, I've been an inventor and market builder at IBM. In 1996, I invented Internet Insurance, persuading AIG, Reliance National, Chubb, Codan, and other insurers to invest in developing internet exposure coverage products and underwriting capabilities so that businesses could depend on insurance coverage as they expanded commercial operations online. In 2001, I led a team of IBMers to create the Enterprise Privacy Architecture, a patented methodology for embedding privacy policies and obligations into business processes. In 2004, I founded IBM's Data Governance Council and led an international group of 60 companies to create the Data Governance Maturity Model, a vast piece of commonly developed IP that benchmarks Data Governance behaviors across 11 categories and 5 levels of maturity. In 2009, I hosted a series of roundtable forums with large banks, the SEC, and the Federal Reserve as we explored the causes and effects of the Credit Crisis and what new standards in risk calculation and expression could be developed to mitigate future crises. And in 2010, I created the Information Governance Community to publish the Maturity Model under an open source license and invite a global community to work with IBM, the Data Governance Council, and many new leaders in developing a larger market for Information Governance and a new leadership role called the Chief Data Officer.
I love building markets through international collaboration, and this is why I have urged and lobbied iseeSystems, Ventana, Forio, Anylogic, IBM, and the SD Society to embrace an open standards process at OASIS. SD is a complex discipline that is hard to learn and hard to use. It has grown in episodes over the past 50 years, but it has never really broken out of its strong academic foundations. At first, I thought I could help it grow through the Information Governance Community. In 2011, I held a series of informational webinars on SD, the City of Portland project, and some work Steve Peterson had done on urban violence in South Boston. Michael Bean from Forio.com gave us generous amounts of his time to educate our community in how SD works, how models are built, and how simulations can be used to test strategic ideas and transform organizations. Some of our community members built Data Governance models in Vensim and tested them online in Forio.
But widespread adoption eluded us. You can have great webinars with great content and discussions, but that doesn't mean everyone understands what you are talking about. I saw many of my members thinking about systems, but not in a dynamic SD way. They understood the words we used to mean different things and found the math content totally confusing. After six months of work, I had to admit my efforts at Community education were not succeeding.
Undeterred, I started talking about the need for SD open standards. In the IT world, open standards are a way to spread adoption among vendors because they lower proprietary barriers to entry in new markets. They enable better software solutions, which end-users appreciate. And the process of open standards consideration and specification approval helps build market demand. As early as 2011, I saw clearly that SD lacked a robust IT vendor community. Five or six small vendors providing software modeling tools made for a niche market that was not growing.
In 2012, I met iseeSystems at the System Dynamics Conference in St. Gallen. My participation in the conference was very last-minute: St. Gallen isn't close to anything in Switzerland, it was summer, and I didn't want to travel. But boy am I glad I did. For three days, I saw incredibly thought-provoking, transformational work in every industry, all using a common SD methodology. I speak at many conferences throughout the world, and you rarely see so many interesting presentations across so many diverse industries written in a common way.
I was blown away by the quality of the content but, sadly, equally depressed by the complete lack of business participation. The conference was run by academics for academics. I was the only representative from a large IT vendor. There were no banks, insurance companies, oil and gas companies, utilities, governments, or even Big Four consultants attending. The SD Society had held a conference in 2011 in Washington DC, so I asked the organizers how many people from the federal government had attended. The answer was hardly any. Why the heck not, I asked. The answer was that no one had thought to prioritize their participation as a target audience. The target audience was local universities.
If the purpose of the SD Society is to service the university marketplace with educational offerings and knowledge transfer, mission accomplished. If the purpose is to grow the industry and attract business audiences, current approaches are inadequate.
This is where OASIS comes in. Following St. Gallen, I went to work persuading my colleagues in IBM that an open SD standard based on iseeSystems' XMILE could help grow business demand for SD simulations. The open standards process would attract new ideas to SD and open the SD Society to new ideas as well. But it took a lot of persuading. I had to sell a vision internally that SD concepts could be used with Big Data analytics to illustrate policy options for complex ecosystems. I had to tell my colleagues that an open standard would allow IBM to embed SD vocabulary in other modeling tools such as Websphere Business Process Modeler, Rational Method Composer, and iLog. And I had to demonstrate that our investment would be modest, the risk small, and the potential payoff reasonable. It took me a year to find the sponsorship I needed to persuade our Standards Committee to approve IBM's sponsorship of the OASIS TC.
And that brings us to where we are today. We have a TC. We have a vision for XMILE. These are table stakes. A TC is a sales effort, and we must now expand our membership to be global, business-oriented, diverse, and inclusive. Over the next 24 months, we have to expand TC membership to 70. I'd like to see representation from North America, South America, Asia, Africa, and Europe. I see my job on this Technical Committee as helping to expand customer demand for SD solutions and build a far larger market than exists today.
We are not just building a technical standard. We are building a market and I will continue to engage my peers to expand the use of XMILE worldwide as we work to develop an Open Standard for System Dynamics at OASIS.
The US Freedom of Information Act (FOIA) is celebrating its 45th Birthday this year. It was signed into law by Lyndon Johnson on July 4, 1966, and its goal was to provide Americans with the right to petition their government to release documents deemed in the public interest which might not otherwise ever see the light of day. Since FOIA was enacted, hundreds of thousands of public records have been released. However, the process is not easy. Requests must be made in writing, documents must be found and analysed by the government, and FOIA requests can often take many years to fulfil if the government has an interest in withholding the information.
All requested records are provided on paper because in 1966 that's all there was. Computers occupied huge glass rooms and were not used for document archival or retrieval. Today, of course, huge amounts of data and documents are stored on computers, and the process of document retrieval should be much faster. But the government doesn't really want to make it easier, because that would make government accountability far more transparent, and let's face it, people on the inside don't like transparency.
But the opacity of information is damaging. It creates asymmetries that favour organisations and disadvantage individuals.
An example: you go to your doctor when you are sick and they pull out a big file on you. If you see a specialist, the doctor forwards your file to them. Get surgery and that goes in the file too. How come doctors have your data but you don't get a copy by default?
Answer: the medical industry just hasn't worked that way in the past and giving up your data means they give up some control over you as a patient. If you had your own data, you could share it online anonymously and ask many other patients and practitioners around the world to offer you options that might help you deal with your illness and find a cure beyond the scope and capabilities of your local practitioners. That capability isn't in their interest as care providers, but is in your interest. Unfortunately, your health information isn't free to obtain, and the continued opacity of your own data hurts you.
Another example. Congress is being lobbied today to pass all kinds of new restrictions on copyright infringement. Websites may be taken down if infringement is alleged, and there's even a new bill proposed to make the streaming of copyright material illegal. Why is it that corporations get so much protection for their content but you and I enjoy almost none? Why can't we copyright our Personally Identifiable Information and force organisations to pay us to use it?
You don't have Information Freedom if you can't even control your own information.
Let's reform FOIA before we celebrate its 50th Anniversary and make the Freedom of Information a universal human right. Our government should be transparent, access to trusted information should be unhindered, cheap, and universal, and we as citizens and consumers should be able to exercise far more control over our own information as a fundamental right of freedom.
Recently, I played tennis with my son. At 16, he's tall and lanky like me, but full of boundless energy, and I have to play smart to keep up with him. I taught him most of what he knows in tennis and we both play at the same level - though I do enjoy when he wins. But on this day, there was no winning or losing. Our rallies were endless. We exchanged volleys, drops, topspin, and slice. If I won a point, he came back and won the next. There was no mercy and no letup. At one point, he sliced a ball low to my mid-court forehand and I had to rush across from the backhand side of the court to reach it. I'm not as fast as I once was, but on this day I crossed the court with speed. As I got to the ball and lined up a chip drop, I looked up and found that my intrepid son had already anticipated that move and was rushing to the net to cut me off. I stopped short and just laughed. I said "you know what I'm going to do next, don't you," and he said "like, yeah, I know all your shots." That happens when you play with your son, because we know each other so well.
We played out the rest of the match, and afterward I thought about that laugh we shared at the net as a metaphor for much of what I've learned about Data Governance, Risk Measurement, the financial crisis, and the challenges of information and knowledge. You see, people are best at anticipating what they expect - especially in situations that breed familiarity. That's the reason why Value at Risk (VAR) was such a seductively attractive formula: in a largely pro-cyclical business culture, a formula that helps you anticipate what you expect (that today will look mostly like tomorrow, yesterday, and the day before) is a winner. People who anticipate other outcomes are either brilliant visionaries who make "discoveries" (the minority), or outliers who make trouble (the majority).
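To see why VAR "anticipates what you expect," here is a minimal historical-simulation sketch in Python. The daily P&L figures are made up for illustration, not market data, and the quantile convention is simplified:

# Twenty hypothetical daily P&L observations, in $M.
returns = [0.4, -0.2, 0.1, 0.3, -0.5, 0.2, -0.1, 0.6, -0.3, 0.1,
           0.2, -0.4, 0.5, 0.0, -0.2, 0.3, 0.1, -0.6, 0.2, 0.4]

confidence = 0.95
losses = sorted(-r for r in returns)               # express losses as positive numbers
var_95 = losses[int(confidence * len(losses)) - 1]

print(f"1-day 95% VaR: ${var_95:.1f}M")
# Losses exceed this figure on only ~5% of days like the ones in the sample;
# it says nothing about the day the distribution itself changes.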
I began the year thinking that financial regulatory authorities could make better policy decisions if they had the right data. But I now understand that many of them had the right data in 2005, 2006, and even 2007, but they didn't understand it, chose to ignore it, or lacked the political will to make radical, outlier decisions that would adversely affect many key constituencies.
Hence my conclusion: Data Governance isn't enough. Collecting and aggregating data is an important step, but people need to understand what the data means as information, and that information needs to be communicated widely as knowledge. Not the finite biological knowledge we all have in our brains - the organic translation all of you reading this article are performing right now - but the metaphysical knowledge of a community knowing a common truth about the world so they are prepared to accept a decision to avoid an outcome they did not expect.
I don't care what kind of new Systemic Risk Council gets built at the Federal level of our government, or indeed what kind of new Regulatory Information Architecture is designed to support it. All of that is important, but not as important as the steps people take to disseminate the information, in both raw and interpreted form, to a wide and varied constituency. The more people inside and outside the group know what the group knows, the better chance we have that outliers will interpret things the group will miss. And it is upon those outliers - the ones who anticipate what we don't expect - that crisis prevention most rests.
This last point is the hardest. In the financial crisis, only a few economists like Nouriel Roubini predicted the credit crisis before it began. Most of the other economists predicted it perfectly only in hindsight. But Nouriel was largely dismissed by those economists and the media as "Dr. Doom," the naysayer who only saw the bad while so much good was going on. And that is human nature. If you aren't in the tribe of believers you are a barbarian, an outsider, who can't be trusted and must be demonized or destroyed.
This is of course very bad for the discovery of non-expected results - unless of course you ARE a barbarian trying to hack your way into the group, in which case you should be destroyed. Trusting what you know, where it came from, where it's going, and who's going to know it and do something about it will require new forms of transparency and self-governance. George Orwell wrote about the alternative, and we don't need to follow his example.
Because what we want is Trusted Information that empowers Doubt. Doubt about what information means is essential to effective decision making. And this is where I think a new Information Governance discipline is needed - one that focuses on the Information needs of Governance as well as the challenges of Governing the use of Information.
That's at least what I learned from my son on the tennis court last week. We'll see what he teaches me today.
Last week I hosted a Data Governance Executive Breakfast for 20 CIOs in Warsaw, Poland. It was my first trip to the former Iron Curtain capital, and I expected a concrete grid of grim apartment complexes and monumental communist office architecture. Instead, I found a lovely city still working hard, and succeeding, at erasing 50 years of Nazi occupation, annihilation, and communist oppression. Warsaw today is a gem of a city, with warm and friendly people, beautiful architecture, an eager business atmosphere, and a deep, historically rich intellectual tradition.
My one day in Warsaw was graced with gorgeous weather and a terrific morning event that combined both Data Governance content and XBRL. My partner in the Breakfast presentation was Michal Pienchofsky from Business Reporting AG, a Data Governance Council Member specializing in XBRL consulting who is based in Warsaw. Michal gave a terrific presentation linking Data Governance goals and structures to XBRL taxonomies, regulatory compliance, and business optimization.
After the event, I met an old family friend who lives in Warsaw. Stacy is the father of my brother-in-law, and in the summer of 1944, at the age of 16, Stacy joined the Warsaw Uprising and fought against the Nazis. It was a valiant and tragic effort that for three months engaged German units in a bloody campaign to win back the Polish capital. The effort was largely unassisted by both the Americans and the Soviets - who were actually sitting outside the city some 11 miles away and waited for the Germans to mop up the resistance before liberating what was left of Warsaw - rubble - themselves.
It happened that this summer marks the 65th Anniversary of the Warsaw Uprising, and Stacy took me on an uprising tour of Warsaw, showing me the manhole cover where he entered the sewer to cross the city underground and evade Nazi patrols, the intersection where his Gozdawa Battalion set up a barricade, the churches where Nazi tanks hid in waiting, and the many walls where bullet holes and plaques still mark the spots where thousands of Polish civilians were executed by the Nazis in reprisal for the uprising.
We visited the Uprising Museum, which is a fascinating and well done museum documenting the events of the uprising. They have the B-25 that the Polish Government in exile used to send supplies to the resistance fighters, replicas of the sewer pipes that you can walk and crawl through to get an idea of what it was like - without the sewage - and many photos detailing the grim battle and the utter destruction of Warsaw afterwards. The Nazis leveled the city after the uprising was crushed as an example to any other nation that wanted to rise up against their tyrannical rule. Not one building, not one facade even, was left standing in the city.
The lovely inner city that one sees in Warsaw today was completely rebuilt by the Communists after the war. I've been to Prague many times, where it is often remarked that the old city was preserved after the war because the Communists didn't have the money to put up new buildings. I think Warsaw demonstrates the lie of that assumption. Communists obviously love good architecture and cultural heritage as much as Capitalists do, because they did a marvelous job restoring Warsaw to some of its pre-war splendor. There are still many sites outside the inner city where scars from WWII are visible. I haven't seen that in other WWII-stricken cities, like Hamburg, which was 80% destroyed by Allied bombs. Just across the street from the Hilton Hotel where I stayed there were empty lots and the war ruins of buildings, which is quite amazing in the 21st Century.
But in the 20 years since the Iron Curtain came down, there have already been many modern changes to Warsaw, and I can well imagine that this city, with its great people, hunger for innovation, and rich traditions, will regain its former glory as a great city in the 21st Century.
I posted some photos I took while in Warsaw on Picasaweb. Have a look if you are interested:
It was a great trip business-wise and it certainly demonstrated the resilience of the human spirit even under the most barbaric forms of oppression.
In Europe, there is a proposal to tax Google, Yahoo, Apple, and others on the content they provide that passes over European telecommunications networks. The Europeans complain that their networks are over-burdened with content that comes from the USA, which they deliver to their subscribers, and they want a cut of the action. They claim that the cost of upgrading their bandwidth, especially in France, Spain, Italy, and Eastern Europe, to keep up with content is so overwhelming that the content providers themselves should pay for the inconvenience. France calls it the "Google Tax." I call it extortion and fraud. It's bad policy for Europe and it's bad policy for the Internet. In France, even though French Telecom has been deregulated, Alcatel-Lucent estimates that it will cost over 300 billion euros to upgrade the French broadband infrastructure to enable high-speed internet access for every French citizen. Other European telecoms, also former monopolies, agree. Someone has to pay for this, so why not the Americans?
It's their doing anyway, this Internet...
In France, where the "Google Tax" has already passed, it hurts French content providers more than Google, because their content is taxed too and they serve only the local French market. Google, meanwhile, has a global market, and only a small fraction of its revenues are affected in France. You have to love the irony...
By now, American readers are no doubt feeling quite superior since we haven't done anything as legislatively dumb yet. Don't.
Legislative stupidity is not the monopolistic right of Europe. In the United States we have the Protect IP Bill, introduced by Senators Patrick Leahy and Orrin Hatch. This Bill allows the US Attorney General to shut down websites that are alleged to provide copyrighted materials or counterfeit goods. Apparently, current copyright laws, which require copyright holders to demonstrate infringement in a court of law, are insufficient to protect IP rights in the United States. What the Protect IP Bill does is remove the inconvenience of due process and allow the US government or any IP holder to demand an injunction or restraining order from a court to shut down a website without a court case. This Bill is bad on so many grounds. The definitions of copyright infringement and counterfeiting are very loose. The lack of due process means any aggrieved party can use injunctions to shut down competition or ideas it wants repressed. And one person's piracy is another's business model.
Many organizations are using piracy for viral marketing.
Let's face it, with 2 billion people online today, people are producing a lot of content. Anyone can write a book, produce a video, record a song, post a blog, and almost everyone does. With all that content, getting noticed is very hard. And that's changing the business models of content production. Those changes are organic to the development of Information and Society.
The Protect IP Bill, if passed, would make it possible for political interests to allege copyright infringement of their ideas and shut down opposing parties' websites. It would empower ugly forms of censorship more at home in countries we'd rather rid of censorship than provide with new examples of it. It is a huge step backwards and is as pernicious and wrong-headed as the "Google Tax."
Government should keep its uninformed fingers the Heck out of that development. On both sides of the Atlantic, taxing and shutting down content should be seen as dictatorial remedies that will do far more harm than good.
We the Internet Generation want the Freedom of Information to connect the 5 billion people not yet online with information unfettered and uncontrolled to illuminate the world. The legislative examples above represent darkness and we should all reject and fight against them.
I spent the winter break visiting my sister in Point Reyes. Point Reyes is a small, rural town on a beautiful peninsula north of San Francisco. Miles of untouched seashore greet cow pastures, redwood groves, marsh, and gently wrinkled hills. The town has a small elementary and middle school where my sister teaches environmental science and my nephews get straight A's. My oldest son Ben compared the courses he's taking at his school in Port Washington, NY with the courses his cousin takes in Point Reyes and noticed a significant gap - up to 2 years in math and science. Now it's worth noting that my son doesn't always get straight A's in those classes in Port Washington, so we discount some of his analysis for self-preserving bias.
But looking at the difference in curriculum between this small rural school on the West Coast and the large suburban school on the East, one is tempted to ask "are people on the East Coast really smarter than those on the West?"
The answer: Of course not. But the way education resources are allocated via public monopoly creates a distortion that makes it seem so. Education resources are distributed at the state level by matching the supply of teachers to student demand using fixed ratios of how many warm bodies fill seats in classrooms. That means that small communities with low populations often get fewer educational resources than large communities. In an industrial society, which is the one that created this educational system, this formula is adequate because it provides basic levels of education to a large population whose primary productive value is measured in skilled and unskilled manual labor.
But in a Knowledge Society, where productive capacity is measured in intellectual value creation, this formula is dismally inadequate. A warm body in a chair is not an accurate articulation of intellectual needs for learning. Smart kids in California should have the same educational resource opportunity as smart kids in NY. But the State has made primary education a monopoly prerogative. It isn't that the sovereign control is wrong. It's that the model is not changing fast enough to meet the needs of a Knowledge Society.
In a Knowledge Society, every student in every school is a customer in an educational market that should be designed to cater learning to that student's individual learning potential and style. The Educational System would provide real-time transparency on student learning demands and fulfillment so that parents can actively evaluate how well the Educational System is meeting the needs of each Student Customer.
Today, even in a school like Port Washington HS, we parents only get mid-term progress reports, term report cards, and SAT scores to evaluate our child's progress in school, and only from a teacher-normative perspective. That is, we get to evaluate standardized grading done by the system as a reflection of our child's educational performance. But we don't get to evaluate how well each teacher is performing in terms of educating our children, which school policies have positive or negative impacts on our children, or other factors. It's a very recent innovation that teachers provide course materials on websites at all, and most are rarely available or responsive via email or phone.
As a parent, it's extremely frustrating to confront the Educational System with complaints about a teacher. Evidence is hard to accumulate and it always comes down to our word against theirs - in which case the teacher normally wins. This isn't a system designed to produce the best outcome for each student. It's a system designed to produce predictable outcomes for all students, even if a predictable percentage of them drop out at 16.
In a Knowledge Society, every drop-out at 16 carries a huge economic burden for the rest of society because that person has handicapped their potential intellectual value creation for many years if not for life. Even Students who only finish 4 years in University will be educated to a level that fails to match their Knowledge potential.
But if we had more information about what our children are learning, how they are learning it, and how each of their incremental projects, tests, and homework assignments contributed to their overall "grade," we parents could play a far more informed and intelligent role in the development of our children at home before and after school.
And if the school budget system were also calibrated on individual learning instead of warm bodies in seats, we parents as citizens might have more levers to force change in recalcitrant Educational Systems that are not meeting the needs of their Student Customers.
The Nation State, as the primary supplier of primary education in grades 1-12, needs new market mechanisms to meet the needs of the Knowledge Society in the 21st Century. There should not be a discernible curriculum difference in the learning opportunity provided in public schools from West Coast to East, North to South, except for the capacity of students to learn and the willingness of parents to monitor their progress.
I hope someday to have the power to choose school board members based on individual report cards of academic achievement of every student in terms of their ability to learn and the System's ability to meet their needs.
That day should not be far off if we hope to succeed as a nation in the Knowledge Society.
In the last five days, a lot of people have asked many great questions that I thought I'd answer on this page to provide a better accounting of what this is all about and what we hope will result.
Q: What is XBRL?
A: XBRL (Extensible Business Reporting Language) is an XML language for describing business terms, and the relationship of terms, in a report. It enables semantic clarity of terminology by standardizing a data model - the field names and their relationships - for reporting purposes.
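To make that less abstract, here is a minimal sketch of what a single tagged fact could look like. The namespace, element name, and figure are hypothetical illustrations of the XBRL idea, not elements of any published taxonomy; the snippet just shows how a number carries its meaning, period, and unit along with it.

import xml.etree.ElementTree as ET

RISK_NS = "http://example.org/risk-taxonomy"   # hypothetical namespace, for illustration only
ET.register_namespace("risk", RISK_NS)

root = ET.Element("{%s}report" % RISK_NS)
fact = ET.SubElement(root, "{%s}OperationalLossAmount" % RISK_NS)  # hypothetical element name
fact.set("contextRef", "FY2008")   # the reporting period the fact belongs to
fact.set("unitRef", "USD")         # the unit the number is expressed in
fact.text = "1250000"

print(ET.tostring(root, encoding="unicode"))

Because the tag, context, and unit travel with the value, any system that knows the taxonomy can read the fact without guessing what the number means.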
Q: Why do we need a Risk Taxonomy in XBRL?
A: Because Risk measurement, calculation, and reporting are mysterious, arcane, and underutilized business processes in banking and financial markets, and reporting standards can demystify, simplify, and commoditize risk calculation as a more ubiquitous part of business decision-making.
In the insurance world, risk measurement, calculation, and forecasting are THE BUSINESS. But insurance companies don't tell you what formulas they use to calculate your premium, how they determine their own reserves, or what protocols and methods they use to pay out claims. Actuaries study for years to learn these methods, and very few business professionals - and virtually no IT professionals - have any idea how risk is measured, calculated, and reported.
Q: But what do you mean by Risk Measurement? Don't we need Risk Management?
A: Sure. Risk Management is important. But only human beings can manage risk, and before we get there we need to measure past losses, compare them to current events, and forecast potential outcomes. Making a business decision without this analysis is risky. Making a business decision with this analysis is also risky, but when the inputs and decisions are recorded, we have the opportunity to learn from our mistakes and improve over time. We will never eliminate risk, but we can use scientific decision-making techniques to improve our odds.
Today, most people focus on Risk Management. They use qualitative risk assessments to imagine what kinds of vulnerabilities, loss events, and losses may be incurred from business activities. This is a valid method for forecasting and preventing potential losses. But the methods and results vary with the qualitative insight and skill of the practitioner, and they are dependent on disciplined application. Over time, it is very difficult to compare quantitative loss results to qualitative risk assessments.
We can leverage standards in risk measurement reporting to apply quantitative risk assessment to the practices of risk measurement and management so that inputs and outputs have a mathematical foundation. That foundation allows automation, and automation enables ubiquity of application. And that's the purpose of a standard - to enable widespread application and value - so that everyone can measure, calculate, and report risk without an actuarial degree.
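As a rough illustration of what that automation can look like, here is a small sketch in Python. The field names and loss figures are assumptions made up for the example; the point is only that once loss events share a standard shape, frequency, severity, and expected loss roll up mechanically.

from statistics import mean

# Illustrative loss records; in practice these would be tagged facts pulled
# from many systems, all sharing the same standardized field names.
loss_events = [
    {"year": 2006, "category": "operational", "loss_amount": 120000},
    {"year": 2007, "category": "operational", "loss_amount": 450000},
    {"year": 2007, "category": "operational", "loss_amount": 80000},
    {"year": 2008, "category": "operational", "loss_amount": 2300000},
]

years = {event["year"] for event in loss_events}
frequency = len(loss_events) / len(years)                       # events per year
severity = mean(event["loss_amount"] for event in loss_events)  # average loss per event
expected_annual_loss = frequency * severity

print("frequency: {:.2f} events/year".format(frequency))
print("severity: ${:,.0f} per event".format(severity))
print("expected annual loss: ${:,.0f}".format(expected_annual_loss))

The same roll-up works whether the records come from one department or a thousand firms, which is exactly the ubiquity a standard is meant to buy.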
Q: Why do we need risk standards?
A: One of the things we've seen in the current Credit Crisis is the ambiguity and confusion about risk. Regardless of whether you are a trader paid to take risks or an IT professional paid to avoid risk, it is nearly impossible to understand the incremental impact of your decisions on your department, your division, your company, your industry, your market, your economy, or your nation. There is just too much data today, and our regulators haven't tooled up to take advantage of the information companies could produce to help regulators and markets operate more transparently.
We know now in dramatic hindsight that incremental risks have systemic impact. People can only understand that impact when they can aggregate the incremental losses in the past, compare them to current circumstances, and make forecasts about the future.
To aggregate and compare risk data, we need standards and XBRL seems to us to be the most logical and effective tool to create those standards.
Q: How could the XBRL Risk Taxonomy be used?
A: These standards will enable more effective risk measurement and reporting within firms, new macro-economic tools for regulators and policy-makers, transparency for financial markets, and a more ubiquitous use of risk calculation in decision-making across innumerable disciplines.
Let me give you an example:
The insurance industry does risk calculation all the time. If you are a doctor, lawyer, accountant, or financial advisor, chances are you buy professional liability insurance. When you apply for the coverage, you tell your insurance company about yourself, your business activities, past losses, claims, and insurance coverage. The insurance company will compare your application to their own database of insureds, losses, and rates.
The insurance company will also compare your loss profile to claims data it purchases from the Insurance Services Office (ISO). ISO aggregates loss data from insurance companies across the US and provides anonymized records back to the same companies. Insurance companies need that 3rd-party verification of loss data for loss rating and trending. No matter how large an insurance company is, and no matter how many years it has been doing business and collecting loss history, everyone compares in-house data to aggregate industry data. It's a larger statistical sample size, and it helps everyone set aside the right amount of premium from each insured as reserves to pay out future losses.
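To see why the bigger sample matters, here is a simple sketch of credibility weighting, a standard actuarial idea (not anything specific to ISO or the proposed taxonomy) in which an insurer blends its own loss rate with the industry rate in proportion to how much experience it has. The rates and the full-credibility threshold below are illustrative assumptions.

def blended_loss_rate(own_rate, own_claims, industry_rate, full_credibility_claims=1082):
    # Credibility weight Z grows with the square root of the insurer's own claim
    # count, capped at 1.0 once it reaches the (assumed) full-credibility threshold.
    z = min(1.0, (own_claims / full_credibility_claims) ** 0.5)
    return z * own_rate + (1 - z) * industry_rate

# A small carrier with only 120 claims of its own leans mostly on the industry rate.
print(blended_loss_rate(own_rate=0.045, own_claims=120, industry_rate=0.030))

A small carrier with a few hundred claims relies mostly on the industry figure; a giant carrier with tens of thousands of claims relies mostly on its own history.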
We need the same kind of system in the financial markets. It is partially there today. Under the Basel II accord, banks are required to report the amount of gross income they set aside to self-insure against forecasted losses. But they only report that in the aggregate. No one is reporting the underlying data from which the risk reserves are calculated, and data reporting on that level could have huge benefits.
One benefit is that regulators could compare reported loss information across national and international economies. This would provide enormous new insight into macro-economic trends that could help reduce business cycle volatility.
Another benefit is that banks and financial firms could compare their own loss information to very large samples of industry losses. This would make their own forecasting models far more efficient and that would help everyone manage risks more effectively and reduce paid losses over time.
A final benefit is that markets and rating agencies would gain new insights into underlying exposures in financial instruments and that would enable far more accurate and timely forms of risk rating, making markets more transparent and efficient.
Q: Why is the Data Governance Council leading this standards initiative?
A: Because Risk measurement, calculation, and reporting within and between enterprises is not possible without semantic clarity around how we classify, describe, and document incidents, losses, events, formulas, and a host of other terminology. This is a very complex topic, and it is so easy to be confused and confounded by the terminology. Before we can all talk about this topic intelligently, we need a common vocabulary. That vocabulary will enable efficient communication and transferable methods and skills.
And this is very much a Data Governance challenge. The Data Governance Council has been studying these issues for four years and - together with our partners in the FSTC, EDM Council, OCEG, and other organizations - we think we can make a difference with this standard.
Q: Why would organizations want to apply XBRL to risk?
A: We can see clearly from the subprime credit crisis that there are still some non-standard methods for appraising risk. We don't have semantic interoperability to allow us to take an aggregate look at risk across multiple organizations. This makes it hard for companies and regulators to agree on what risk there is, and it is difficult to consistently report the risk companies are taking. XBRL can be a tool to help organizations use common standards for the way risk is described.
Q: What benefit would XBRL for risk reporting provide companies and regulators?
A: Translating risk reporting into a consistent software language will enable organizations to more easily perform advanced analysis and meaningful research, and to compare risk and loss history among multiple organizations. It could be used for internal or external reporting purposes. Regulators could potentially use it to create a global loss history database of anonymous credit, market, and operational incidents, events, and losses from every institution, much like the one the insurance industry relies upon. XBRL could make risk reporting simpler and more powerful, and that should create broad market benefits.
Q: What are the primary obstacles to the adoption of XBRL for risk reporting?
A: The real challenge is not in creating a risk taxonomy using XBRL. The challenge is getting agreement upon it and ensuring there is willingness worldwide to use it. That is why the Data Governance Council is seeking input from organizations and regulators worldwide.
Q: Who is supporting this initiative?
A: In addition to more than 50 IBM Data Governance Council members, the Securities and Exchange Commission, the Enterprise Data Management Council, the Financial Services Technology Consortium, the Open Compliance and Ethics Group, XBRL International and XBRL.US are all contributing to the process.
Q: How far along are you in the process today?
A: We have a starter taxonomy that we will begin socializing at an XBRL for Risk Forum on February 26-27 at the Levin Institute in New York. The Data Governance Council's role is that of a facilitator, seeking proposals and comments to begin defining a taxonomy for risk that can be agreed upon by many organizations worldwide. This work will continue through the first half of next year, with a final recommendation expected by the end of the year.
Last August seems like a long time ago. There have been so many miles traveled and so many purchases made since, that what I did on August 6th is beyond my recall. Amazon doesn't seem afflicted with the same memory challenges because yesterday I received a note from Amazon "customer service" notifying me that they would soon charge me for the 4 furnace filters I had not returned from my August 6th purchase. Sadly for Amazon, this was the last straw in an otherwise congenial relationship and the story, in all its data details, does illustrate the enormous challenges firms face today in understanding what their data means across many disparate systems.
As you can guess, I purchased some furnace filters from Amazon on August 6, 2009. They come 4 in a box, they are 25x20x5, and at that size the box costs $122 with shipping. Ordered on the 6th, a big box arrived on the 12th. Once opened, only one filter was apparent. A quick check of my account demonstrated that I had indeed ordered and paid for 4, but only 1 was sent. Try to call Amazon and you will find they have no number. In fact, trying to communicate with them is very difficult. An email form is buried in their site. I found it, and sent a note asking why they sent 1 when 4 were requested. Would the other three arrive shortly? I needed one urgently, but we use 4 a year so a full shipment was required.
Within a couple of days an apologetic note from Amazon was returned. They offered several attractive options:
1. Return the filter for a full refund.
2. Keep the filter and get a full refund.
3. Get a replacement shipment of 4 filters at no additional cost.
The first filter was already doing service in the furnace, so taking it out didn't seem convenient. I still needed more filters for the winter, and getting 4 more at no additional cost was very appealing, so I opted for that choice. In two weeks another large box arrived with the four filters. At this point I was very happy with myself and with Amazon and thought the matter closed.
Three weeks later, Amazon sent a strange note asking me to return the filters I had purchased. I didn't read it fully as I guessed it was a mistake and ignored it. A week later, Amazon debited $122 from my account for no apparent reason and when I checked my email I found another note explaining that since they had not received my RMA they would now debit my account for the purchased filters. I immediately sent Amazon a note asking why I was now being charged twice for the same filters. A day later, Amazon responded with an apology and claimed that they had two different customer service systems and the other one had somehow malfunctioned. Of course, this wasn't a straightforward transaction. I didn't just buy something and have it shipped. Amazon had made a shipping error and offered a non-standard solution that somehow didn't conform to their accounting system. Unfortunately, Amazon made their mistake my problem.
I thought the matter closed until yesterday, when I received another reminder note to send back my RMA Filters. I wonder what Amazon now wants with 2 dirty 6-month old furnace filters. I'd be happy to send them back along with all the household dust I can find because after this latest "customer service" snafu, furnace dust is about all that Amazon will be getting from me in the future.
The internet is the world's greatest strip mall. If one store can't meet your needs, there are others who will.
If you have a Data Governance program today, you already know it's easier to start one than to run one. Real governing is not like a Hollywood movie. It's hard to know what's wrong, why it's wrong, how to fix it, and how to get people to care about or follow the fixes. And you have to do this every day, and all the gurus tell you to get metrics and KPIs, build a framework, and follow their process. But those gurus don't live your life, they don't work in your space, and they don't have to make tons of messy compromises to get things done.
But you do, and you know that Governance is tough stuff.
In the Data Governance Council, we know that too and we want to help. We helped build the market with the landmark work we did on the Maturity Model. That gave you a way of knowing that what you already know isn't enough. You could use it to help others realize it wasn't enough too. And that gave you a place to start your program.
Well, now that you are in the thick of it, we think there's a way to communicate how your organization really works - to simulate your environment so you can help folks learn what's going on, how stuff gets done, and what would happen if you made some changes. We know you do that anyway, all the time. But we want to help you do it in a safe test environment before you put your ideas into production.
We call this Predictive Governance - the SCIENCE of describing the world as it is so we can run simulations of how we'd like it to be. Normally, most folks do it the other way around... They simulate the way they think the world works so they can describe how they want it to be...
Now I could tell you all about how this new way of working is going to look, how it's going to help you, and what it's going to do. But it's more powerful if you see it for yourself. What I'm sharing with you today is an early preview of the Predictive Governance Simulation we are building. It's not pretty or polished, but it works and you can play with it now.
Have a look and let us know what you think:
If you'd like to join the IBM Data Governance Council and help us do more with this, drop me a line.
Data Governance isn't a new word for the same old stuff. If your organization isn't achieving sustainable results from your data and information management projects, Data Governance can help. But you'll need to do more than just adopt a new name. You'll need to do something far harder - you will need to change how you work and how your IT systems work.
This isn't easy. Best practices, Maturity Models, and Starter's guides can help. But at the end of the day if you don't change, everything stays the same and the results are desultory and predictable.
I meet a lot of people who ask me about the Data Governance products or roadmaps organizations should buy. The best products you can buy are the ones that tell you what you don't already know. To govern effectively, you need to know what's going on in the context of when it is happening, what it means, and how it relates to other things. Governance without awareness is a dictatorship of ignorance - people make decisions in their comfort zones because they don't know any better, and don't know that they don't know any better either.
OK, nice words Adler but what does that really look like? It looks like Android.
Last week I switched from an iPhone 3GS to a Samsung Galaxy S. There were lots of reasons behind the switch; a primary motivator for me is that Android is based on Linux, which in turn is based on the collective contributions of a global community coordinating their ideas for the common good. I like that, and I like Android because as a mobile operating system it integrates lots of disparate applications to provide me with useful information when I need to know it.
Example: Boingo. Boingo is a wifi service that works in some 80,000 airports, hotels, and other hotspots around the world. You pay a monthly fee to Boingo to connect for "free" in these hotspots. Very handy for a global traveler. On the iPhone, there is a Boingo app, but you first have to connect the iPhone to a local hotspot and then see if Boingo works there. This is an example of the old, industrial model of application development: a single application developed for a single purpose that the operator has to initialize.
In Android, Boingo is integrated into the wifi backbone of the phone and the information notification system. As I drive around my neighborhood, the phone alerts me automatically when I enter a Boingo hotspot and can connect. It tells me what I don't know and helps me take advantage of services I may need. It is intelligent and by sharing information it offers me new opportunities. It gives me content and context, when I know I need it and when I don't.
That's the point of Data Governance. You need to learn what you don't know and help others to benefit from that information. You need to enable and empower new information sharing technologies and methodologies. Include the excluded, bring in the outliers, benefit from diverse points of view, and find new solutions to the age-old problems that have befuddled and bedeviled your organization for decades. You can't warm over the same old stuff and call it Data Governance. You can't govern data or manage information or knowledge, because these things are inert.
But you can govern people and empower their decision-making with trusted information and insight about what's going on every day that they don't already know. Because with knowledge, human beings can change their behavior and that's what Data Governance is all about - changing organizational behavior.
This isn't a small thing. This is a very big thing. It's about the influence of Information on organizational structures, how corporations change the way they work in an Information-driven transformation. This change isn't coming from within. We aren't transforming organizations with information. My god, if that were the case we would have succeeded decades ago with the first mainframes. What's happening today is that our organizations are being confronted with billions of new sources of autonomous information production we don't control. This is the mass of humanity communicating with each other over the Internet with the speed of now and the intimacy of a small village.
We aren't transforming with information; we are being transformed by information, and this is a wave of change we are either riding or drowning in.
Newspapers, Magazines, Music and Movie production are already being replaced by global and autonomous information distribution. Not everywhere, not all at once. But even the strongest brands feel the pressure and are adapting to change. In the beginning they will change their models of distribution. Soon after, they will change models of work.
Industrial models of organization - Thomas Gradgrind and the repetitive drudgery of assembly-line work, the process controls and enslaving stopwatch measurements of efficiency - these last vestiges of the way we worked in the late 19th and 20th Centuries hold on in our organizations like a virus resisting antibiotics. There are power structures invested in these models, and they will continue to hold on for some time yet to come.
But you need to ask yourselves. Where do you want to be working, in the past or in the future? Riding on the wave or under it?
Change isn't just a word. Data Governance isn't an option.
On Saturday, I took the train from Brussels to Cologne. The train is one of those modern ICEs - sleek, clean, quiet, and fast. The terrain through Belgium is hilly and the tracks pass over rolling fields, deep ravines, and wooded glens. As we neared the German border, the landscape leveled out and the train picked up speed, reaching 200 km/h at one point. And as the small towns whisked by, I couldn't help but think how magical it is to travel from Belgium to Germany by train with no border crossing and no passport control. It is so simple and easy, and without even a word you pass from one country to another.
This is a marvel of modern Europe, and it reminds me that the last 65 years are the longest period of peace in Central European history. Europeans have somehow, perhaps accidentally, realized a reality about modern warfare that has so far escaped the United States of America - modern war is Dumb Governance. For during the same 65-year period the United States has been involved in five large-scale wars lasting over 5 years each, 6 smaller military adventures, and of course one very long Cold War.
If you read my last blog post, you will understand my statement and reasoning that modern war is Dumb Governance. To paraphrase Von Clausewitz, war is the extension of diplomacy by other means. That is, it is an articulation of national policy - the communication of it.
Now back up a minute. If we have a policy that is communicated, according to the principles of Smart Governance it must also have had a decision-making process, some metrics and business case, hopefully either sustainable or situational goals, and some measurable results that we should care to compare to the goals.
In the old days, back before the Industrial Revolution, it took 8 people working on farms to support two people working in cities. That meant that you had to have a lot of arable land and unskilled labor to support those cosmopolitan types in cities who made all the decisions. War then was one means to acquiring more arable land for civilized expansion. If you conquered more territory through war, you could expect to feed more city dwellers who produced more income via trade and crafts and that made your society wealthier.
In the early Industrial Age, this logic began to wane because industrial capacity isn't only dependent on land and labor. Its also dependent on capital, and capital tends to dry up when tanks cross borders. Of course, natural resources are also important to industrial economies. But warfare tends to be a fairly resource intensive activity so gains won on the battlefield can be difficult to hold and the net benefit of acquired resources can be undermined by the resource drain of battle.
In the Information Age, knowledge is power and both intellectual labor and capital flow so freely throughout the world that warfare gains on the battlefield don't provide sustainable balance sheet benefits. In fact, they are a net cost to any society waging war.
Think about it for a minute. On 9/11 the World Trade Center was destroyed by heinous terrorists based in Afghanistan. Immediately, the United States sent 30,000 troops to invade that country. The stated goal of this policy was to protect Americans from terrorism. The measured need for the policy was the attack on 9/11. The policy decision was made by the President of the United States with full support of Congress and the American people. The policy was communicated with 30,000 American troops and a good contingent from international allies.
And the outcome? Eight years later, we are still occupying one of the poorest countries in the world with over 60,000 troops. Afghanistan is not even a real nation in modern terms. It is a tribal collage of small warlord-controlled fiefdoms. Pakistan is barely a modern state, and Afghanistan is 40 years behind Pakistan. Kabul has 4 million people and 95% of them have no running water in their homes. The GDP is only $12.8 billion. It has little agriculture, little industry, few natural resources, and no significant knowledge resources.
The war in Afghanistan has cost US Taxpayers $172 billion to date. That is 13 times the GDP of the entire country. We are spending more each year to wage war than Afghanistan is even worth.
Compare the Outcome to the Goals. From an economic perspective, it's a huge loss.
War today is a net economic loss for any country that wages it. Resource control is simply not worth the costs. The Europeans have figured that out. Would that the US could learn the same lesson... before we bankrupt our nation through warfare...