Two years ago, I met Helmut Willke, the author of Smart Governance: Governing the Global Knowledge Society, at a hotel cafe near the great cathedral of Cologne. Professor
Willke is a sociologist who teaches Global Governance at the Zeppelin
University in Friedrichshafen, Germany. Late in 2009 I became
interested in Governance as a system of decision-making and Professor
Willke had written an excellent book exploring this topic. While the
Professor is German, he writes extremely well in English, and his book is
very well written and insightful. Like a lot of philosophical texts, it
is not an easy read. Dense descriptions, long sentences, and theory
backed by ample example make it a book you have to read at least twice
to fully comprehend.
I was in Cologne in late February 2010 to meet the CIO of the City and attend Rosenmontag at City Hall.
I had already seen several days of Karneval, with the endless parades, costumes
and candy strewn about the streets. For five or six days in February,
the staid and reserved city of Cologne becomes an endless drunken party
attracting visitors from all over the world who wear outrageous costumes
and march in parades on incredible floats and throw candy to the
bystanders. It's unlike any parade I have ever seen. Quite amazing.
It had snowed a lot that year. It was white from Brussels to Berlin,
and Cologne was still covered by eight inches. The square in front of
the Dom was clear, and I had spent the morning before our meeting
visiting the Roman museum across the square. Cologne is an ancient
Roman city and the ruins are collected in a fantastic museum right next
to the Dom. Of course there are columns and pediments, but also beautiful mosaic floors, jewellery, stained glass,
and decorative arts. There is a model of the Roman city and you can
see how the Germans built the city on the same street grid with walls
built on top of the Roman walls. Of course, much of this was destroyed
by allied bombs in WWII, but some remnants remain.
Looking back at Roman colonial rule of Cologne was an excellent
introduction to the systemic ideas of Governance that Professor Willke and I
discussed over coffee that afternoon. He is not a tall man: mostly grey,
in his late 50s I would say, with bright blue eyes. He makes an immediate
impression, and is passionate about his book. I had used the book as
text for a class I taught at the Bucerius Law School on Data Governance
in Hamburg that January. My students did not entirely appreciate the
dense prose and abstract ideas, but through class conversation we did
ultimately appreciate the idea that Governance is a system of
decision-making that could be described and modelled. And we used
Social Networking metaphors to explore the idea of policy-making, human
behaviours in a system of Governance, and how to model potential
outcomes. Of course there is political science, which describes
political models of Governance – Democracy, Dictatorship, Monarchy, etc –
but what is unique and important about Professor Willke’s book is the
application of systems theory to Governance.
We had some coffee and talked mostly about how the Professor wrote
the book and why. As I had in 2007-8, the Professor had used the Global
Credit Crisis as a use case to describe failures in Governance. I had
covered this topic from a Data Governance perspective, arguing that
hundreds of incremental failures in business processes and data quality
had produced a domino effect that plunged the global economy into
Depression. He covered the topic from a decision-making perspective,
and while we approached this topic from different directions we arrived
at similar conclusions – policy-makers can’t possibly make the best
decisions without understanding the consequences of those decisions on
incredibly complex and interconnected global systems. And those
consequences are impossible to understand without new information
systems that render the complexity with software and illustrate how the
policies will be accepted and resisted.
In my class at Bucerius, my students complained that the Professor
had not done enough to provide solutions to the problems he had
identified, or that his solutions were too abstract. I presented these
criticisms to him at our meeting and he responded that it was not
possible to offer concrete solutions because every systemic problem
needs to be modelled to understand the variables and outcomes – that
there is no one size fits all. At the time, I thought this was a
dodge. It took me a few more years to understand that he was right.
There are no Governance Solutions that can auto-magically produce the
best outcomes for every decision. But it is possible for policy-makers
to use systems theory and software to construct decision-making models
that can plot many of the actors, objects, variables, and potential
outcomes to understand the impact of policies on complex systems made up
of hundreds, thousands, and even millions of human beings with unique interests and behaviours.
After my course, I synthesised concepts from the book with ideas from my students to create the Six Steps to Smart Governance.
It’s not meant to be a Framework. Frameworks and models are nice tools
to help people feel more secure about challenges they seek to overcome,
but they are not useful in making better decisions. The Six Steps are
meant to be a structure for decision-making that one would apply
iteratively, with each of the six steps involving different data
points and variables. Of course, it is highly summarised, flavoured
with marketing. And I would say in hindsight, it's not really useful as a
practical or operational tool. It’s really just a theory, a
simplification of the better documented ideas Professor Willke writes
about in his book.
And I think we can do better. In the IBM Data Governance Council we
will soon begin to explore dynamic simulation models that go far beyond
the Six Steps to Smart Governance, and I recommend reading both the white paper and Professor Willke’s book:
Smart Governance: Governing the Global Knowledge Society
Today, thanks to really powerful simulation software, we can create
dynamic models that help demonstrate the impact of policy on people,
processes, and technology. The Data Governance Simulation Project will
revolutionise the field of Data Governance by applying theory, software,
and observed practices to an interactive model that will yield powerful
insights into Data Governance Value Creation and Risk Mitigation.
A lot of people ask me, “how do I show the value of metadata?” Some
say, “how do I make the business case for Data Governance?” Consultants
and Gurus will have a framework or process to offer you, a get-started
guide with use-case examples, graphics, and legends about their
successes. But these myths won’t help you, because your challenges are
unique. Your politics are special, and your people are not machines.
Best practices are useful examples of glorified solutions that are very
hard to replicate outside the lab. And as many are already finding out,
people resist policies they don't think apply to them, and it's really
tricky to understand how to change organisational behaviours on an
on-going basis without policies that dynamically change with new conditions.
Data Governance is, by nature, a systemic challenge and you can’t
solve systemic problems without systemic solutions. Projects and teams
that expect quick hits and 90-day results are the reason you have systemic
Data Governance problems in the first place. But it is possible to
create software models that allow you to plot the goals, metrics,
policies, communications, outcomes, variables, and modifiers and
evaluate the impact of new policies and controls on your environment.
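To make this concrete, here is a toy sketch of the kind of policy-impact simulation described above. It is a hypothetical illustration, not an IBM tool or the Council's model: actors with varying propensities to comply, a policy with an incentive strength, and peer influence that feeds back into adoption.

```python
import random

# Toy agent-based sketch (hypothetical): each actor has a compliance
# propensity; a policy applies incentive pressure; the current adoption
# rate exerts peer pressure. We track how adoption spreads or stalls.

def simulate_policy(n_actors=1000, incentive=0.3, peer_weight=0.4,
                    rounds=10, seed=42):
    rng = random.Random(seed)
    # Each actor starts with a random propensity to comply (0..1).
    propensity = [rng.random() for _ in range(n_actors)]
    adopted = [p > 0.8 for p in propensity]  # early adopters
    history = []
    for _ in range(rounds):
        adoption_rate = sum(adopted) / n_actors
        new_adopted = []
        for i, p in enumerate(propensity):
            # Total pressure = own propensity + policy incentive + peer effect.
            pressure = p + incentive + peer_weight * adoption_rate
            new_adopted.append(adopted[i] or pressure > 1.0)
        adopted = new_adopted
        history.append(sum(adopted) / n_actors)
    return history

weak = simulate_policy(incentive=0.05)
strong = simulate_policy(incentive=0.35)
print(f"weak incentive final adoption:   {weak[-1]:.0%}")
print(f"strong incentive final adoption: {strong[-1]:.0%}")
```

Varying `incentive` and `peer_weight` shows the point of the exercise: a weak policy stalls at its early adopters, while a stronger one tips the peer-pressure feedback loop, and you can observe that before deploying anything.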
And that’s the lesson of Smart Governance: you can model complex
environments through Simulation and make better decisions. To learn
more about using Simulations to make better decisions, take a look at
the IBM Smarter Cities Demo.
In that demo, the complex interactions of human beings living in a city
are compared to the goals of human policies, the metrics measured by
interactions, and potential outcomes.
Many of our organisations are as complex as small cities. Policy and
Politics share the same ancient Greek root word – polis. A polis is a
city, which itself is an aggregation of human beings who require
Governance to arbitrate their diverse interests and achieve better
outcomes for all. Today, we can simulate those interactions and help
Policy makers profile the impact of their policies before they are
deployed. It's a kind of Visual Risk Calculation.
If you would like to participate in the Data Governance Simulation
project, please read the Six Steps to Smart Governance White Paper, the book
by Professor Willke, and join the IBM Data Governance Council by executing this membership agreement.
Only members of the Council will be able to participate in this
exercise and you don’t want to miss this because it will fundamentally
change Data Governance.
On Saturday, I sat with an old friend at a secluded restaurant on a grassy river bank North of Bangkok. We are both actively engaged in the banking industry as observers, speakers, and peripheral participants. My friend has a more direct engagement with a Thai Bank but still as an adopted outsider. Lunch was excellent, and we sat on a wooden pier just feet from the river's edge as barges, trawlers, and all manner of ships slowly passed by with and against the current. A pair of large floor fans blew hot air our way and an umbrella shaded us from the searing sun playing tag with the clouds above. The heat in Thailand is soft, enveloping, pervasive, and quietly oppressive. You have no hope of resisting its dictatorship. Somehow the Thais have developed a sweating immunity to their own condition, whereas this Western visitor is deficient in that regard.
During lunch we compared current events in Thailand, where the Red Shirts have barricaded themselves behind sharpened bamboo poles and tires doused with gasoline. Their encampment was many miles from our lunch spot, and indeed encompasses but a small corner of the entire city of Bangkok. Yet their determination to resist the current government, which itself came to power after a similar incident involving a Yellow Shirt protest two years ago, has driven away western tourists and continues to cause confusion and insecurity in the highest elements of Thai society. And we discussed the Credit Crisis, Greek Debt, US Politics, and Regulatory Reform.
On Greek Debt, we discussed how the former Greek government hid the massive debt it had accumulated from EU Regulators (reporting a deficit of only 3.5% each year instead of the 12% it was actually accumulating), and how this massive amount came to light only with a change in government - when one group had an interest in reporting the bad data that another group had an interest in hiding. Most today call this an act of Fraud, but it also has to be admitted that it was not just the former Greek government who had an interest in hiding their debt. The Germans, French, Belgians, and perhaps even the European Central Bank had an interest in ignoring the reality of Greek economic underdevelopment and overextension.
The data about Greek debt was available. Greece can't borrow on the black market. Their debt has to be issued in
bond markets, and the amounts, yield, and maturity dates are all public
record. Bond markets are largely transparent. But Transparency creates its own information asymmetries. First, the availability of information doesn't mean everyone collects the same amounts, has the power to use it, or knows what it means. Second, there is a private sector deference to public sector data aggregation, analysis, and reporting, and the public sector relies on static information reporting programs that limit source authentication, audit, and repudiation. These two behaviors allowed the Greek Government to report fraudulent deficit figures to the EU and the EU didn't bother to verify that information against publicly available market data.
One could argue that the construction and expansion of the EU Common Currency without adequate audit powers created an environment rife for fraud, but this is too easy. EU regulators could have at any time used data from bond markets to verify Greek debt. Why the EU didn't monitor the discrepancy between public reports and private market data has more to do with EU politics than Data Governance.
Every government is comprised of politicians who owe their hold on power to public perception. Everyone in Europe played See No Evil, Hear No Evil, Speak No Evil on the subject of emerging market debt in the EU. The information was available. Net inflows of financing and debt accumulation can be gained by studying the bond markets. Public obligations in Greece are also no secret. Everyone in Europe knew that pension guarantees starting at age 50 in Greece were a ridiculous luxury in a country with such low productivity and wages.
Transparency and Reporting do not, in themselves, guarantee that anyone is using or validating information sources correctly. Every report needs to be validated with external sources, because Transparency is not the same as the Truth. If the EU wants to fix this structural problem in its own multi-nation confederation, it will need to create an independent auditor, like the US Government Accountability Office, whose role it is to audit member programs and reports, to discover waste and abuse.
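As a concrete illustration of validating a report against external sources, here is a minimal sketch. The issuance amounts, GDP figure, and tolerance are hypothetical, chosen only to echo the 3.5%-versus-12% discrepancy described above; they are not actual EU statistics.

```python
# Minimal report-validation sketch: cross-check a government's reported
# deficit against the debt growth implied by public bond-market records.
# All figures here are illustrative, not actual EU or Greek statistics.

def implied_deficit_pct(new_issuance, redemptions, gdp):
    """Net new borrowing as a share of GDP, from public bond-market data."""
    return 100.0 * (new_issuance - redemptions) / gdp

def validate_report(reported_pct, market_pct, tolerance_pct=1.0):
    """Flag a report whose figure diverges from the market-implied figure."""
    discrepancy = market_pct - reported_pct
    return {"reported": reported_pct,
            "market_implied": round(market_pct, 1),
            "discrepancy": round(discrepancy, 1),
            "flag": abs(discrepancy) > tolerance_pct}

# Hypothetical year: 40bn issued, 12bn redeemed, 230bn GDP, 3.5% reported.
market = implied_deficit_pct(new_issuance=40e9, redemptions=12e9, gdp=230e9)
result = validate_report(reported_pct=3.5, market_pct=market)
print(result)  # the ~12% market-implied figure contradicts the 3.5% report
```

The check is trivial; the point is that it runs against a source the reporting entity does not control.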
All reported data must be verified. If we didn't learn this in the Mortgage Credit Crisis, now is the time to take it home in the Sovereign Credit Crisis.
Banks, Hedge Funds, and other investment institutions should not wait for the EU and other governments worldwide to get the audit role right. They should build their own Information Analytics programs to validate the assertions of governments as well as listed companies because what Greece did is not new. Fraud is a part of business.
Data Validation should be seen as an important part of Market and Credit Risk Measurement and Mitigation programs. This is where Data Governance and Risk Management intersect, and new technologies will be needed to make reporting aggregation and analysis easier and faster.
On the river, in Bangkok, I asked my friend if his bank monitored the market and credit activities of their Thai competitors. They do not. They expect the government to collect data from every bank, aggregate and report that to the banking community. And his bank reads those reports. I would argue that the events of the last three years clearly demonstrate that governments are not well equipped to be doing primary market data analysis on behalf of themselves or any industry. They lack the technology infrastructure and the analytical skill to make intelligent use of the data the market already provides and their political dependencies create natural conflicts of interest.
Businesses must perform their own due diligence to verify government reports and conduct primary market data analysis of every potential investment opportunity.
Unverified data should not be trusted. This is Data Governance Rule #1.
It starts with the definition. Systemic Risk is the risk inherent to an entire market or market segment, so says one website. From the definition, we can already see that systemic risk is primarily concerned with the prevention of risks to The System. The System in question is the global financial "system." So the goal of Systemic Risk is the prevention of loss to those involved in The System.
You might ask, "why is Adler focusing on the obvious?" Well, I don't think it is that obvious who The System is and what their interests are. There are lots of well meaning people running around trying to craft new laws and methodologies to assess and prevent Systemic Risk. Most of them will fail without first understanding the needs and interests and goals of The System.
The Goal of every System should be to serve the needs and interests of The Customer. Corruption of The System is when individuals or groups place the needs of themselves as actors in The System above the needs of The Customer. When The System is mostly serving the needs of itself, it is mostly corrupt.
The Global Financial System has had corruption for decades. In the last decade, the influence of Systemic Actors exercising the needs of themselves over the needs of The Customer has become acute. The Financial Meltdown of the past 30 months is the result of this imbalance.
So I wonder, which new brew of experts and which new conference will measure the needs of The System and compare them to the needs of The Customer to assess Risk?
I have a self-serving answer:
We want our leaders to make the right decisions. But often they lack the right information, and when they don't have it, they "trust their gut." Unfortunately, the stomach is the wrong organ for decision-making, and ignorance, fear, and prejudice are poor substitutes for trusted information and rigorous analysis. And we know the result.
We don't want every decision belabored by bureaucracy. We want decisions that are Smart; informed by what we know we don't know, contextual to past experience, compared to current conditions, and prepared for future exigencies.
Few organizations have mastered this process today. Few in fact have any notion of what information they need to make Smart Governing Decisions. But everyone wants to. The aspiration is universal, because the competitive challenges in a flat world make bad decisions very costly.
Smart Decisions demand Smart Information - Information about what's going on. Operational Awareness. That's the kind of information we will be exploring at The Smart Governance Forum I will be hosting on February 1-3, at the Ritz Carlton Half Moon Bay in California.
I think we can make a difference and I hope you will join us. Smart Governance Forum Agenda
Last night, I was one of two panelists at a Global Association of Risk Professionals (GARP) symposium on Systemic Risk at Fordham Business School in New York. There were to be a moderator and three panelists, but one canceled at the last minute, presumably to stay home and watch the Yankees lose to the Phillies. The room was on the 12th floor of a mid-60's squat tower, accessible from two elevators among a bank of six in the stone-cold, open, office-like lobby. Twelve is the top floor in the building, with a Rockefeller penthouse atmosphere: black marble floors, mahogany paneling, subdued sixties swank.
The symposium room was longer than wide, seated classroom-style for one hundred in three neat blocks. We panelists were paired at a white-clothed table with microphones we didn't need. The moderator introduced us both: the NYU Business School professor and the IBM Data Governance guy. The audience looked half-asleep, and the first question rolled out on the table, "What is Systemic Risk?" Our gracious moderator had prepared a raft of intelligent questions for us that evening, but we would only get through two in the brief hour we had.
What is Systemic Risk? The professor told us it was the result of exogenous market conditions that created upper-atmospheric bubbles in complex derivative instruments capable of devastating global economies. It could be measured in the up- and down-swing of aggregate equity performance and controlled through the central banks he currently advises. He saw Systemic Risk as a macro-economic phenomenon, the product of weak government regulation, greed on Wall Street, outrageous compensation packages, and unnecessary complexity in financial markets.
Before the event, I wasn't quite sure what I was going to talk about. It was a hectic Monday full of ten conference calls on twenty different topics. I left late, hit traffic on the Grand Central, got lost at Lincoln Center looking for parking, and there was no coffee when I arrived. I'm not an evening person un-caffeinated, and perhaps not the best morning person in the same condition. But droll media babble passed off as tenured professorial wisdom will rouse me on the sleepiest of days.
Systemic Risk is the probability of loss to a system. It is not actually a thing that can be calculated. It is a series of things that result in a loss event with causality and impact. Systemic Risk is not only about macro-economic catastrophe, because to say so is to say that we are not involved in Systemic Risk except as victims. And that ain't true. Insofar as all of us, The People, are members of communities, parties, religions, nations, and environments, we are part of a System. We are inter-related, inter-dependent, capable of causality, errors and omissions, losses and claims. Each incremental failure can cascade and result in systemic exposure.
The Credit Crisis is the result of a series of public policy mistakes from 1999 to 2006 that encouraged bad business practices at many different stages of the mortgage underwriting and securitization process. These were incremental failures that contributed to loss events that destroyed parts of the economic systems upon which markets rely. The lesson to humanity from this experience is that We The People are all members of SYSTEMS large and small that can fail as a result of incremental policy mistakes. Actuarial Science has for too long focused on the probabilities of contained loss events.
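The cascade of incremental failures can be sketched with a toy dependency-graph model. This is an illustration of the mechanism, not a calibrated market model: institutions depend on one another, and each fails once enough of the parties it depends on have failed.

```python
# Toy cascade model (illustrative only): dependencies[n] is the set of
# nodes n depends on; failure spreads to a node once more than
# `threshold` of its dependencies have failed.

def cascade(dependencies, initially_failed, threshold=0.5):
    failed = set(initially_failed)
    changed = True
    while changed:  # iterate until the failure set stops growing
        changed = False
        for node, deps in dependencies.items():
            if node in failed or not deps:
                continue
            if len(deps & failed) / len(deps) > threshold:
                failed.add(node)
                changed = True
    return failed

# Five institutions in a chain of mutual exposure: one incremental
# failure at "A" propagates through the whole system.
deps = {"A": set(), "B": {"A"}, "C": {"A", "B"},
        "D": {"B", "C"}, "E": {"C", "D"}}
print(sorted(cascade(deps, initially_failed={"A"})))  # → ['A', 'B', 'C', 'D', 'E']
```

With no initial failure the system is stable; with one, the whole chain falls. That asymmetry, not any single node's riskiness, is what makes the risk systemic.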
My body is a SYSTEM and Cancer is a systemic risk to me. It causes a
chain of events which can result in organ failure and death. Your company
is a system, and bankruptcy is a systemic loss event. If bees die,
plants won't be pollinated, and that can be a cause of systemic
risk to our ecoSYSTEM. The BBC Reports
(http://news.bbc.co.uk/2/hi/science/nature/8338880.stm) that record
numbers of plants, mammals, and amphibians are under threat of
extinction. This is a systemic risk. When entire species of frogs in
remote places like Tanzania become extinct in the wild, humans take
note - this incremental failure is closer to your role in the food
chain than you may think.
Every System has risk. Every person in every system has a role.
If we accept the gossip-press gospel that the Credit Crisis is purely
the result of greed on Wall Street, and can only be fixed by wise regulators in Washington, shame on all of us for missing the opportunity to internalize the economic externalities. It is not an academic exercise to study the risk in every system large and small. Systemic Risk is a real-world imperative for all of us.
Recently, I played tennis with my son. At 16, he's tall and lanky like me, but full of boundless energy, and I have to play smart to keep up with him. I taught him most of what he knows in tennis and we both play at the same level - though I do enjoy when he wins. But on this day, there was no winning or losing. Our rallies were endless. We exchanged volleys, drops, topspin, and slice. If I won a point, he came back and won the next. There was no mercy and no letup. At one point, he sliced a ball low to my mid-court forehand and I had to rush from the backhand side of the court across to reach it. I'm not as fast as I once was, but on this day I crossed the court with speed. As I got to the ball and lined up a chip drop, I looked up and found that my intrepid son had already anticipated that move and was rushing to the net to cut me off. I stopped short and just laughed. I said "you know what I'm going to do next, don't you," and he said "like, yeah, I know all your shots." That happens when you play with your son, because we know each other so well.
We played out the rest of the match, and afterwards I thought about that laugh we shared at the net as a metaphor for much of what I've learned about Data Governance, Risk Measurement, the financial crisis, and the challenges of information and knowledge. You see, people are best at anticipating what they expect - especially in situations that breed familiarity. That's the reason why Value at Risk (VaR) was such a seductively attractive formula - in a largely pro-cyclical business culture, a formula that helps you anticipate what you expect (that today will look mostly like tomorrow, yesterday, and the day before) is a winner. People who anticipate other outcomes are either brilliant visionaries who make "discoveries" (the minority), or outliers who make trouble (the majority).
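Historical-simulation VaR makes the point in a few lines: it literally answers "what loss should I expect, assuming tomorrow is drawn from my sample of yesterdays?" The return series below is synthetic, generated only for illustration.

```python
import random

# Historical-simulation VaR sketch: sort past returns and read off a low
# percentile. It anticipates only what the sample already contains -
# which is exactly the pro-cyclical seduction described above.

def historical_var(returns, confidence=0.95):
    """Loss threshold exceeded on only (1 - confidence) of sampled days."""
    ordered = sorted(returns)
    index = int((1.0 - confidence) * len(ordered))
    return -ordered[index]  # reported as a positive loss figure

rng = random.Random(7)
# A placid regime: small drift, modest volatility (synthetic data).
calm_days = [rng.gauss(0.0005, 0.01) for _ in range(1000)]
var_95 = historical_var(calm_days, confidence=0.95)
print(f"95% one-day VaR from calm history: {var_95:.2%}")

# The day the regime changes, calm-history VaR says nothing useful:
crash_day = -0.08
print(f"crash-day loss vs. VaR: {-crash_day:.0%} vs {var_95:.2%}")
```

The formula is not wrong; it is honest about the past and silent about regime change, which is why it flatters anyone inclined to expect more of the same.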
I began the year thinking that financial regulatory authorities could make better policy decisions if they had the right data. But I now understand that many of them had the right data in 2005, 2006, and even 2007, but they didn't understand it, chose to ignore it, or lacked the political will to make radical, outlier decisions that would adversely affect many key constituencies.
Hence my conclusion: Data Governance isn't enough. Collecting and aggregating data is an important step, but people need to understand what the data means as information, and that information needs to be communicated widely as knowledge. Not the finite biological knowledge we all have in our brains - the organic translation all of you reading this article are performing right now - but the metaphysical knowledge of a community knowing a common truth about the world so they are prepared to accept a decision to avoid an outcome they did not expect.
I don't care what kind of new Systemic Risk Council gets built at the Federal level of our government, or indeed what kind of new Regulatory Information Architecture is designed to support it. All of that is important, but not as important as the steps people take to disseminate the information in both raw and interpreted form to a wide and varied constituency. The more people inside and outside the group who know what the group knows, the better chance we have that outliers will interpret things the group will miss. And it is upon those outliers - the ones who anticipate what we don't expect - that crisis prevention most rests.
This last point is the hardest. In the financial crisis, only a few economists like Nouriel Roubini predicted the credit crisis before it began. Most of the other economists predicted it perfectly only in hindsight. But Nouriel was largely ignored by those economists and the media as "Dr. Doom," the naysayer who only saw the bad while so much good was going on. And that is human nature. If you aren't in the tribe of believers you are a barbarian, an outsider, who can't be trusted and must be demonized or destroyed.
This is of course very bad for the discovery of non-expected results, unless of course you ARE
a barbarian trying to hack your way into the group, in which case you should be destroyed. Trusting what you know, where it came from, where it's going, and who's going to know it and do something about it will require new forms of transparency and self-governance. George Orwell wrote about the alternative, and we don't need to follow his example.
Because what we want is Trusted Information that empowers Doubt. Doubt about what information means is essential to effective decision making. And this is where I think a new Information Governance discipline, one that focuses on the Information needs of Governance as well as the challenges of Governing the use of Information is needed.
That's at least what I learned from my son on the tennis court last week. We'll see what he teaches me today.
This morning, EU Regulators announced that they propose to create a Risk Board to monitor financial market performance and systemic risk indicators among the 27 member nations in the European Union. I've advocated a Council approach to risk-based decision-making since the beginning of this year, and I think the EU proposal is a good idea in concept. Unfortunately, in Europe it seems decision-making takes a large number of people, because the European proposal would have 63 people participating on the Risk Board. A deliberative body with 63 people is not a "Board" - it is a legislature. To complicate matters, "only" 32 members of this board would have voting rights. Unfortunately, the only power they can vote on is a warning to member states that some part of their market performance contains systemic risk. How they plan to determine that threat and get everyone to agree on what it means in any reasonable amount of time is not clear. My guess is that this is a proposal to set up an intra-governmental think-tank that will study issues, write economic reports that no one reads, and only threaten to issue warnings, because a vote on a warning will never happen.
Note to Obama Administration: If you want to create a Systemic Risk Regulatory Structure that is guaranteed to fail due to political indecision and lack of authority, copy the EU model.
On September 14, David Bogoslaw published an article in BusinessWeek
entitled "How Banks Should Manage Risk." Rick Bookstaber and I are
quoted in this article because we first had an interview with David
following the XBRL Risk Taxonomy Meeting I hosted at the Levin
Institute in New York on May 13, and we had follow-up interviews two
weeks ago. As is the case in any press interviews, some of what you
say gets printed and a lot doesn't. In this case, I think much of the
substance of what I told David was out of scope for the BusinessWeek
audience and the goals of his article.
In terms of a banking audience, David gets it all right, and I agree with
Rick Bookstaber's comments too. But what the article omits is the fact
that from 1999 to July of 2008 the US Congress, the White House, FHA,
the SEC, and the US Federal Reserve all participated in an
industry-backed weakening of the financial regulatory framework that
was built in the 1930's. In 1999, The Financial Services Modernization
Act (named Gramm-Leach-Bliley, or GLBA for short, after its authors)
removed 70-year-old restrictions on bank, investment bank, and insurance
cross-ownership. At the same time, derivative market oversight was
specifically excluded from GLBA and financial markets were allowed to
create and trade complex derivative instruments without regulatory
reporting or control.
In 2001, President Bush exhorted
Americans to "go shopping" to support the US economy following 9/11 and
the Federal Reserve obliged by cutting interest rates down to 1% to
pump liquidity into the US market. In 2004, Congress lobbied Fannie
Mae and Freddie Mac to relax underwriting guidelines on home loans to
allow sub-prime borrowers to participate in "The American Dream," and
own a home, and FHA provided loan subsidies to make it easier. In
2006, Congress pressured the same GSEs to relax underwriting on Alt-A
mortgages, allowing self-employed individuals to declare their income
with a signed affidavit instead of documenting their income through tax
filings. As I've written in past blogs, that change gave license to
mortgage fraud across the country as Alt-A borrowers could make wild
income declarations without validation and that pumped tens of
thousands of fraudulent mortgages into the global financial system.
This change wasn't reversed until July 2008, when the Federal Reserve
finally changed Alt-A underwriting guidelines. The long tail of the bad
mortgages underwritten from 2006 to 2008 means we will suffer
significant foreclosure rates well into 2011, extending the depth and
breadth of this recession.
2006 proved to be the top of the
Housing Market in terms of house valuations and bank fees generated
from loan securitization and derivative markup. The pile-on
legislation and market encouragement from Congress, the White House,
and the Federal Reserve came from industry pressure to keep the party
going as long as possible.
Yes, Banks took on too much risk
from 2001 to 2007. But the US Government encouraged and enabled
excessive risk taking during that period, and both need to be monitored
to prevent future crises. There is an inherent conflict of interest in expecting the government that enabled the current credit crises to participate in the forecasting and prevention of the next one.
There is a history of financial
de-regulation followed by marked innovation and crash that goes back
100 years in the US. The innovation generates enormous wealth on Wall
Street and new tax revenues for Federal, State, and Local Governments.
The relationship between government enablement and financial innovation
was omitted in David's account and needs closer scrutiny, because
policy-makers and the public will need new information management tools
to understand the impact of incremental policy decisions on financial
market performance over the longer term if they are to regulate wisely
in the future.
In the article, I recommended that the government create a new Regulatory
Information Architecture, modeled on the Information Sharing Councils
created by the Bush Administration for terrorism intelligence gathering
following the 9/11 Commission Report and the Intelligence Reform and
Terrorism Prevention Act (IRTPA) of 2004. But more is needed.
A year ago, I believed that new information technology and data
collection would enable the US Government to better analyze the
performance of financial markets and forecast potential bubbles and
crises. I'm sure that enhanced information sharing in the US
Government will enable better regulatory enforcement, but it's not
enough to prevent future crises. The public needs to play a role in
the oversight process because the Government has its own interests
which are not always perfectly aligned with those of the public.
Administrations change, and with those changes come new philosophies of
governing and regulation, and in a democracy like ours you always want
to enable some to examine and report the information that others disregard.
Therefore, what's needed is more information
transparency about market holdings and the actions of market
participants so that anyone in any firm, university, or industry
watchdog can analyze nearly the same macro and micro economic data that
federal regulators observe, and make their own forecasts and assessments.
Without public access to better market data, we are just enabling government to encourage risk taking more efficiently in the future.
You can read the BusinessWeek article here: http://www.businessweek.com/print/investor/content/sep2009/pi20090914_336015.htm
Data=Information=Knowledge. Or so we would like to say. In theory, data is unorganized information, and knowledge is information put to use by human beings. But theories are for academics. And this theory is super convenient if semantic consistency is important. There are Data Architects who only think about data in databases, Information and Content Architects who only work with unstructured repositories, and even Knowledge Architects who I suppose work with information taken out of human brains and put into... structured or unstructured repositories on computers...
In real life, in real companies, these are artificial distinctions. Organizations want to control data/information supply chains because they are full of quality control problems, security vulnerabilities, compliance challenges, and operational exposures. Those risks imperil decision-making, increase operational costs, and reduce revenue opportunities. Quality control and risk mitigation are challenges for every data type.
Five years ago, "Data Governance" seemed like a great name for a new discipline to help transform organizational behavior from vertical to horizontal; because information is transformational. What we meant then and mean now is not just about "Data" in the purest structured sense. We mean Data in the most plural and unlimited sense. People want to govern other people's use of all kinds of information in every form.
No data stovepipes please! We need Data Governance Solutions for all human uses of information regardless of their form or structure, use or abuse.
Anyone who tells you different is just so 20th Century...
On October 7-9, I will be hosting a conference on The Future of Data Governance at the Mohonk Mountain House (www.mohonk.com) in New Paltz, NY. This event has been designed to explore the challenges and solutions of Data Governance organizations constantly ask about:
1. How do I transform data into an asset? Data isn't an asset until you make it one, and it's not an asset like gold, stocks, or oil. Those assets have commodity values based on their scarcity and demand. Data is an asset with infinite availability, so its value can't be based on the amount you own or the amount someone wants. The value of data is purely perceptional, unless there is a market for that data. iTunes, DVDs, newspapers, and cable TV are all examples of data with values based on market demand through external sales channels.
But internally, we have no market for data sales. So the best we can do within an enterprise is increase the perceptional value of data as an asset. It has a perceptional value to the Business when IT can demonstrate incremental revenue obtained through data consolidation, aggregation, cleansing, business intelligence, and new sales.
Your data either is producing new revenue or it isn't. But when it is, it's easy to get business and operations to notice and care about how the uses of this data are governed. At this conference, we'll hear from customers who are struggling with these issues as well as those who have solved them. And I think we'll see that there are indeed best practices in working horizontally with Trusted Information that make it a cause célèbre for governance.
2. What are the risks to data assets everyone in the organization should be aware of? There are so many risks and liabilities from working with data today. We read about data breaches, privacy violations, and compliance challenges so often we become inured to the issues. But when Data becomes a perceived asset in your organization, knowing which risks to mitigate, avoid, or transfer out is critically important. Because no one has infinite resources to protect against every exposure, new methods in risk calculation, embedded deep in business processes and decision-making, are needed. And risk calculation can only take place when past mistakes and losses are accurately recorded, trended over time, and integrated into BI applications.
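The record-and-trend step described above can be sketched in a few lines of Python. This is a minimal illustration only; the loss categories and dollar figures are purely hypothetical and not drawn from any real loss database:

```python
from collections import defaultdict

# Hypothetical recorded loss events: (year, risk_category, loss_amount_usd).
# Every name and figure here is illustrative, not real data.
loss_events = [
    (2006, "mortgage_fraud", 1_200_000),
    (2006, "data_breach", 300_000),
    (2007, "mortgage_fraud", 2_500_000),
    (2007, "data_breach", 450_000),
    (2008, "mortgage_fraud", 4_100_000),
]

def annual_losses(events):
    """Aggregate recorded losses by (year, category) so trends become visible."""
    totals = defaultdict(float)
    for year, category, amount in events:
        totals[(year, category)] += amount
    return dict(totals)

def trend(events, category):
    """Return year-over-year loss totals for one risk category, sorted by year."""
    totals = annual_losses(events)
    return sorted((y, v) for (y, c), v in totals.items() if c == category)

# Once losses are trended like this, the series can feed a BI application
# or a simple forecast of expected loss for the category.
print(trend(loss_events, "mortgage_fraud"))
# [(2006, 1200000.0), (2007, 2500000.0), (2008, 4100000.0)]
```

The point of the sketch is that risk calculation starts with disciplined record-keeping: only when every loss event is captured with a date, a category, and an amount can the totals be trended over time and embedded in decision-making.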
At the conference, we will explore the increased scrutiny that risk is getting and some of the best practices available in risk calculation, risk taxonomies, and forecasting solutions. We'll hear from customers with real use cases and experiences, as well as some vendors with exciting new solutions.
3. Organizationally, how do we govern the use of data assets and protect against risks? Data is unorganized Information, and Knowledge is information digested by a human being. Data itself can't be governed. It is inert until organized into information and transformed by a person into knowledge. A person can create data and information assets or put them at risk, so therefore only a person can be governed. Governance is a political process for organizing behavior to achieve certain goals.
Data Governance can be called other things, but the political organization can't succeed without x-organizational support. Just as we seek to create information assets by overcoming data stovepipes, so too do we need to overcome organizational stovepipes and link Business, Operations, and IT to achieve Data Governance goals.
Many organizations in the Data Governance Council have been successful in creating information assets, protecting them from risks, and organizing x-functional participation in Data Governance Councils. And they have achieved some stunning results.
Five years ago, Mohonk was the venue where I hosted our very first Data Governance event. Back then we organized three tracks to focus on Policy, Content, and Infrastructure questions. We had a lot of questions and ran each track as an interactive forum to frame common issues, understand the dimensions of Data Governance, and identify convergent areas our customers wanted to explore. We had long discussions about data supply chains, policies and rules, metadata and data classification, security and risk. The dialog was extremely interactive, and coming out of that meeting there were many who wanted to continue. That was the genesis for the IBM Data Governance Council.
We knew then that Data Governance would become an important field. Some early visionaries like Robert Garigue from Bell Canada, Christa Menke-Suedbeck from Deutsche Bank, Charlie Miller from Merrill Lynch, Ed Keck from Key Bank, and Richard Livesley from Bank of Montreal helped us all to see the dimensions of the emergent market. And it was those leaders who helped to shape the Data Governance Council Maturity Model, which in turn helped define the elements of the Data Governance marketplace.
Of course, what we couldn't see then is how failures in Data Governance would threaten the world economy itself. The Credit Crisis was caused by incremental policy failures at almost every stage of the mortgage data supply chain. Loose credit led to bad home loan underwriting decisions, which were masked by rising home values. Huge fees in MBS and CDO trading led to inside deals between credit rating agencies and banks, and vast amounts of poorly documented mortgages came to be regarded as Tier 1 assets on balance sheets around the world. These instruments were insured by complex derivatives traded without clearinghouses, creating interconnected obligations among the largest banks, with huge exposures should any one of them fail.
The media has focused on the wide segment of the funnel: the derivative market failure. Credit Default Swaps in this market had a notional exposure exceeding $100 trillion. But the failure ran through the whole supply chain; poor underwriting standards in loan origination from 2005 to 2008 continue to pollute banks with Toxic Assets, and the long tail of mortgage foreclosures haunts our economy. Our mortgage market remains heavily discredited around the world, and new Data Governance solutions are needed to restore investor confidence in the US Mortgage Market.
I've been working with a range of policy-makers and thought leaders on providing concrete solutions to those challenges, and I will host a round-table discussion on US Housing Data as a use case example on the value of data, the terrible risks that can still plague our economy from data pollution in that supply chain, and the concrete steps that can be taken now to address these issues.
I think this conference will be thought-provoking and practical. The market is looking for Data Governance solutions. Not just know-how, and not just software, but know-how and software and examples of how to apply them. That's what we'll do, and I hope you can join us. I think it will be the best Data Governance Conference ever. The venue is fantastic, the room rate unbelievable, and the conference fee a true bargain.
This agenda will continue to evolve, so come back often for updates.
Directions to Mohonk