On Saturday, I sat with an old friend at a secluded restaurant on a grassy river bank north of Bangkok. We are both actively engaged in the banking industry as observers, speakers, and peripheral participants. My friend has a more direct engagement with a Thai bank, but still as an adopted outsider. Lunch was excellent, and we sat on a wooden pier just feet from the river's edge as barges, trawlers, and all manner of ships slowly passed by with and against the current. A pair of large floor fans blew hot air our way, and an umbrella shaded us from the searing sun playing tag with the clouds above. The heat in Thailand is soft, enveloping, pervasive, and quietly oppressive. You have no hope of resisting its dictatorship. Somehow the Thais have developed an immunity to their own sweltering condition, whereas this Western visitor is deficient in that regard.
During lunch we compared current events in Thailand and abroad. Here, the Red Shirts have barricaded themselves behind sharpened bamboo poles and tires doused with gasoline. Their encampment was many miles from our lunch spot, and indeed encompasses but a small corner of the entire city of Bangkok. Yet their determination to resist the current government, which itself holds power only because of a similar incident involving a Yellow Shirt protest two years ago, has driven away Western tourists and continues to cause confusion and insecurity in the highest circles of Thai society. And we discussed the Credit Crisis, Greek Debt, US Politics, and Regulatory Reform.
On Greek Debt, we discussed how the former Greek government hid the massive debt it had accumulated from EU regulators (reporting a deficit of only 3.5% each year instead of the 12% it was actually accumulating), and how this massive amount came to light only with a change in government - when one group had an interest in reporting the bad data another group had an interest in hiding. Most today call this an act of Fraud, but it must also be admitted that the former Greek government was not alone in wanting its debt hidden. The Germans, French, Belgians, and perhaps even the European Central Bank had an interest in ignoring the reality of Greek economic underdevelopment and overextension.
The data about Greek debt was available. Greece can't borrow on the black market. Its debt has to be issued in bond markets, and the amounts, yields, and maturity dates are all public record. Bond markets are largely transparent. But Transparency creates its own information asymmetries. First, the availability of information doesn't mean everyone collects the same amounts, has the power to use it, or knows what it means. Second, there is a private sector deference to public sector data aggregation, analysis, and reporting, and the public sector relies on static information reporting programs that limit source authentication, audit, and repudiation. These two behaviors allowed the Greek government to report fraudulent deficit figures to the EU, and the EU didn't bother to verify that information against publicly available market data.
One could argue that the construction and expansion of the EU Common Currency without adequate audit powers created an environment ripe for fraud, but this is too easy. EU regulators could at any time have used data from bond markets to verify Greek debt. Why the EU didn't monitor the discrepancy between public reports and private market data has more to do with EU politics than Data Governance.
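To make the idea concrete, here is a minimal sketch of the kind of cross-check a regulator or investor could run: compare the deficit a government reports against the deficit implied by its net bond issuance, and flag the years that diverge. All figures, tolerances, and function names below are invented for illustration - this is not actual Greek or EU data, just the shape of the validation.

```python
# Hypothetical sketch: flag years where a reported deficit diverges from
# the deficit implied by net bond issuance. All figures are invented.

def implied_deficit_pct(net_issuance: float, gdp: float) -> float:
    """Deficit implied by net new borrowing, as a percent of GDP."""
    return 100.0 * net_issuance / gdp

def flag_discrepancies(reports, tolerance_pct=1.0):
    """Compare reported deficit (% GDP) against the bond-market-implied figure.

    Each report is a tuple: (year, reported deficit % GDP, net issuance, GDP).
    Returns the years whose implied deficit differs by more than the tolerance.
    """
    flags = []
    for year, reported, net_issuance, gdp in reports:
        implied = implied_deficit_pct(net_issuance, gdp)
        if abs(implied - reported) > tolerance_pct:
            flags.append((year, reported, round(implied, 1)))
    return flags

# Hypothetical filings: the second year's borrowing contradicts its report.
reports = [
    (2007, 3.5, 8.0, 240.0),   # implied ~3.3% - consistent with the report
    (2008, 3.5, 28.0, 240.0),  # implied ~11.7% - a glaring discrepancy
]
print(flag_discrepancies(reports))  # → [(2008, 3.5, 11.7)]
```

The point is not the arithmetic, which is trivial, but that the inputs are all public record: a validation like this requires no audit powers at all, only the will to run it.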
Every government is composed of politicians who owe their hold on power to public perception. Everyone in Europe played See No Evil, Hear No Evil, Speak No Evil on the subject of emerging market debt in the EU. The information was available. Net inflows of financing and debt accumulation can be gleaned by studying the bond markets. Public obligations in Greece are also no secret. Everyone in Europe knew that pension guarantees starting at age 50 were a ridiculous luxury in a country with such low productivity and wages.
Transparency and Reporting do not, in themselves, guarantee that anyone is using or validating information sources correctly. Every report needs to be validated with external sources, because Transparency is not the same as the Truth. If the EU wants to fix this structural problem in its own multi-nation confederation, it will need to create an independent auditor, like the US Government Accountability Office, whose role it is to audit member programs and reports, to discover waste and abuse.
All reported data must be verified. If we didn't learn this in the Mortgage Credit Crisis, now is the time to take it home in the Sovereign Credit Crisis.
Banks, Hedge Funds, and other investment institutions should not wait for the EU and other governments worldwide to get the audit role right. They should build their own Information Analytics programs to validate the assertions of governments as well as listed companies because what Greece did is not new. Fraud is a part of business.
Data Validation should be seen as an important part of Market and Credit Risk Measurement and Mitigation programs. This is where Data Governance and Risk Management intersect, and new technologies will be needed to make reporting aggregation and analysis easier and faster.
On the river in Bangkok, I asked my friend if his bank monitored the market and credit activities of its Thai competitors. It does not. The bank expects the government to collect data from every bank and aggregate and report it to the banking community, and it reads those reports. I would argue that the events of the last three years clearly demonstrate that governments are not well equipped to do primary market data analysis on behalf of themselves or any industry. They lack the technology infrastructure and the analytical skill to make intelligent use of the data the market already provides, and their political dependencies create natural conflicts of interest.
Businesses must perform their own due diligence to verify government reports and conduct primary market data analysis of every potential investment opportunity.
Unverified data should not be trusted. This is Data Governance Rule #1.
While academics contort over the rise of successful bank lobbying on Capitol Hill, Senator Jack Reed has introduced the Rating Accountability and Transparency Enhancement (RATE) Act of 2009, which "would provide new oversight and transparency rules for Credit Rating Agencies." This is a serious bill with excellent ideas that will do more to correct one area of abuse in the credit crisis than many other current proposals. Credit Rating should be transparent so that market participants can validate rating methods and the SEC can provide oversight and audit over problems and failures.
RATE includes further strengthening of existing regulatory structures, with new authorities provided to the SEC. But the important component here is new rating disclosure requirements, which would make the methods credit rating agencies use to rate bonds, MBS, CDOs, and other derivatives transparent and auditable. I also like the proposal for a new independent Compliance Officer, a power long overdue in ALL corporations.

SUMMARY: The Rating Accountability and Transparency Enhancement (RATE) Act of 2009 (http://reed.senate.gov/newsroom/details.cfm?id=313172) strengthens the Securities and Exchange Commission's (SEC) oversight of Nationally Recognized Statistical Rating Organizations (NRSROs) through enhanced disclosure and improved oversight of conflicts of interest, and makes credit rating firms more accountable through greater legal liability.

Accountability of NRSROs
• Makes NRSROs liable when it can be proved that they knowingly failed to review factual elements for determining a rating based on their methodology, or failed to reasonably verify that factual information.
• Directs the SEC to explore alternative means of NRSRO compensation, and requires a Government Accountability Office study on payment methods, in order to create incentives for greater accuracy.
• Establishes an office in the SEC to coordinate activities for regulating NRSROs.
• Requires the SEC to ensure that NRSRO methodologies follow internal NRSRO guidelines and requirements for accuracy and freedom from conflicts of interest.

Due Diligence Certification
• Requires certification if due diligence services are used, to ensure that appropriate and comprehensive information was received by the NRSRO for an accurate rating.
• Requires NRSROs to notify users when model or methodology changes occur that could impact the rating, and to apply the changes to the rating.
• Requires the SEC to establish a form for NRSROs to provide disclosures on ratings, including methodological assumptions, fees collected from the issuer, and factors that could change the rating.
• Requires NRSROs to provide rating performance information, such as information on the frequency of rating changes over time.

Conflicts of Interest
• Requires NRSROs to have an independent compliance officer to manage conflicts of interest and independently review policies and procedures governing ratings so they are free from conflicts.
• Requires the SEC to regularly review NRSRO conflict of interest guidelines.
• Establishes a look-back provision requiring that if an NRSRO employee later becomes employed by an issuer, the NRSRO must review any ratings that the employee participated in over the previous year to identify and remedy any conflicts of interest; and provides for SEC reviews of NRSRO look-back policies and their implementation.
I see this bill as another indication that financial regulatory reform will fix underlaps and gaps in existing authority rather than build a new systemic risk regulatory institution.
On February 27-29, I hosted the 15th meeting of the Data Governance Council at the Wales Hotel in New York City. Thirty-one people registered to attend this meeting, including 16 IBMers, representatives from JPMC, Bank of Tokyo/Mitsubishi, Bank of Montreal, Key Bank, State Street, MasterCard, and American Express, and attendees from OpenPages, Axentis, Varonis, and Vericept.
On the first day, we had excellent keynote presentations from Garrick Utley, President of the Levin Institute, and Will Pelgrin, Director of the NYS Cybercrime Taskforce. We also had some good roundtable discussions on common challenges in Data Governance related to Sub-prime, Basel II, and other issues. On the second day, we continued discussing common challenges and reviewed IBM Data Governance Solutions with regard to Policy and Process Management, Data Modeling and Development, MDM, Metadata, and Data Quality Management. On the last day, we left the agenda and had a long discussion on the future of the Council. Cal Braunstein rounded out the event with an excellent closing keynote on the risks to and from Data, and the risks to organizations from data we can't trust.
We spent a lot of time talking about Globalization and its effects on competition, regulation, cybercrime, and risk. Globalization is having a corrosive effect on trust in many organizations. Pressure from regulations requiring oversight and reporting of employee use of IT increases distrust at all levels. Cybercrime and the increasing financial value of data challenge everyone with offers and scams that make it hard to trust information. These factors are creating internal crises in trust and confidence. The manipulation and monitoring of information technology by people over other people threatens the quality and value of decision-making at a time when global competition brutally punishes bad decisions.
The Globalization of threats, risk, regulation, and competition will immediately force organizational decision-making inward, toward hierarchical models, even as the globalization of markets, labor, and resource allocation forces more horizontal changes in culture, lifestyle, and freedom.
This Council has existed for three years, and many members, by virtue of their participation, have achieved more mature levels of Data Governance. They have cross-organizational governance models, better transparency, and better decision-making. Many newer members are just now exploring organizational models, business vs. IT participation, the nature of Stewardship, and the complexities of overcoming organizational stovepipes.
Enclosed are my notes and observations from this landmark meeting:
1. Data Governance Market Maturity: Data Governance as a market is maturing from the Innovator phase, where a few leading companies worked together to blaze a trail for others to follow, to the Early Adopter phase. We are clearly seeing some leading companies succeed with Data Governance, thanks in part to the Data Governance Maturity Model, and many more are now coming into this market looking to build on the success and experience of the innovators.
For those of us who were the pioneers, this is a time of change, and we must adapt to a new market constituency requiring education and solutions, with somewhat less tolerance for discovery and invention. The Data Governance Starter's Guide should be updated as an educational onboarding tutorial for new companies seeking Data Governance success. For vendors, this is a time to study solution packaging and focus on the support needs of the stewardship community. Stewardship is a profession still in its infancy, and it requires practitioner tools, education, and community forums to exchange practices and success stories.
We should all be proud that our contributions have moved the market to this new phase, and the Council needs to change to grow with the Market.
2. IBM Data Governance Solutions: IBM has come a long way in its Data Governance Solution capabilities since 2006, which was the last time we had a major showcase of technologies on the Council Agenda. Most of our solutions - Compliance Warehouse, Integrated Data Management, MDM and Industry Models, Data Quality and Metadata tools - were very well received. But this Council has succeeded exactly because it is not a normal IBM Customer Advisory Board, where normal meetings are dominated by IBM solution exhibitions. Rather, it has succeeded as a unique forum for practitioner exchanges, and it must remain this way to continue.
Future meetings will be shorter, practitioner driven, and IBM will find additional venues to present Data Governance solutions.
3. Globalization: At Mohonk in 2004, at the inaugural Data Governance Summit, I presented some ideas about how information technology would transform the modern corporation, and how integral Data Governance would be to that process. I was heavily influenced by Tom Malone and his book The Future of Work, and also by the history of industrial regulation at the dawn of the 20th Century.
In NY, we re-examined some of these topics through presentations from Garrick Utley, Will Pelgrin, and Cal Braunstein, and I think we need to continue examining how the global pressures on information technology, regulation, cybercrime, risk, and transparency will impact Data Governance and organizational behavior. Many companies that have embraced Data Governance have stopped short of embracing cross-organizational governance bodies with real authority. Most don't know which models to follow, which examples of success to emulate, or how such governance should work.
In my travels I've seen many governance models in corporate and national entities that offer some hope to modern organizations, and I think we ought to be the Council that inventories these models, compares their pros and cons, and presents alternatives to hierarchical organization.
4. Data Risk Standards: In the Shang Dynasty in China, rulers practiced risk-based decision making by consulting an Oracle, who dropped an oxen hip bone on the floor and deciphered the direction of the crack in the bone as indicative of divine truth. If the crack pointed up, you had good favor for your decision; if down, well, you had better ask again. People consulted the Oracle on every kind of decision - dental surgery, marital options, taxation, or war - and they would drop six to nine ox bones and average the results, thinking that more data would provide more accurate results. Every question to the Oracle was journalized, and outcomes were constantly compared to the ox-bone forecasts. Records of these inquiries survive today, providing the oldest known risk forecasting models. Three thousand years ago, this was the first form of risk-based decision making, and while it may seem primitive to us, it was at least systematic, which is more than we can say about ERM practices today.
Enterprise Risk Management today is still a voodoo art practiced by a secret society of Risk Managers in a language few understand. It is expensive, bespoke, non-standard, and under-utilized. Market, Credit, and Operational Risk consequences are not understood by the vast majority of employees who make enterprise decisions, because none of them have access today to even oxen bones, let alone risk-based forecasting models that allow decision makers to compare options, forecast outcomes, and compare results to the forecasts.
To get to that state, where ERM is a common discipline that every employee can use for enlightened decision-making, new Data Risk standards are needed to make ERM simpler, cheaper, and more systematically repeatable - and that is another contribution this Council can make. We will next meet on June 26th at the Federal Reserve in Washington, DC, to explore that opportunity in depth.
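The discipline the Oracle's journal embodied - record every forecast, record every outcome, compare the two - can be sketched in a few lines. Here is a minimal, hypothetical illustration (the decisions, probabilities, and function names are all invented) of a forecast journal scored with a Brier score, a standard measure of probabilistic forecast accuracy:

```python
# Minimal sketch of a forecast journal: record probabilistic forecasts,
# record outcomes, and score calibration with a Brier score (0 = perfect).
# All entries below are hypothetical illustrations.

forecasts = []  # each entry: (decision, predicted probability of success, outcome)

def record(decision: str, p_success: float, succeeded: bool):
    """Journal one forecast alongside its eventual outcome."""
    forecasts.append((decision, p_success, succeeded))

def brier_score(entries):
    """Mean squared error between forecast probability and realized outcome."""
    return sum((p - (1.0 if ok else 0.0)) ** 2 for _, p, ok in entries) / len(entries)

record("launch product in Q3", 0.8, True)
record("vendor meets SLA", 0.9, False)
record("credit line repaid on time", 0.7, True)
print(round(brier_score(forecasts), 3))  # → 0.313
```

Nothing here requires a secret society: the hard part is organizational, getting every decision maker to write the forecast down before the outcome arrives, and that is precisely where shared Data Risk standards would help.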
What was evident at this meeting is that Data Governance challenges have changed in three years. We are still at the cusp of changes in the way modern, post-industrial, organizations are governed. Even the most mature members of the Data Governance Council have not substantially changed the way their organizations perform decision-making. It is still top-down, barely delegated, with little or no trust extending from the top to the bottom of an organization. Many governance bodies or teams have little or no direct decision-making authority - neither funding mandates nor project veto powers. The light of information still shines brightest from the bottom-up, with those at the top getting the best view of the light and those at the bottom simply blinded by it.
We need new models of organizational governance, new data standards in ERM, and renewed investment in risk-based decision making at all enterprise levels. This remains the challenge of Data Governance in the early adopter market evolution.
It is that simple.
You have supply chains that deliver toys from manufacturers in China to sit under Christmas trees in Canada, oil and gas from Russia to factories and homes in Germany, and diamonds from mines in Namibia to jewelers in New York.
Real World supply chains keep the global Industrial Economy running.
Alongside, you have Information Supply Chains that deliver crop yields to traders on the Chicago Mercantile Exchange, raw video footage from journalists in Afghanistan to news desks in London, Paris, and Atlanta, and sales performance reports from branch offices in Omaha to main offices in Arkansas.
Around the world, Information Supply Chains drive the Knowledge Economy.
They need to be Smart - Instrumented, Monitored, Measured, and Coordinated. And we need to be aware of how they are designed, what flows through them, and how we can improve them.
Without awareness, Governance itself can never be very Smart. It is that simple.
The two most historically important developments of the last two decades are the growth of global markets and the speed of information technology development. Markets and IT are transforming the world at a faster rate than any other developments in human history. And they are also challenging Governance Models in ways that are equally profound. Kings and Crowds fought it out politically at the dawn of the 20th Century when the ancient Russian, Chinese, Austrian, and Ottoman empires fell, and they are battling commercially today in many markets in which Crowds are winning again.
1. Product Development
I'm an audiophile. I buy expensive audio equipment in the hope of reproducing in my home the emotional connection with music that people feel when they attend a live concert. Being on a limited budget, I'm also a cheap audiophile. I like the best product for the lowest cost, which is one reason I applaud globalization. Over the last decade, high quality, low cost audiophile equipment has been coming out of China that rivals the best high cost gear manufactured in North America and Europe. Some companies have set up local design and Chinese manufacturing with online distribution that brings incredible bargains to mainstream US and European consumers. Two such companies are Oppo Digital and Emotiva.
Oppo makes DVD and Blu-ray players that are designed in San Francisco and manufactured in China. I've owned their products for several years and am always impressed with their price/performance ratio. But now I'm even more impressed with their product development process. In 2008, they announced the development of a new Blu-ray player, the BDP-83. These days, consumer electronics are more like computers than audio equipment, with complex Digital Signal Processors, graphics chips, and CPUs interacting in intricate designs. Oppo knew product development would be difficult and testing even more so. With the complexity of hardware and software in one appliance, it is really difficult for a small team of product designers and marketing professionals in San Francisco to test against every potential usage scenario. And when manufacturing is outsourced to China it is even harder. Distance, language, and culture create barriers that make communication a new challenge.
In this environment, Oppo decided to outsource product testing to its customers by using a Crowdsourcing solution. Several hundred customers received pre-production units of the Blu-ray player and tested it in their homes. Their product feedback went to the design team, who translated it into design changes for the manufacturer. The Crowd was given the option to vote on final product readiness. The first vote sent the product back for more changes and fixes in late 2008, and the second vote, in Spring 2009, released it for GA in June.
I bought the product in July 2009 and it is superb. I contrast this to Emotiva, which is also a small design team, based in Tennessee, that manufactures in China. They make outstanding AV amplifiers, speakers, processors, and other equipment. In 2007, Emotiva announced a new AV processor, the UMC-1, for delivery in 2008. That slipped to early 2009, when it was announced that the product would ship in June. In July, the company announced it had discovered bugs in the production units from China and would need a couple of months to fix them. By October, more than a couple of months had gone by, and customers were fuming on the company's forums about the delays and the poor communication. In November, the company announced it would begin shipping to the pre-order list, and many customers anticipated units before Thanksgiving. By early December, no units had shipped and the company had to start censoring its Forum because customer rants were getting abusive. The Emotiva CEO promised some customers would receive their units by Christmas, and when that didn't materialize many Forum members started talking about buying alternatives.
Last week, Emotiva finally began shipping a handful of units to pre-order customers without manuals. The first reviews appeared over the weekend and talked about stunning video quality but also a few audio and connectivity glitches. The CEO posted a very nice note on the Forum describing the company's pride in the product but also that a firmware release would soon be forthcoming.
So what this company did was use its customers for an unannounced Beta Testing program. They shipped their product very late to market, after a year of inconsistent market communication, with bugs they were probably aware of but couldn't fix without suffering more brand damage.
Contrast the two companies. Oppo used a market based Crowdsourcing mechanism to recruit customers to beta test the new product. The customers who participated in the testing provided open feedback which was visible to all members of the company forum. They fixed bugs quickly and used customers to determine when the product was ready for shipment. That process created customer loyalty and ensured a bug-free product that shipped only six months late. Emotiva used a hierarchical mechanism of in-house testing and opaque customer communication to ship a product more than 18 months late and filled with bugs that alienated customers and reduced brand loyalty.
Some people might say these companies simply have different approaches to product development or customer service. I abstract these situations as examples of governance models in complex social systems. Oppo used a market-based governance (coordination and cooperation) model and succeeded in satisfying the needs and interests of its market participants. Needs and wants are at the primary market level. Feedback information about the product is at a secondary market level. Emotiva used a hierarchical governance model (command and control) and failed to satisfy secondary market interests in information and primary market needs for products.
This doesn't mean that market mechanisms always trump hierarchical control. But when a small number of people are trying to govern complex systems for consistent outcomes, a market-based model can be more efficient and produce better results.
2. Cost Containment.
My boss sent me a note over the weekend reminding me to use our AT&T Calling Card from land lines when I am travelling abroad. It seems my cell phone bill in November was higher than the accounting police think necessary. All calls above $100 qualify for an immediate audit. It's not clear from my bill whether any of my calls were or could be audited, but my boss, who is altogether a terrific guy, wants me to avoid that root canal and work smart abroad. Being a Governance Guy, I do have to question the intelligence of a governance system that controls costs through managerial oversight of cell phone bills and automatic audits for $100 calls.
If there is already a trigger for an automatic audit at the $100-per-call threshold, then someone has already noticed a pattern of calls that exceed $100. That kind of pattern calls for a policy change, but automatic audits require a fair degree of manual labor - both from my boss and from the auditors. Wouldn't it be far Smarter to develop policies that cause the cell phone users themselves to police their own usage by giving them alternative means to reduce costs?
Some might argue that the warning note from my boss is a policy tool being used to change my behavior. But because the billing system is deliberately opaque in IBM, it isn't possible for me to evaluate the impact of each of my calls on the overall phone bill I incur each month. I can't see the incremental impact of my decisions as Risks to The System as a whole.
A more intelligent approach to cost containment in this case would be to toss the issue out to the Crowd of cell phone users in IBM and get them to come up with ideas to mitigate costs for each user. That process would include users in the decision-making process, getting them to brainstorm ways to reduce costs instead of treating them like cost creators in a hierarchical model to impose control.
For instance, shouldn't IBM have a Skype strategy for global travelers who make calls in cars and trains, so that productivity isn't imperiled while costs are contained?
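What would that self-policing look like? Here is a minimal sketch of the idea: instead of an opaque bill and a post-hoc audit trigger, give each caller immediate feedback on their running monthly total as it approaches a budget. The budget, thresholds, and messages are entirely invented for illustration - this is a sketch of a policy mechanism, not anyone's actual billing system.

```python
# Sketch of self-service cost feedback: rather than auditing calls after
# the fact, warn the caller as their running monthly total nears a budget.
# Budget, warning fraction, and call costs are invented for illustration.

MONTHLY_BUDGET = 300.0   # hypothetical per-user monthly allowance in dollars
WARN_FRACTION = 0.8      # start warning at 80% of budget

def add_call(running_total: float, call_cost: float):
    """Return the new running total plus an advisory the user sees immediately."""
    total = running_total + call_cost
    if total > MONTHLY_BUDGET:
        note = "over budget: use the calling card or Skype for remaining calls"
    elif total > WARN_FRACTION * MONTHLY_BUDGET:
        note = "approaching budget: consider cheaper alternatives"
    else:
        note = "within budget"
    return total, note

# A month of hypothetical calls, with feedback arriving call by call.
total = 0.0
for cost in (120.0, 90.0, 60.0, 80.0):
    total, note = add_call(total, cost)
    print(f"${total:.2f}: {note}")
```

The design choice is the point: the caller sees the incremental impact of each decision on The System while there is still time to change behavior, which is exactly what an after-the-fact audit cannot do.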
Crowds and Kings. What do you think? Post a comment and let me know.
Please join us for an international crowdsourcing experience!
In May 2006, the IBM Data Governance Council used poster board and sticky notes in an oak-paneled room in the Chateau Frontenac in Quebec City to create the categories, elements, and levels in the first version of the Maturity Model. About 35 people participated in that process in Quebec, and perhaps another 50 more in subsequent meetings.
On September 14-16 2010, the Council will use social networking crowdsourcing technology to include a global community in a discussion about the Maturity Model - Live!
Suggestions and comments from practitioners all around the world will be relayed to the participants in the room.
Of course, this venue is awesome, and there is no substitute for live, face-to-face communication. But if you can't travel to Tamaya and spend three fabulous days with The Council in the Desert, you can still tune into the action by going to infogovcommunity.com.
In the room or in Rangoon, you can watch the ideas flow and chime in live or tune in later and add your views.
Either way, what you contribute will impact the community and change the Maturity Model. Synchronous or Asynchronous, this meeting is the beginning of a global dialog on Data Governance Maturity.
What we do in the room will make a difference. And what you contribute from your own room will make a difference.
Please join us in Tamaya or online at www.infogovcommunity.com to capture the best ideas from the Global Information Governance Community, contributed for the Community and published in an open-sourced IBM Data Governance Council Maturity Model.
This is how we innovate!
Steven B. Adler
IBM Data Governance Council
Two years ago, I met Helmut Willke, the author of Smart Governance: Governing the Global Knowledge Society, at a hotel cafe near the great cathedral of Cologne. Professor Willke is a sociologist who teaches Global Governance at the Zeppelin University in Friedrichshafen, Germany. Late in 2009 I became interested in Governance as a system of decision-making, and Professor Willke had written an excellent book exploring this topic. While the Professor is German, he writes extremely well in English, and his book is insightful and very well written. Like a lot of philosophical texts, it is not an easy read. Dense descriptions, long sentences, and theory backed by ample example make it a book you have to read at least twice to fully comprehend.
I was in Cologne in late February 2010 to meet the CIO of the city and attend Rosenmontag at City Hall. I had already seen several days of Karneval, with the endless parades, costumes, and candy strewn about the streets. For five or six days in February, the staid and reserved city of Cologne becomes an endless drunken party, attracting visitors from all over the world who wear outrageous costumes, march in parades on incredible floats, and throw candy to the bystanders. It's unlike any parade I have ever seen. Quite amazing.
It had snowed a lot that year. It was white from Brussels to Berlin, and Cologne was still covered by eight inches. The square in front of the Dom was clear, and I had spent the morning before our meeting visiting the Roman museum across the square. Cologne is an ancient Roman city, and the ruins are collected in a fantastic museum right next to the Dom. Of course there are columns and pediments, but also beautiful mosaic floors, jewellery, stained glass, and decorative arts. There is a model of the Roman city, and you can see how the Germans built the city on the same street grid, with walls built on top of the Roman walls. Of course, much of this was destroyed by allied bombs in WWII, but some remnants remain.
Looking back at Roman colonial rule of Cologne was an excellent introduction to the systemic ideas of Governance that Professor Willke and I discussed over coffee that afternoon. He is not a tall man - mostly grey, late 50s I would say, with bright blue eyes. He makes an immediate impression, and is passionate about his book. I had used the book as the text for a class I taught at the Bucerius Law School in Hamburg on Data Governance that January. My students did not entirely appreciate the dense prose and abstract ideas, but through class conversation we did ultimately appreciate the idea that Governance is a system of decision-making that could be described and modelled. And we used Social Networking metaphors to explore the idea of policy-making, human behaviours in a system of Governance, and how to model potential outcomes. Of course there is political science, which describes political models of Governance - Democracy, Dictatorship, Monarchy, etc. - but what is unique and important about Professor Willke's book is the application of systems theory to Governance.
We had some coffee and talked mostly about how the Professor wrote
the book and why. As I had in 2007-8, the Professor had used the Global
Credit Crisis as a use case to describe failures in Governance. I had
covered this topic from a Data Governance perspective, arguing that
hundreds of incremental failures in business processes and data quality
had produced a domino effect that plunged the global economy into
Depression. He covered the topic from a decision-making perspective,
and while we approached this topic from different directions we arrived
at similar conclusions – policy-makers can’t possibly make the best
decisions without understanding the consequences of those decisions on
incredibly complex and interconnected global systems. And those
consequences are impossible to understand without new information
systems that render the complexity with software and illustrate how the
policies will be accepted and resisted.
In my class at Bucerius, my students complained that the Professor
had not done enough to provide solutions to the problems he had
identified, or that his solutions were too abstract. I presented these
criticisms to him at our meeting and he responded that it was not
possible to offer concrete solutions because every systemic problem
needs to be modelled to understand the variables and outcomes – that
there is no one size fits all. At the time, I thought this was a
dodge. It took me a few more years to understand that he was right.
There are no Governance Solutions that can auto-magically produce the
best outcomes for every decision. But it is possible for policy-makers
to use systems theory and software to construct decision-making models
that can plot many of the actors, objects, variables, and potential
outcomes to understand the impact of policies on complex systems made up
of hundreds, thousands, and even millions of human beings with unique
interests and behaviours.
After my course, I synthesised concepts from the book with ideas from my students to create the Six Steps to Smart Governance.
It’s not meant to be a Framework. Frameworks and models are nice tools
to help people feel more secure about challenges they seek to overcome,
but they are not useful in making better decisions. The Six Steps are
meant to be a structure for decision-making that one would apply
iteratively; in which each of the six steps would involve different data
points and variables. Of course, it is highly summarised, flavoured
with marketing. And I would say in hindsight, it's not really useful as a
practical or operational tool. It’s really just a theory, a
simplification of the better documented ideas Professor Willke writes
about in his book.
And I think we can do better. In the IBM Data Governance Council we
will soon begin to explore dynamic simulation models that go far beyond
the Six Steps to Smart Governance, and I recommend reading both the white paper and Professor Willke’s book:
Smart Governance: Governing the Global Knowledge Society
Today, thanks to really powerful simulation software, we can create
dynamic models that help demonstrate the impact of policy on people,
processes, and technology. The Data Governance Simulation Project will
revolutionise the field of Data Governance by applying theory, software,
and observed practices to an interactive model that will yield powerful
insights into Data Governance Value Creation and Risk Mitigation.
A lot of people ask me, “how do I show the value of metadata?” Some
say, “how do I make the business case for Data Governance?” Consultants
and Gurus will have a framework or process to offer you, a get started
guide with use-case examples, graphics, and legends about their
successes. But these myths won’t help you, because your challenges are
unique. Your politics are special, and your people are not machines.
Best practices are useful examples of glorified solutions that are very
hard to replicate outside the lab. And as many are already finding out,
people resist policies they don't think apply to them and it's really
tricky to understand how to change organisational behaviours on an
on-going basis without policies that dynamically change with new information.
Data Governance is, by nature, a systemic challenge and you can’t
solve systemic problems without systemic solutions. Projects and teams
that expect quick hits and 90-day results are the reason you have systemic
Data Governance problems in the first place. But it is possible to
create software models that allow you to plot the goals, metrics,
policies, communications, outcomes, variables, and modifiers and
evaluate the impact of new policies and controls on your environment.
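As a rough illustration of what such a software model might look like, here is a minimal, hypothetical agent-based sketch (all names and parameters are invented for illustration, not taken from any IBM tool): each round, agents who have not yet adopted a policy do so with a probability that rises as more of their peers have adopted it, giving a crude way to plot how a policy might be accepted or resisted over time.

```python
import random

def simulate_policy_adoption(n_agents=1000, rounds=10,
                             base_adoption=0.2, peer_influence=0.5, seed=42):
    """Toy model: each round, a non-adopting agent adopts the policy with a
    probability that rises with the fraction of peers who already adopted.
    All parameters are hypothetical and would need calibration against
    observed organisational behaviour."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(rounds):
        peer_fraction = sum(adopted) / n_agents
        p = min(base_adoption + peer_influence * peer_fraction, 1.0)
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p:
                adopted[i] = True
        # record the adoption rate after this round of communication
        history.append(sum(adopted) / n_agents)
    return history
```

A real simulation would add many more variables (incentives, communication channels, resistance), but even this sketch shows how a policy-maker could compare outcomes under different assumptions before deploying a policy.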
And that’s the lesson of Smart Governance: you can model complex
environments through Simulation and make better decisions. To learn
more about using Simulations to make better decisions, take a look at
the IBM Smarter Cities Demo.
In that demo, the complex interactions of human beings living in a city
are compared to the goals of human policies, the metrics measured by
interactions, and potential outcomes.
Many of our organisations are as complex as small cities. Policy and
Politics share the same ancient Greek root word – polis. The polis is a
city, which itself is an aggregation of human beings who require
Governance to arbitrate their diverse interests and achieve better
outcomes for all. Today, we can simulate those interactions and help
Policy makers profile the impact of their policies before they are
deployed. It's a kind of Visual Risk Calculation.
If you would like to participate in the Data Governance Simulation
project, please read the Six Steps to Smart Governance White Paper, the book
by Professor Willke, and join the IBM Data Governance Council by executing this membership agreement.
Only members of the Council will be able to participate in this
exercise and you don’t want to miss this because it will fundamentally
change Data Governance.
This morning, General Motors announced that it would no longer advertise its cars on Facebook. This announcement comes a day before the Facebook IPO, and casts a shadow on the business model of Facebook. GM said that they will continue to support their page and user community on Facebook, but that ads just weren't effective in helping consumers to make car buying decisions. Ford jumped on this announcement to say they would continue to buy ads on Facebook and that Social Media requires a consistent commitment to innovation and community development.
Maybe. But I think GM's decision does illustrate a key problem for Facebook and Twitter - the revenue model. Social Media grew up without dependencies on ad-based revenue. On Facebook, you aren't a customer. You are a product, and it's your likes, dislikes, friends, photos, videos, and content that generate value. Selling products to products via advertising is hard. Members don't use Social Media to go shopping. There's no commerce platform there. They use it to be social. There are so many other outlets that are more effective for advertising than Social Media.
So how should Facebook and Twitter make money? My idea: make it collective. The value is in the data.
1. Make terms and conditions explicit that every member owns their own data via copyright. This does two positive things.
A. It indemnifies Facebook and Twitter for the crazy, infringing, and potentially libelous posts of their members by allowing them to claim that they are conduits of content rather than publishers or distributors.
B. Copyright establishes the rights to royalties for content created and posted on their networks, which enables the next step.
2. Allow members to opt-in to Big Data analysis by Social Media partners and intermediaries.
3. Charge Social Media for Big Data Searches by data volume.
4. Pay members royalties every time their data is used in Big Data Searches.
This simple model creates powerful incentives that transform user members from products into mutual social network content providers with an economic interest in posting content that will be used in Big Data searches. It establishes data property rights that insulate Facebook and Twitter from vouching for the content on their networks. Members will also discover that providing high quality data that companies want to search for means more royalties and so the system will produce better behaviors. And it creates a 2-tier royalty distribution model that will also pay Facebook and Twitter handsome revenue that will change online advertising and make every other content aggregator change too.
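A minimal sketch of the two-tier royalty split described above, assuming a hypothetical flat platform share (the function name and the 30% figure are my own illustrations, not anything Facebook or Twitter has adopted): the platform takes its cut of each search charge, and the remainder is paid out to members in proportion to how often their content was used in the search.

```python
def settle_royalties(search_charge, usage_by_member, platform_share=0.30):
    """Split a Big Data search charge between the platform and the members
    whose content was used, proportional to usage counts.
    platform_share (30% here) is an invented, illustrative parameter."""
    platform_cut = search_charge * platform_share
    member_pool = search_charge - platform_cut
    total_usage = sum(usage_by_member.values())
    # each member is paid from the pool in proportion to their usage
    payouts = {member: member_pool * (count / total_usage)
               for member, count in usage_by_member.items()} if total_usage else {}
    return platform_cut, payouts
```

For example, a $1,000 search charge where one member's content was used three times as often as another's would pay the platform $300 and split the remaining $700 in a 3:1 ratio.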
Of course, Facebook and Twitter will have to sort out who's a person and who's a bot, and will have to provide content creation tutorials to help users/customers create content that has value by sharing the top 100 Big Data queries and sample results.
But this Business Model has something for everyone and is a true win:win. It benefits customers by establishing data property rights and royalties for content. It benefits organizations who want to do Big Data searches by providing ever richer data streams of high quality and availability. And it benefits Facebook, Twitter, and their investors by providing an enormous profit making engine selling Data.
The Data is the Value. The more there is, the more valuable it becomes. Pay your customers to create higher quality data and charge your partners to use it. It's a simple Business Model.
Dick Costolo - @dickc - and Mark Zuckerberg - @finkd - are you listening?
Data Governance Programs are popping up all over the globe. It isn't hard to get one started anymore. But it is hard to be good at it and to make it last. In fact, I see more programs taking one step forward and two steps back – narrowing focus to demonstrate results – to fall in line with other IT projects than chart a clear path towards larger transformation.
But let's be clear – Data Governance is about Business Transformation. We can't change organizational behavior to take data seriously if we can't change how we work.
We in the Data Governance Council have a vision that Data Governance is a coordination of people collaborating on common goals and purposes – to use data as an asset. That vision requires that piecemeal project management of data issues must evolve into systemic governance structures and methods, whose goals and purposes themselves transcend the people, applications, and interactions.
Until last year, we didn't fully know how to close the gap between where we are today and where we'd all like to go. But today we see the way forward, and the Data Governance Council is embarking on a bold new program to develop Predictive Governance: systemic ways of describing our world and modeling potential interactions to understand what works and how to improve it.
Traditional scientific analysis says that to understand a problem you have to take apart the issue and decompose it into all its components and sub-components and find the root cause.
But this assumes there is always just one root cause and one thing to blame:
“Data Quality in our branch operations is atrocious, so we have to fix our incentive structure.”
“Our network was hacked and our customer data was exposed, so fire the CISO.”
It's almost irresistible to search for scapegoats to common problems using simple cause and effect analysis.
People rarely ever imagine that individual data quality problems are
symptomatic of larger systemic challenges in the information supply
chains we have created over decades to handle information flows from
source to target, and no CEO expects that network hacks are the result
of systemic weaknesses in IT systems that are themselves a reflection
of organizational culture and priorities.
It's hard to accept that people created the systems that enable Poor Data Quality, Global Jurisdictional Jungles, Metadata misunderstanding, Lax Security, Privacy Invasions, and Big Data Mischief. No one deliberately creates these problems. No one wants them to continue. But they do continue nonetheless because people really don't understand the elements and interdependencies of the systems they have created.
The point of Predictive Governance is that we work in large ecosystems and we must work to understand them. If we can't describe our ecosystems, we can't rise above the superstitions and organizational behaviors that constantly hold us back.
This event will explore the ideas and methods behind Predictive Governance, new Enterprise Data Governance Solutions that integrate multiple business and IT domains, and Internet Jurisdiction and Multi-Stakeholder Governance in the context of global regulatory confusion as an archetype of Predictive Governance Challenges.
These are big problems and we are working on big solutions.
See the agenda. Read our blogs. Understand our mission. Be prepared to interact.
This is a thought leadership forum for change. Join us and make a difference.
This event is open to all who wish to join the IBM Data Governance Council. Register to attend here: http://dgcouncil.eventbrite.com/
On Tuesday, I gave a keynote presentation at SIMposium 2010 in Atlanta, Georgia. It was on the last day of a conference at 8:15am. On the best of days, I'm not a great morning person. The last day of a conference is not normally the best of days for a presentation. Normally, at least half the participants are in taxis on the way to the airport and the other half are often exhausted from the content and discussions on the earlier days. When I was first asked to speak, I was not inclined to do it. Keynote or not, 8:15 on the last day felt like a bad proposition.
I could not have been more wrong. First, the room, and it was a huge ballroom, was full with about 300 people. Second, they were awake, animated, and fantastic to talk to. We had a great conversation together, and I completely enjoyed the interaction.
Third, they were not the normal Data Governance crowd. In fact, when I asked how many had Data Governance programs at the start of my presentation not one hand went up. This is the kind of group I love talking to and they are the ones we most need to reach.
SIMposium, thank you for an excellent experience. Many have since requested my presentation and here it is in Flash format. Just click on the link below and it will launch in your browser.
SIMposium 2010: Change is Not Just a Word
I'm writing this blog entry in my hotel room on the 14th floor of the Grand Hyatt in Jakarta, Indonesia. Traffic screams by the massive fountain circle outside in a constant torrent of horns. I've been here all of two days. Met a customer in town this morning, and yesterday we drove three hours to meet a customer in Western Java. I've seen rice paddies, jungle, mountains, tea plantations, small villages and ways of life unchanged for centuries, glittering shopping malls with every brand available, fantastic office towers, and levels of luxury unembarrassed by poverty in every street. It is at once fascinatingly familiar and different at every corner.
This year, I've visited customers in Jakarta, Manila, Tampa, Columbus, Johannesburg, Dallas, Hamburg, Warsaw, San Francisco, New York, Brussels, and Cologne. And everywhere I go I hear the same stories, the same issues, the same needs.
Data Governance is a global market. Everyone is doing it.
Tomorrow I fly to Bangkok, where Red Shirts have held a government hostage for six weeks. On the edge of a knife, a nation split Red and Yellow, and I'm hosting a Data Governance Workshop for 2 dozen customers.
The market need is hotter than Red.
If your company doesn't have a program working today, it's a competitive disadvantage.
Don't wait. Just do it.
Data=Information=Knowledge. Or so we would like to say. In theory, data is unorganized information, and knowledge is information put to use by human beings. But theories are for academics. And this theory is super convenient if semantic consistency is important. There are Data Architects who only think about data in databases, Information and Content Architects who only work with unstructured repositories, and even Knowledge Architects who I suppose work with information taken out of human brains and put into... structured or unstructured repositories on computers...
In real life, in real companies, these are artificial distinctions. Organizations want to control data/information supply chains because they are full of quality control problems, security vulnerabilities, compliance challenges, and operational exposures. Those risks imperil decision-making, increase operational costs, and reduce revenue opportunities. Quality control and risk mitigation are challenges for every data type.
Five years ago, "Data Governance" seemed like a great name for a new discipline to help transform organizational behavior from vertical to horizontal; because information is transformational. What we meant then and mean now is not just about "Data" in the purest structured sense. We mean Data in the most plural and unlimited sense. People want to govern other people's use of all kinds of information in every form.
No data stovepipes please! We need Data Governance Solutions for all human uses of information regardless of their form or structure, use or abuse.
Anyone who tells you different is just so 20th Century...
Frameworks freeze you in the past, by forcing you to interpret the present based on rigid formulas, interpretations, and even misconstructions. In 2007, the IBM Data Governance Council finished its Data Governance Maturity Model. Looking at all its imitations in the market, one could conclude that it has been remarkably successful.
However, as a benchmark of relative organizational maturity - and not just data management processes - I think its time has passed and I'm working on new ideas.
Over the years, the IBM Data Governance Council has had many international meetings:
- 2005 - Kronborg Castle, Helsingør, Denmark
- 2006 - Chateau Frontenac, Quebec City, Canada
- - Bucerius Law School, Hamburg, Germany
- - Hotel de Ville, Paris, France
- 2007 - Isola di Giorgio Maggiore, Venice, Italy
- 2008 - Kuala Lumpur, Malaysia
- - Zappeion Palace, Athens, Greece
- 2009 - L'Hermitage, Franschhoek, South Africa
On January 21-22, 2010, the IBM Data Governance Council will be starting a chapter in Poland by meeting in Warsaw.
Around the world, Data Governance is in hot demand.
Amazon has some Information Governance problems.
A week ago, I placed a large order of Nerf Guns that Amazon keeps refusing to process. My kids love these things and I guess some adults I know kind of like them too. We're all heading out to my sister's house in Point Reyes for Christmas this year and a combined Family Reunion. Both my sisters will be there with 7 kids in a medium-sized house for four days and the best we could all come up with to keep them occupied was felt-warfare among the tall grasses of the Inverness wetlands.
If only Amazon would cooperate.
I have no desire to carry ten Nerf weapons on trans-continental jets. I can see explaining to turgid DHS officials why a family of four needs automatic Nerf cannons with heat-seeking velcro missiles. So, I prefer to order them online and let Fedex make the arms shipments discreetly.
But my order is stuck in Amazon credit card limbo. It seems that the last time I bought something and shipped it to my sister instead of my home address I used a credit card which expired in May. Problem is, Amazon somehow associates that credit card with my sister's mailing address. I've deleted it in my online account, and I buy things from them all the time with the current card, but Amazon hasn't purged this relationship.
From an Information Governance perspective, what kind of problem is this? It is of course a Data Quality issue, but normal DQ tools might have a hard time with rules matching in this case. My gut is that Amazon just doesn't sweep and purge their accounts for outdated credit cards. It's pretty frustrating as a consumer, especially during these busy days. Some records management would solve that problem, but by now the point is moot for me. I just don't have the time or patience to bother fixing their sloppy Information Governance issues.
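The kind of sweep I have in mind could be as simple as the following hypothetical rule (the field names are invented; a real records-management job would also have to purge the stale card-to-address associations, which is exactly the link Amazon failed to break in my case):

```python
from datetime import date

def purge_expired_cards(cards, today=None):
    """Drop stored card records whose expiry month has passed.
    Each card is a dict with hypothetical 'exp_year'/'exp_month' fields;
    a card is valid through the last day of its expiry month."""
    today = today or date.today()

    def is_expired(card):
        # (year, month) tuple comparison: expired if strictly before this month
        return (card["exp_year"], card["exp_month"]) < (today.year, today.month)

    return [card for card in cards if not is_expired(card)]
```

Run periodically against the account store, a rule like this would have removed my May-expired card long before it could block a December order.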
Fortunately, Walmart sells Nerf Guns too...
Since the 18th Century, Freedom of Expression has become enshrined in constitutions around the world as a Basic Human Right. It defines Democracy in its defense and Dictatorships in its assault. People like to control and don't like to be controlled, and the tension between controlling and being controlled requires this Human Right to be defended and re-defined every year. Sometimes, like during the McCarthy Era in the United States, the tide turns against Freedom. Other times, like in the Middle East today, the Freedom to speak changes the course of history.
But there is another Freedom not yet defended as a universal Human Right that should be and it is the Freedom of Information - the right to be informed, to learn. This right is implied by the Freedoms of Press and Speech, but it is not articulated explicitly as a constitutional right. Around the world, many nations have Freedom of Information Acts that require national and local governments to make information available to the public. Those acts were created when widespread access to information was rare. Libraries and archives were places where large amounts of information could be physically retrieved and governmental disclosure was paper-based. Universities and Governments were the largest aggregations of information, and they were the places you visited to get information.
But today, with the Internet, human beings have potential access to information without physical limits and it is that potential that must be enshrined in law as a basic human right. Every human being on the planet should have the right to access information freely and without threat of harm. Like Free Speech, that right should be defended even when the content of the information accessed is heinous and injurious to some. Any society or nation without the Freedom of Information as a basic human right is a place that can be controlled and manipulated.
According to Human Rights Watch, there are 40 nations around the world that restrict access to the Internet or Social Networks. Many of these nations also block satellite TV and other forms of communication. But even in Western Democracies, Information Access is controlled by cost, technology barriers, labor protections, and secrecy laws. Even the most advanced nations have huge regions without access to the Internet. And some nations now seek to tax content flowing over the Internet as a means to restrict trade and favor local providers.
This is not a question of commercial competition. This is a question of human progress. Where there are people unable to access information freely there are opportunities for oppression and abuse. Democracy and Freedom will not thrive or survive without the Freedom of Information. To be ill-informed and speak freely is a condition of intellectual slavery.
I believe that we must work to assert the Freedom of Information as a basic Human Right. It must be a 21st Century Goal to connect every human being on the planet to high quality trusted information. There should be no technical, political, cultural, or economic barriers to Information.
It should be as easy as air and as cheap as water, taken for granted and governed by statute in every nation around the world.
On September 14, David Bogoslaw published an article in BusinessWeek
entitled "How Banks Should Manage Risk." Rick Bookstaber and I are
quoted in this article because we first had an interview with David
following the XBRL Risk Taxonomy Meeting I hosted at the Levin
Institute in New York on May 13, and we had follow-up interviews two
weeks ago. As is the case with any press interview, some of what you
say gets printed and a lot doesn't. In this case, I think much of the
substance of what I told David was out of scope for the BusinessWeek
audience and the goals of his article.
In terms of a banking audience, David gets it all right, and I agree with
Rick Bookstaber's comments too. But what the article omits is the fact
that from 1999 to July of 2008 the US Congress, the White House, FHA,
the SEC, and the US Federal Reserve all participated in an
industry-backed weakening of the financial regulatory framework that
was built in the 1930's. In 1999, The Financial Services Modernization
Act (named Gramm-Leach-Bliley, or GLBA for short, after its authors)
removed 70-year-old restrictions on bank, investment bank, and insurance
cross-ownership. At the same time, derivative market oversight was
specifically excluded from GLBA and financial markets were allowed to
create and trade complex derivative instruments without regulatory
reporting or control.
In 2001, President Bush exhorted
Americans to "go shopping" to support the US economy following 9/11 and
the Federal Reserve obliged by cutting interest rates down to 1% to
pump liquidity into the US market. In 2004, Congress lobbied Fannie
Mae and Freddie Mac to relax underwriting guidelines on home loans to
allow sub-prime borrowers to participate in "The American Dream," and
own a home, and FHA provided loan subsidies to make it easier. In
2006, Congress pressured the same GSE's to relax underwriting on Alt-A
mortgages, allowing self-employed individuals to declare their income
with a signed affidavit instead of documenting their income through tax
filings. As I've written in past blogs, that change gave license to
mortgage fraud across the country as Alt-A borrowers could make wild
income declarations without validation and that pumped tens of
thousands of fraudulent mortgages into the global financial system.
This change wasn't reversed until July 2008, when the Federal Reserve
finally changed Alt-A underwriting guidelines. The long tail of the bad
mortgages underwritten from 2006 to 2008 means we will suffer
significant foreclosure rates well into 2011, extending the depth and
breadth of this recession.
2006 proved to be the top of the
Housing Market in terms of house valuations and bank fees generated
from loan securitization and derivative markup. The pile-on
legislation and market encouragement from Congress, the White House,
and the Federal Reserve came from industry pressure to keep the party
going as long as possible.
Yes, Banks took on too much risk
from 2001 to 2007. But the US Government encouraged and enabled
excessive risk taking during that period, and both need to be monitored
to prevent future crises. There is an inherent conflict of interest in expecting the government that enabled the current credit crises to participate in the forecasting and prevention of the next one.
There is a history of financial
de-regulation followed by marked innovation and crash that goes back
100 years in the US. The innovation generates enormous wealth on Wall
Street and new tax revenues for Federal, State, and Local Governments.
The relationship between government enablement and financial innovation
was omitted in David's account and needs closer scrutiny because
policy-makers, and the public, will need new information management tools to realize the
impact of incremental policy decisions on financial market performance
over the longer term to be able to regulate wisely in the future.
In the article, I recommended that the government create a new Regulatory
Information Architecture, modeled on the Information Sharing Councils
created by the Bush Administration for terrorism intelligence gathering
following the 9/11 Commission Report and the Intelligence Reform and
Terrorism Prevention Act (IRTPA) of 2004. But more is needed.
A year ago, I believed that new information technology and data
collection would enable the US Government to better analyze the
performance of financial markets and forecast potential bubbles and
crises. I'm sure that enhanced information sharing in the US
Government will enable better regulatory enforcement, but it's not
enough to prevent future crises. The public needs to play a role in
the oversight process because the Government has its own interests
which are not always perfectly aligned with those of the public.
Administrations change, and with those changes come new philosophies of
governing and regulation, and in a Democracy like ours you always want
to enable others to regard and report information that others disregard.
Therefore, what's needed is more information
transparency about market holdings and the actions of market
participants so that anyone in any firm, university, or industry
watchdog can analyze nearly the same macro and micro economic data that
federal regulators observe and make their own forecasts and
conclusions. Without public access to better market data, we are just enabling government to encourage risk taking more efficiently in the future.
You can read the businessweek article here: http://www.businessweek.com/print/investor/content/sep2009/pi20090914_336015.htm
Governance is a communication system for measuring complex needs, articulating a systemic response in Policy, and enforcing that policy. When I say it is a system, I mean that it is a social system abstracted from the people and psychologies that perform various Governance tasks. The people may come and go, but the system remains largely the same.
Smart Governance includes some additional dimensions that make the system evolutionary as well:
1. Dynamic methods for collecting and analyzing needs in an organization or society
2. Hierarchical, Market-based, or hybrid political models for integrating diverse points of view into the policy-making process
3. Diverse communication tools for integrating policies into a variety of business, IT, and social processes
4. Methods to measure policy outcomes, compare them to original needs, and re-define policy to meet new requirements
5. Solutions to measure systemic risks, capture mistakes and losses, and enhance organizational intelligence and Knowledge as a Shared Resource through constant systemic improvement.
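To make the evolutionary loop concrete, here is a hypothetical sketch of the five dimensions as an iterative cycle. Every function name here is an invented placeholder standing in for a real organizational process, not an API from any actual system:

```python
def smart_governance_cycle(needs_source, policy_maker, channels, evaluator,
                           rounds=3):
    """Hypothetical iterative Smart Governance loop:
    collect needs, integrate them into policy, communicate the policy,
    then measure outcomes against the original needs and revise.
    All callables are placeholders for real organizational processes."""
    policy, history = None, []
    for _ in range(rounds):
        needs = needs_source()                 # dimension 1: collect needs
        policy = policy_maker(policy, needs)   # dimension 2: integrate views
        for channel in channels:               # dimension 3: communicate
            channel(policy)
        outcomes = evaluator(policy, needs)    # dimensions 4-5: measure, learn
        history.append({"needs": needs, "policy": policy, "outcomes": outcomes})
    return history
```

The point of the sketch is only that the system is a loop: policy is never final, and each round's outcomes become inputs to the next round's policy-making.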
The goal of the System is to meet the needs of The Customer, without regard to governing ideology, personal psychology, or vested systemic interests, as well as to continuously diagnose deficiencies in the Smart Governance System, collect organizational knowledge, and improve over time.
Smart Governance is a challenge for human as well as IT systems. We all understand politics. Few understand Governance Theory as a Sociological System abstract from psychological and political practices. In this way, Governance Theory is a communication science more similar to computer science in its architecture and schematics.
The history of Governance is entwined with the history of governing ideology. But ideologies impose systemic order without regard to evolving customer needs or changing collective goals. In a Knowledge Society, governing system ideology is secondary to customer needs and collective goals. The constant pressure to improve outcomes in a globally competitive world, makes ideology a tool rather than a purpose of governance systems.
Hierarchical, Authoritarian, Democratic, Socialistic, and Market-based governing ideologies are all potentially useful systemic policy-making tools in different governing contexts when meeting customer and collective needs with systemic policy response. These ideologies, as communication methodologies, can be used interchangeably depending on the governing policy requirements.
In 1992, Francis Fukuyama wrote a famous book called The End of History and the Last Man, in which he forecast that the end of the Cold War would see Western Liberal Democracy become the predominant world governing system and that ideological struggle as a function of historical definition was dead. From a Governance Theory perspective, this thesis is hopelessly simplistic. Governing Ideology will cease to be a definition of history when companies, nation states and trans-national organizations liberate themselves from the confines of singular governing ideologies and tailor governing systemic tools (ideological communication instances) to meet ever changing policy needs of customer requirement and collective goals.
This morning, EU Regulators announced that they propose to create a Risk Board to monitor financial market performance and systemic risk indicators among the 27 member nations in the European Union. I've advocated a Council approach to risk-based decision-making since the beginning of this year, and I think the EU proposal is a good idea in concept. Unfortunately, in Europe it seems decision-making takes a large number of people, because the European proposal would have 63 people participating on the Risk Board. A deliberative body with 63 people is not a "Board" - it is a legislature. To complicate matters, "only" 32 members of this board would have voting rights. Unfortunately, the only power they can vote on is a warning to member states that some part of their market performance contains systemic risk. How they plan to determine that threat and get everyone to agree on what it means in any reasonable amount of time is not clear. My guess is that this is a proposal to set up an intra-governmental think-tank that will study issues, write economic reports that no one reads, and only threaten to issue warnings, because a vote on a warning will never happen.
Note to Obama Administration: If you want to create a Systemic Risk Regulatory Structure that is guaranteed to fail due to political indecision and lack of authority, copy the EU model.
ComplianceWeek covered the XBRL Risk Taxonomy Forum Meeting in NY last week with an excellent article enclosed here.
It is a longer article, but this is from the front page:

Using XBRL to Attack Systemic Risk
By Todd Neff — April 7, 2009
Already hard at work making Securities and Exchange Commission filings interactive, XBRL technology now finds itself at the heart of plans to save the U.S. financial system from future calamity.
A group of risk-management leaders in the financial industry has begun studying how XBRL might bring clarity and transparency to the murky world of financial risks, much the same way Corporate America has just begun using XBRL to bring more clarity to financial statements.
While any such system is a long way off, proponents say the technology is tailor-made to help regulators (and investors) root out hidden threats to corporate balance sheets before they, well, break the bank. XBRL could, for example, let a regulator peer through a bad debt line item and see the individual loans feeding it; that task would take hours of spreadsheet diving today.
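To make the "peer through a line item" idea concrete: XBRL is XML-based, so each reported figure is a machine-readable tagged fact rather than a number trapped in a PDF. The sketch below is a deliberately simplified, hypothetical instance (the tag names, contexts, and loan IDs are illustrative inventions, not the official XBRL taxonomy) showing how loan-level facts could reconcile to the summary line item a regulator sees first.

```python
# Hypothetical, simplified XBRL-style instance document.
# Tag names ("BadDebtExpense", "Loan") and IDs are illustrative only.
import xml.etree.ElementTree as ET

instance = """<report>
  <BadDebtExpense contextRef="Q1">175000</BadDebtExpense>
  <Loan id="L-001" status="delinquent">100000</Loan>
  <Loan id="L-002" status="delinquent">50000</Loan>
  <Loan id="L-003" status="delinquent">25000</Loan>
</report>"""

root = ET.fromstring(instance)

# The regulator's first view: the tagged summary fact...
summary = int(root.find("BadDebtExpense").text)

# ...and the drill-down: the individual loan facts feeding it.
loans = {loan.get("id"): int(loan.text) for loan in root.findall("Loan")}

# Because every fact is tagged, the rollup can be cross-checked mechanically.
assert summary == sum(loans.values())
print(summary, loans)
```

The point is not the five lines of parsing; it is that once the detail is tagged in a standard vocabulary, the reconciliation that "would take hours of spreadsheet diving today" becomes a trivial machine check.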
But XBRL could also do much more. Steven Adler, director of IBM Data Governance Solutions, says the computer language provides a standard vehicle for regulators to track not only weeks-old summary data, but also financial positions accruing across many banks and market segments. That would shed more light on systemic risks—which, left unchecked, can bring financial calamity of the sort we’re witnessing today.
Any potent XBRL-based scheme to report risks, however, would require the reporting of daily financial positions, a major shift in how trading firms, hedge funds, and investment banks do business. To that end, Adler’s IBM Data Governance Council is spearheading a movement that would change how investment banks and hedge funds interact with regulators.
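A rough sketch of what such daily cross-firm monitoring could look like, once positions arrive in a standard machine-readable form. Everything here is hypothetical: the firm names, exposures, segment labels, and the concentration threshold are made-up placeholders, not any regulator's actual methodology.

```python
# Toy cross-firm position monitor. All data and thresholds are hypothetical.
from collections import defaultdict

# Daily position reports as they might arrive from many firms (USD millions).
daily_positions = [
    {"firm": "Bank A", "segment": "subprime_mbs",   "exposure": 900},
    {"firm": "Bank B", "segment": "subprime_mbs",   "exposure": 700},
    {"firm": "Bank C", "segment": "sovereign_debt", "exposure": 300},
    {"firm": "Bank A", "segment": "sovereign_debt", "exposure": 100},
]

# Aggregate exposure by market segment across all reporting firms.
by_segment = defaultdict(int)
for pos in daily_positions:
    by_segment[pos["segment"]] += pos["exposure"]

# Flag segments where system-wide exposure crosses an arbitrary threshold --
# the kind of concentration no single firm's books would reveal on their own.
THRESHOLD = 1000
warnings = [seg for seg, total in by_segment.items() if total > THRESHOLD]
print(warnings)  # → ['subprime_mbs'] under these made-up numbers
```

The design point is that each individual bank's 900 or 700 looks manageable in isolation; only the aggregation across firms exposes the 1,600 concentration, which is exactly the visibility gap the daily-reporting proposal aims to close.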
“At this point, everybody is aware change is coming,” Adler says. “And parties would rather be in the room together talking about common solutions.”
A speech Federal Reserve Chairman Ben Bernanke delivered last month shows him to be in agreement. Bernanke advocated taking a “macro-prudential” approach to risks that are “cross-cutting,” affecting many firms and markets or concentrating in unhealthy ways. It would involve “monitoring large or rapidly increasing exposures—such as to sub-prime mortgages—across firms and markets.”
You can read the full article here.