Welcome to my Blog. I resisted writing a blog at IBM for many years. I have a short attention span, and just couldn't conceive that I would have anything interesting to write about for more than a few sentences a month. You may still not find what I write about to be that interesting, but as I get older my attention span seems to be growing so I am going to give this new medium a chance.
I have to begin my blog about Data Governance with a short history of Data Governance at IBM. It started in April 2004.
Customers were talking about new requirements and trends in the marketplace that involved Security, Privacy, Compliance, Data Management, Policy, Audit, and Organizational Structures all at once. I saw large banks and credit card issuers nervous about increasing rates of internet fraud and identity theft in their merchant supply chains, facing millions of potential exposures across their merchant networks.
There were brokerages concerned about data transformation for offshore application testing, and insurance companies aggregating all lines of business to create common customer records requiring new forms of data protection and access control. I heard confusion regarding global regulatory demands, cross-border data flows, institutional stovepipes, and growing cybercrime and identity theft. Data had always been treated like a redundant and valueless commodity, but it must have a value if governments are regulating its use and criminals are trying to steal it. It must be both an asset and a liability, though I don't think anyone in 2004 fully understood what that meant.
Through the spring, I visited many customers, and asked many questions. In June, I found myself in front of a CISO on Wall Street presenting the IBM Enterprise Privacy Architecture. I had about 30 charts prepared, and we had just updated EPA to version 2.0 so I was excited to share our latest ideas. At about the second chart, the CISO stopped me and said, "this is all very nice, but personal information is just one of our data types. We need a data architecture that embeds policy into business processes for all our data types. We want to discover the existing, unwritten policies that are part of our business culture that you'll learn about if you ride the elevators here for a day, as much as we want our employees to follow the new written policies we try to deploy every day. How do we do that?"
I didn't know the answer, but I did know that the stories and vignettes I was hearing pointed at new kinds of challenges that privacy, security, or data architectures alone would not solve. I went back to IBM to ask if we had any solutions for these problems, and heard from various camps that we had parts of these items covered. Peering beneath the assurances, I still saw gaps in understanding and capability, and volunteered to host an event to bring the two sides together - customers with new problems that had no name, and IBMers with old solutions that didn't fully apply.
In July 2004, I announced an IBM Security & Privacy Leadership Forum on Data Governance to be hosted at the Mohonk Mountain House in New Paltz, NY on October 6-8. The date was ideal, just in time for autumn foliage. The agenda involved three days of interactive dialog with customers on hard issues with no easy solutions. I didn't want a marketing and sales event. I wanted a Socratic dialog, an exploration of challenges, to define the outlines of a new marketplace. I wanted to bring IBMers and customers together in the same room exploring challenges, building a new partnership with our customers, and leading IBM in an outside-in design movement to align our capabilities with real customer requirements. Mohonk is a special location.
Built in 1878, the old Catskills resort sits at the lip of a lake carved out of a crater at the top of a mountain amidst 5,000 acres of wilderness. There are no TVs in the rooms, cell phone reception is bad, and at that time there was no internet connectivity. The air is sweet, the views are breathtaking, and the atmosphere quickly removes you from the reality of your work environment. It was the perfect location, and 120 people attended the 3-day event, including 60 representatives from companies across the world, 20 business partners, and 40 IBMers.
The original agenda file is too large to post, so I am posting some of the presentations instead. We divided the agenda into three workshops on Policy, Content, and Infrastructure. On the first day, we invited customers to describe their challenges in these categories. On the second and third days, we broke up into workshops and explored the issues. The Infrastructure discussions were dominated by business partners and technical issues. The challenges and solutions were the most concrete and least revelatory. The Policy discussions started well but quickly devolved into a debate about whether policies were made of rules or rules were made of policies. There was no resolution, and yet there was also no letting up in the debate. It raged for two days, and continued in email thereafter. The Content breakout took place in an attic conference room with bad acoustics. But the dialog was fantastic. We explored new data architectures, challenges with legacy document formats, storage, archiving, discovery, and reporting. The discussions were interactive and intellectual, and all the participants came home with new ideas and insights.
In the end, I think we had the wrong categories on the agenda. I had asked all the discussion moderators to bring issues and questions on their charts; many did, but many didn't. You can't really control every aspect of a presentation, or an event. But it didn't matter, because the most valuable part of the entire event was the social interaction: customers from many industries and geographies recognizing that their problems were not unique, that they were connected in a common fabric that demonstrated new market requirements, which every IBMer in the room heard loud and clear. And everyone accepted that the new name for these challenges was Data Governance.
Following the event, I received many letters of thanks from customers, partners, and IBMers. We clearly connected with many, and some customers asked if we could continue the dialog. They recognized that Data Governance was a common challenge that would require much more discussion and understanding. With that invitation, I formed the IBM Data Governance Council in November 2004 as a committee of industry peers dedicated to exploring common challenges and solutions in Data Governance. Key Bank, Merrill Lynch, Danske Bank, Bell Canada, and Deutsche Bank were the first members.
By the time we had our first Council meeting at the Ritz-Carlton Amelia Island in February 2005, we had 20 members. Since then, the Council has grown to about 55 members, and we've explored many issues and achieved many milestones. We meet four times a year, normally three times in North America and once in Europe. Some participants have attended every meeting. The meetings last two days, and over the course of the last three years we have built enormous social capital and trust in the Council, which has enabled us to collaborate across organizational stovepipes, among and between competitors, in ways that set an example of good governance.
In my next blogs, I will describe the road we took, the meeting contents, deliverables, and lessons learned.
Adler on Data Governance
On September 25th, I hosted an ISACA e-Symposium on Data Governance. ISACA reports there were close to 3000 people registered for the webinar. It was conducted live, over IP and VOIP. I gave an introductory presentation on Data Governance, and we had excellent presentations from Bank of Montreal, Key Bank, and Discover Financial.
This call represents the first time the Data Governance Maturity Model has been shared with such a large audience. Participant feedback was excellent. We had many great questions on the line, and hundreds more were sent in via email.
Enclosed are some of those questions and my answers (names withheld to protect privacy):
Question: "In your opinion, which one can be more cost effective and considered the best approach? A. Invest in the development of a Data Governance Programme as a separate entity. OR B. Leverage existing asset management processes, such as ITIL and ISO 27001, to accommodate Data Governance?"
Answer: Data Governance is more than asset management, and one of the key problems we've been trying to solve is more political than technical - it's how to get many different people from different disciplines to work together and solve complex problems. Some of these problems involve managing data as an asset, but some also involve managing the risks to the assets, being rigorous about policy definition, developing stewardship programs, storage and discovery rules, metadata and compliance.
I would always tell my customers that Data Governance is a new way of thinking about old problems and strategically should be integrated alongside existing models, like ITIL, that already work and are understood. But it isn't one or the other. It's both.
Question: "How do you impress top management with the importance of Data Governance?"
Answer: Data Governance is as much a political challenge as it is technical. Two things are going to get your program off the ground fast - acknowledged deficiencies across the organization that management can easily understand and quantify, or a determined sales program on your part to sell Data Governance as a solution that can solve many individual problems. In either case, start with executive interviews. The Data Governance Council Maturity Model can provide a framework for the interviews by segmenting issues and suggesting natural assessment questions. Whether you use the Maturity Model or develop your own questions, you have to do the legwork to discover their needs, classify the opportunities, and develop an internal sales strategy to rally your support.
Question: "data or information is intangible. Is there any specific model or method to quantify the value of data and information?"
Answer: The value of anything is determined by its price. There are many different ways of setting price. Markets set prices for stocks, bonds, and other financial instruments. Manufacturers set prices for cars, refrigerators, and so on. Countries set prices through taxes, trade duties, and government services. IT can set prices on data, which is not really intangible. You pay for it all the time: when you buy a newspaper, rent a DVD, or purchase software; even your monthly broadband bill is a contract for data services. What we have yet to accomplish in internal IT are more specific mechanisms to establish the value of data based on its utility, its demand, and ultimately the price end-users will pay for it.
Question: "You mentioned that 10 companies have adopted the Maturity Model in some form. Can you identify any of them and speak to the results they've experienced?"
Answer: No, I can't. Those companies will identify themselves when they are ready to go public. But I do hope that the Maturity Model, through venues such as ISACA, can inspire a broader public discussion about Data Governance and successful implementations.
Question: "Love the notion of allocating costs for Content Level Agreements & Alternative Risk Transfer Agreements! Why is this seen as a separate focus rather than being totally integrated with IT Governance (e.g., perhaps seen as extending some CobiT control objectives)?"
Answer: As I said in my presentation, IT today is run like a Command Economy, with projects centrally funded and managed and no real economic tools to modify user behavior regarding perceived value of IT or need to mitigate risk. Internal funding agreements like Content Level and Alternative Risk Transfer are new economic policy alternatives that business can use to price and sell data internally based on the business demand for quality, availability, integrity, and security, as well as "tax" business units for the losses they create. I hope businesses will begin to leverage economic tools like these to turn the IT department into a P&L Center, and represent the aggregate internal IT rates of return in the financial balance sheet.
Question: It is considered best practice to hold end users or local managers responsible for data accuracy - is data governance an attempt to centralise this concept?
Answer: I think Data Governance is ultimately successful when it pushes organizational responsibility and policy obligations out from the center to every employee. Look, we buy gas, pizza, clothing, and other consumer goods every day and we don't need to consult with Congress or carry law books with us everywhere we go to conduct those transactions in a lawful way. We have, as a society, learned to conduct business in lawful ways that are for the most part free of vice, corruption, and crime. We call this civilization, the rule of law, etc, and these are examples of self-governance. We need to ultimately achieve the same degree of self-governance in our own organizations, employees who all understand their obligations to govern their use of data appropriately.
Are we there yet today? Not in all cases, and we still need central institutions to create policy and push compliance out to the organization. But this is the goal we should strive for - delegated responsibility and accountability.
Question: "have you defined "standard" quantitative measures to assess data governance maturity or data quality?"
Answer: The Data Governance Maturity Model does define five levels of DG maturity, and insofar as those levels can be seen as quantitative, the answer is yes. In the real world, it's not so simple. Maturity is relative to peers in an industry, and what is considered a mature state today at, say, level 2 might be considered immature tomorrow. Ultimately, each company must determine on its own what the levels mean and what goals it needs to set to achieve maturity. We've discussed this many times in the Data Governance Council, especially on the topics of what counts as mature and how many categories of the Model everyone should use. In the end, we decided to let the market decide; the best thing we can do is collect implementation examples and share them with other practitioners, allowing everyone to pick and choose the categories and levels that best meet their needs and culture.
Question: "Not really a question... It would seem that the processes that transform data into information (or information into organizational knowledge) must also fall under the control of general data governance, since it is possible to take perfectly sound data and transform it into bad information."
Answer: I get this question often. Many people think that data sits in a database, and when human beings use it, it becomes Information. I personally think this is applying industrial assembly-line metaphors of production to information. In some ways this is a rather vain metaphor, because we humans like to think we improve on data when we transform it in our brains into information. We are better, after all, than mere machines. But of course, humans also degrade information, on a regular basis, when we use it. So data can also become pollution when put into human production. We have enormous stockpiles of data pollution throughout the internet. :-)
In the end, I don't think these distinctions add much value to the challenge of Governance.
Data = Information. These are synonymous terms from a policy perspective, because ultimately the data/information has to be stored someplace. And the policies we write are intended to govern how human beings, and computers as their tools, control that data/information wherever it is stored: in a database, on a web page, in a spreadsheet, in a video, on a printed page, or retained in a person's memory. Policy should apply, and stewards should enforce it, regardless of storage medium. What we should be more concerned about is metadata that describes more distinct attributes of data/information, like its quality, integrity, reliability, business uses, and past modifications. With these tools we can better apply policy to data wherever it resides, however it has been improved or degraded by humans and machines.
Question: " You have indicated that there are two avenues to pursue to obtain compliance, reward vs punishment. Which process have you found most effective or a combination of both for global enterprise?"
Answer: I don't think I called them reward vs. punishment. I think I said that a governing power has a few fundamental policy instruments: to make things cheaper, legal, or easier to do, or to make them more expensive, illegal, or painful to do. Both levers have pros and cons. And both have different effects on human behavior in different circumstances. I don't advocate one over the other. Human beings have to choose their policy tools and how each best fits their policy goals. Like our own Congress, trial, error, and evolutionary improvement are still the only model we can deploy to guide policy. In the future, however, I do hope we can develop better technology tools to help policy makers analyze different variables, model potential outcomes, determine the best policy mechanisms for each challenge, and measure results against forecasts.
Question: "Is data classification across the organization a key element for Data Governance?"
Answer: Emphatically YES! Most Data Classification is a blunt Security-based tool. We call data Top Secret, Secret, Classified, Public, etc, never indicating much about its business uses, quality, integrity, storage location, etc. We need business glossaries to understand business definitions and we need to link these to technical metadata to enable policymakers to search for policies, data assets, and exposures across our enterprise like we today search for news, ideas, and communication on internet search engines. We need a broader view of metadata and Data Classification and while business people may never fully understand this area of IT, we need to develop better tools to enable them to use it without having to understand it.
Question: "You cited USA laws and regulations. What about leveraging on different areas (Europe, Asia) where you have different ones for multinational public companies ? Besides what about financial risks, like different currencies and related fluctuation in outsourcing, offshore, etc.?"
Answer: You are right, all these are equally important issues. We are probably just mid-way through an IT regulatory cycle that began in earnest with the EU Data Protection Directive of 1995 and HIPAA in 1996. SOX, Basel II, PCI, SB1386, and so many more regulations are changing the nature of IT development and deployment. Just as at the dawn of the 20th century, when governments around the world passed industrial regulation, so too today are our countries grappling with the best way to regulate the impact of IT on our societies. I do wish that countries would make technology a cabinet-level policy position, because we need better IT advice in public policy-making.
Question: "what measure is put in place to encourage data governance and privacy law compliance in africa?"
Answer: Good question. I don't know. But I will look into that and write about it in a later blog.
Question: "Could you talk more about selling this approach to clients? What method do you use to persuade them not only to the general concepts, but also to really invest in going down this path?"
Answer: Most of the clients we deal with are already sold on the need for Data Governance. Three years ago, when we started the Data Governance Council, the number of believers was very small. That's why we organized the Data Governance Council - to gather together the innovators and early adopters and build a community that could learn from each other and synthesize that knowledge into methods the broader marketplace could adopt. The Data Governance Maturity Model is the product of this process, and I would encourage every company interested in Data Governance to explore its potential. While I can't publish its contents here, I will tell you that it is extremely detailed - 11 categories, with many sub-categories, all with 5 levels of maturity. It is an excellent tool to model a Data Governance program and benchmark internal practices against levels of maturity created by industry peers.
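To make the benchmarking idea concrete, here is a minimal sketch of what a category-by-category self-assessment against a 1-to-5 maturity scale could look like in code. The category names and target levels below are my own illustrative inventions, not the Council's actual 11 categories, which remain available only to members.

```python
from dataclasses import dataclass, field

# Hypothetical category names for illustration only -- the real Maturity
# Model's categories and sub-categories are published to Council members.
CATEGORIES = ["Data Risk Management", "Policy", "Stewardship", "Data Quality"]

@dataclass
class Assessment:
    """Self-assessed maturity level (1-5) per category."""
    levels: dict = field(default_factory=dict)

    def record(self, category: str, level: int) -> None:
        if not 1 <= level <= 5:
            raise ValueError("maturity levels run from 1 to 5")
        self.levels[category] = level

    def gaps(self, targets: dict) -> dict:
        """Categories where the current level falls short of the target,
        with the size of the shortfall. Unassessed categories default to 1."""
        return {c: targets[c] - self.levels.get(c, 1)
                for c in targets
                if self.levels.get(c, 1) < targets[c]}

current = Assessment()
current.record("Policy", 2)
current.record("Stewardship", 1)

targets = {"Policy": 4, "Stewardship": 3}
print(current.gaps(targets))  # {'Policy': 2, 'Stewardship': 2}
```

The point of the sketch is the shape of the exercise, not the numbers: each company sets its own targets, and the gap report becomes the agenda for the governance program.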
Question: "Could you provide a link to the Data Governance Council?"
Question: "Any example companies that have implemented an ART approach to charging user departments for risky IT behavior and how has that gone?"
Answer: Any large bank complying with Basel II and using the Advanced Measurement Approach for operational risk calculation already has the methods in place to create an internal market for Alternative Risk Transfer. It could even, potentially, set up its own Self-Insured Retention to "pay" out internal losses based on the "premiums" collected from its organizational stakeholders. In reality, every company already self-insures against its own IT and operational losses. The problem is often that these losses are not recorded in a systematic way, the information is not analyzed to detect loss patterns, and few organizations have the actuarial mechanisms to leverage their loss data to forecast future exposures. But all of this is business as usual for any large E&O insurer or Basel II-conforming bank.
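As a toy illustration of the mechanics (my own construction, not a Council or Basel II method), here is one way recorded operational losses per business unit could be turned into internal "premiums": each unit's share of next year's self-insured retention pool is set in proportion to its share of recorded losses. All figures are made up.

```python
# Hypothetical recorded operational losses per business unit, in dollars.
recorded_losses = {
    "retail_banking": 1_200_000,
    "cards": 300_000,
    "wealth_management": 500_000,
}

def levy_premiums(losses: dict, retention_pool: float) -> dict:
    """Allocate the self-insured retention pool across business units
    in proportion to each unit's share of total recorded losses."""
    total = sum(losses.values())
    return {unit: round(retention_pool * amount / total, 2)
            for unit, amount in losses.items()}

premiums = levy_premiums(recorded_losses, retention_pool=1_000_000)
print(premiums)
# {'retail_banking': 600000.0, 'cards': 150000.0, 'wealth_management': 250000.0}
```

A real scheme would weight forecast exposure, not just historical losses, but even this simple proportional levy makes the incentive visible: the unit creating the losses pays for them.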
Question: "How to justify penalizing business for data incidents as the common perception is that IT department is responsible for taking care of data?"
Answer: Anyone carrying a Blackberry with customer data should be responsible for taking care of data. Data doesn't just live on a green-screen connected to a massive mainframe any longer. It's mobile, it's everywhere. And every employee is creating it and exposing it to value and harm. That's why Governance is a group activity involving stakeholders from IT and Business. Everyone is responsible, and therefore you need everyone involved.
Question: "How is the model accessible? Is it possible to buy/download it somehow?"
Answer: Not yet. We'll have to look into that.
Question: "Which organizational model is best suited for Data Governance?"
Answer: In the short run, it's the one that best fits your organizational culture. In the long run, in globally integrated enterprises with employees in every timezone, working from home or on the road, I think we will need more distributed organizational models and I look forward to inventing that next.
Question: "Would COBIT be an appropriate reference to implement data governance in terms of how to?"
Answer: COBIT would be an excellent reference if that's what your company is already using. We have so many alphabet standards today that don't talk to each other. When you implement Data Governance in your company, try to bridge reference standards as you also try to bridge organizational stovepipes. They have the same effects to divide and separate people and what you need is to bring people together.
Question: "With regard to using ART, how do you avoid the pitfalls of departments getting into 'fingerpointing' arguments with one another, where more resources are spent on blaming each other for the cause of the data integrity/quality issue rather than actually addressing the root cause?"
Answer: Let each department determine its own root causes for loss. What you care about is levying the financial premium for the loss. The payment itself is an incentive to fix the problem.
Question: "Have you distinguished the difference between data and information in the studies you have conducted? Data becomes information when it is synthesized or crunched in a system and then reported as information. ....Data in...Information Out... Where is the starting point of governing data and when do other IT governance models take over? When data becomes information? Thank you."
Answer: We've discussed this distinction many times in the Data Governance Council and we've always agreed that Data and Information are synonymous. The way you phrased your question, however, makes me realize that you are applying an industrial production metaphor to data/information usage. It's like raw materials entering an assembly line with finished product popping out the back.
But people and IT systems don't use data/information in this way. If you take a data element out of a database, crunch it in a spreadsheet, send it to colleagues for interpretation, and turn it into a PowerPoint, this "information" is still data stored in a spreadsheet cell, email, or presentation chart. It is structured or unstructured information. From an asset and liability perspective, the values may change, and therefore we may qualify the asset with new metadata, but the way we write policies to govern human usage of stored data/information is more sensitive to storage content and usage context than to any front-end description of it as data or information.
So, my personal view is that the data vs information distinction doesn't add any value to the challenge of governance. It's what is in the container and the intent of the user that are more important to Data Governance.
Question: "With content level agreements does data confidentiality have any role with the objectives?"
Answer: Yes. If the sensitivity of data has a higher business utility then an end-user is likely to pay more for it. The extra premium for the higher sensitivity would pay for the additional security needed to protect the data in the agreement. This is how you can get end-users to pay for, and appreciate fully, the value of data and security.
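The pricing logic in this answer can be sketched in a few lines. This is a hypothetical model of my own making, not a published Content Level Agreement formula: the internal price of a data feed is a base rate plus premiums for the sensitivity and availability terms the consumer asks for, so the extra security is paid for by the user who demands the sensitive data. The multipliers are illustrative only.

```python
# Illustrative premium multipliers -- real terms would be negotiated per agreement.
SENSITIVITY_PREMIUM = {"public": 0.0, "internal": 0.10, "confidential": 0.35}
AVAILABILITY_PREMIUM = {"best_effort": 0.0, "business_hours": 0.15, "24x7": 0.40}

def cla_price(base_rate: float, sensitivity: str, availability: str) -> float:
    """Monthly internal price: base rate scaled by the sum of the premiums
    for each service term in the Content Level Agreement."""
    multiplier = 1.0 + SENSITIVITY_PREMIUM[sensitivity] + AVAILABILITY_PREMIUM[availability]
    return round(base_rate * multiplier, 2)

# A confidential, always-on feed costs 75% more than the base rate.
print(cla_price(1000.0, "confidential", "24x7"))  # 1750.0
```

The design point is that confidentiality stops being an unfunded IT mandate: the consumer who values the sensitive feed funds its protection through the premium.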
Question: "In point 2, how do we assess our situation? In benchmarking, how can governance take decisions in a fluctuating legal environment, since an organization is affected by the global regulatory environment?"
Answer: Assessments are a snapshot of your organization in time. Don't let the snapshots get old and faded. Make self-assessment a normal part of every new business process, and re-assess yourself on macro topics on a regular basis. In this way, you can stay on top of ever changing global business regulations and requirements.
Question: "Do you see Data Governance as a process that creates a burden on existing resources, or an investment in the future? This may sound like a silly question, but a lot of organisations are reluctant to change and see Data Governance as an additional cost on people's time."
Answer: Every organization is already Governing Data:
A. They don't know it.
B. They are doing it badly.
The burden is already there, uncounted. Count up how much it's costing not to know how to govern data effectively, and you will make your business case for change. Bring the change on slowly, integrate it into governance models already under way, and you will achieve a higher comfort factor with your changes.
Question: "Is there a checklist available for DG self assessment to identify gaps and also for implementing them?"
Answer: The Data Governance Maturity Model provides that kind of self-assessment checklist, and it is available to members of the Data Governance Council. Information on how to become a Council member is available on this website: www.ibm.com/itsolutions/datagovernance.
Question: "Sorry, but I missed the explanation of the constituents of the members of the Data Governance Council, and who's sponsoring it?"
Answer: IBM sponsors and runs the Data Governance Council, and membership information can be found on the website posted above.
Question: "Does this governance structure and process require full-time staff to implement, monitor, and measure success? If so, how many FTEs would be recommended for an organization of 10,000 employees?"
Answer: Yes. Many organizations today are investing in various Stewardship programs to provide full-time staff to implement governing policies and monitor results. These are your organizational doers, and while they can be part-time in early DG pilots to get your program off the ground, a DG program will require full-time Stewards to be effective. Start small and grow fast. Let your stewardship headcount be proportional to the value it can create. Measure that value through data quality, process efficiency, and risk mitigation metrics. Report it often.
Question: "Please explain Value Creation with reference to data governance?"
Answer: Value Creation is several things:
A. A measure of the value created through the use and enhancement of data for your business bottom line.
B. The yardstick of performance of your Data Governance program overall.
We are Governing Data to create more value and we want to measure and report it on a frequent basis. I hear many people get caught up in complex qualitative measures of data, metadata, and value. Keep it simple. Measure productivity, labor saved, more efficient business processes, higher customer satisfaction, increases in revenue, reduced risk. These things are quantitatively measurable. If you don't know how to measure them, ask your CFO for guidance.
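In the spirit of "keep it simple," here is a minimal sketch of the kind of CFO-friendly quarterly tally the answer above recommends. All figures and metric names are made up for illustration; the point is that each line item is something your finance team can already count.

```python
# Hypothetical quarterly metrics for a Data Governance program.
quarterly_metrics = {
    "labor_hours_saved": 1_400,        # fewer manual data corrections
    "loaded_hourly_rate": 85.0,        # dollars per labor hour
    "incremental_revenue": 250_000.0,  # e.g. from faster customer onboarding
    "avoided_loss": 120_000.0,         # incidents prevented, quantified with risk team
}

def value_created(m: dict) -> float:
    """Dollar value created this quarter: labor savings plus
    incremental revenue plus avoided losses."""
    return (m["labor_hours_saved"] * m["loaded_hourly_rate"]
            + m["incremental_revenue"]
            + m["avoided_loss"])

print(value_created(quarterly_metrics))  # 489000.0
```

One number, reported every quarter, does more for a governance program's credibility than any elaborate qualitative scoring scheme.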
Question: "Is Data Governance in any way connected to corporate, enterprise, and IT governance?"
Answer: Yes, Data and IT Governance describe new forms of corporate governance below the board level. All these governing bodies should use similar policy processes, have the same kinds of roles and responsibilities, and have well-defined agendas and reporting rules, with common charters that contain similar language. What you want is a system of governance in which the people may change but the powers remain the same. If you are creating governing structures that all have different charters, roles, and structures, you are creating complexity, and your governance programs will fail.
Question: "IT Governance vs. Data Governance ... do you all consider this same thing?"
Answer: Same answer as per above. I don't consider them the same thing, but I do consider them different parts of a similar problem. IT is no longer a back-office function with no front-office dependencies. In many companies, IT is the front display window, the main method for interacting with customers, the brand a customer sees when they first contact the organization. Governing the human use of IT assets has become a central challenge in many organizations, and IT and Data Governance are different approaches to common challenges.
Question: "Can you recommend tools that may be available in the market for data governance assessments / maturity modeling?"
Answer: IBM provides data governance consulting and assessment services and a wide range of software tools.
More information can be found here: http://www-306.ibm.com/software/tivoli/governance/servicemanagement/data-governance.html
I'm not normally a reader of USA Today. The colors and quick articles always feel superficial and empty to me. But I was caught in a hotel restaurant this evening, on the last night of a long road-trip, and the only intellectual distraction was the pastel paper left in my hotel room. Some normal blather on the front page, and most of the paper was forgettable. But on the last page, there was an opinion piece by Alan Webber which perfectly described my own fears and experiences travelling abroad, looking back at America through the eyes of our long-despairing friends in Europe. I commend this article to anyone who cares deeply about how far we have drifted from what we once were:
I flew to Las Vegas on Sunday evening. It had been a few weeks since I had last boarded an airplane, and I was excited to go anywhere, even Las Vegas. The flight out was great, but the airport was like Shinjuku Station at 7am when I landed at 11pm. One-legged trolls must be hand-carrying the bags from the planes to baggage claim, because it took a nervous and smoky hour to reclaim my luggage. Welcome to Las Vegas.
I was there to participate in the IBM Information on Demand Conference. This event was so good I completely forgot how much I despise that sunny city. I really don't know why anyone needs to have a gambling mecca in a city with nice weather because the hotels make it practically impossible to even see the sun on most days.
That aside, IOD was terrific. The sessions were excellent. I spoke in so many different sessions on Data Governance that I barely had time to hear other speakers. My sessions were packed with passionate participants. I took home fistfuls of business cards and was deeply impressed with the knowledge and interest of my audiences. There were some, to be sure, who weren't doing Data Governance yet, but they were the minority, and the questions and interest demonstrated to me that Data Governance is now a market that is aware of itself. Things are happening independent of IBM. The demand is there across industries. It is our opportunity to lose; customers and business partners are keenly interested in joining the Data Governance Council and learning about the Maturity Model.
For me, the event was a vindication of all the hard work the Council has put in over the past three years. My congratulations to the entire IBM IOD team who put on this excellent event.
On November 2nd, I attended a Law School advisory board meeting in Koblenz. The chair of our board is a senior executive of UBS Germany and another board member is the head of M&A for Deutsche Bank. While the topic of international banking was quite far from our subject, we quickly spilled over into discussions about risk, globalization, loss and reputation. The sub-prime credit squeeze is affecting financial institutions worldwide, and what I discovered is that frustration in Europe on this topic runs high, as many feel hostage to economic policy made far from their shores. Economic interdependence is not new. Every day, stock markets rise and fall based on market sentiments in New York, Hong Kong, Frankfurt, London, and Shanghai. What makes this crisis different is that the debt purchased was well rated by US rating agencies and eagerly acquired by banks across the globe. Today, those debt instruments have few buyers, creating piles of worthless debt that is being written off balance sheets across the globe, resulting in one of the first global financial crises of the 21st Century.
The recent Fed rate cut seemed to underscore the limits of unilateral action in a global economy. As capital moves with ease across borders, economic policy will, in my opinion, need to demonstrate equally fluid international action. Of course, central banks already have some of these relationships for crisis management, but economic policy has a fiscal dimension, and G8 summits are too irregular for ongoing stewardship. Consistent economic growth on a global level may require new international governance structures combining central banks and governments so that all levers of policy can be adjusted to better balance money supply, regulatory controls, tax and subsidy without compromising national competition.
Such ideas may threaten isolationist neocons, and may take more crises to instigate, but we can't say we have a global economy if our policy tools are mostly national. Financial disruptions like those witnessed in the past months will continue, just as national liquidity and credit crises caused runs and convulsions at the dawn of the 20th century. Those crises resulted in the formation of the US Federal Reserve Bank, a central governance mechanism to control the money supply as a means to steward national economic stability.
We are just beginning to understand the role of Governance in world affairs.
This week, I attended the Global Forum 2007 in Venice, Italy. I am a member of the Global Forum Steering Committee, and through the Data Governance Council IBM was a sponsor of this year's event. The meeting took place on a private island off the Grand Canal in Venice. The island had been home to a monastery, which is now used by the Giorgio Cini Institute for Music and Art. The meeting rooms were spectacular, with scores of Veronese paintings adorning the walls, columned cloisters, and magnificent paneled rooms. Participants included the Mayor of Venice, the former Prime Minister of France, the Chancellor of Geneva, commissioners from the FTC and FCC, presidents of universities, and 5 members of the IBM Data Governance Council...
This was my third Global Forum event. I chaired a panel on Data Governance, and gave a brief presentation on global competition, innovation, and governance. Both were extremely well received, with many commenting that our panel on Data Governance was the most substantive and interesting of the conference. I owe a special thanks to Ed Keck, Richard Livesly, Cengiz Barlas, Paul Welti, and Jacques Bus for their fantastic presentations. On the evening of the first night of the event we had a private chamber concert at the Venice Opera House, a beautiful gilded colosseum. Following the concert (Haydn and Tchaikovsky), there was a dinner in a private dining room with a pianist. It was lovely and inspiring. I dined with the very charming CIO of San Francisco and the Deputy Mayor of Paris.
At the end of each year's Global Forum I am reminded of how very complex and difficult such a conference is to organize. It is not just the fantastic venues or beautiful entertainment and dinner. Most importantly it is the arrangement of all the various interests and specialties that such a global network brings together in one spot for two days. This is no small feat, and I have learned from the event's very special host, Sylvianne Toporkoff, that networking is an art and she is the absolute master.
This year's Global Forum was a triumph. It reached across the international divide and brought more leaders from Asia and the US, business and government, than any past event I have attended. But what I also saw this year was more transatlantic tension, misunderstanding, and competition than before. For this German/Danish-American, those are troubling trends indeed. They simmer below the surface, and come out in subtle phrases and indirect cuts. But they are there and they threaten many things we all believe in. Next year I hope the Global Forum will take these issues on thematically, because it is only through direct discussion that substantive understanding can be reached. The truth is, we need more global forums, as the world is growing ever more competitive.
Everywhere I go, I meet companies building Data Governance Boards, assessing their situations, creating strategies, and looking for solutions. Data Governance today is the most prevalent form of Governance below the Board of Directors, and the only form that often brings together IT and Business leaders in one continuous dialog. Three years ago, when we first adopted the title "Data Governance" to describe the pending convergence of Risk Management, Security, Data Quality, ILM, and Compliance, no one understood what the two words put together meant. I often heard "Data What?" in response to my presentations at conferences. Today, however, there is no question that the conjunction of "Data" and "Governance" defines an exciting new marketplace of common challenges, new ideas, and exciting solution opportunities.
But the future of Data Governance depends more on the vitality of the political institutions now being formed. Data is the easy part; it is Governing that is hard. In the coming years, many solution architects and IT consultants will focus on quick Data solutions, and most Governing Boards will flounder without technology support that creates an institutional framework for Governance.
We in the IT industry will focus 80% of our solution sales on tools to govern data, but it is data to help humans govern that is a far more pressing need. In every industry (especially public sector) we need better Governance solutions that help human beings analyze massive amounts of operational information, assess the quality and value of information and use Risk Calculation to forecast options and make decisions.
What are needed are new solutions to help organizations, large and small, govern more efficiently. Our businesses and our governments need these tools, and technology has an important role to play in helping organizations transform from industrial to information models of production and value creation.
Those solutions should help democratize the governance process, create new levels of organizational transparency, help forecast and model potential outcomes, capture and communicate key policy decisions and compare them to results.
In April 2007, the International Monetary Fund revised its Code of Good Practices on Fiscal Transparency, a set of recommended practices for governments around the world. The four pillars of the Code are:
- clarity of roles and responsibilities,
- open budget processes,
- public availability of information, and
- assurances of integrity and data quality.
While the Code was written for governments, Fiscal Transparency has many market benefits to businesses too. Companies building Data Governance Boards might want to review this Code of Practice as guidance for constructing functional and operational principles in their Charter. It shouldn't be lifted literally, but there are many good ideas here that can be applied.
Recently a colleague sent me a very interesting article written by Wim Van Grembergen of the IT Governance Institute, entitled "The Balanced Scorecard and IT Governance." You can find this paper in PDF format here: http://studies.hec.fr/object/SEC/file/A/WPNAQMCMKBJNBLEYHCGTKWNNXVBNNMFG/balscorecard&IT%20governance.pdf
I recommend reading it because it provides a high-level introduction to the topic of Governance from a distinctly IT perspective. What follows is my own critique of the thesis of this paper and why I think it points IT in the wrong direction.
Overall, I think this is an interesting paper with some inaccurate interpretations. It is more of a Balanced Survey than a Balanced Scorecard and it assumes a very hierarchical, industrial, organizational structure in Governance that I think is at odds with the way organizations really function in the post-industrial Information Economy.
In the industrial model, the flashlight of corporate information shines up. Those closest to the bulb are blinded and only those at the top can see the light at its widest aperture. That is a model for control, not innovation. Control was incredibly important in the industrial age because production cycles were long and stability of resource and labor supply was critical. That was an age of Monopoly and Economy of Scale.
Today, we live in an age of intensifying global competition, shrinking product lifecycles, very low barriers to market entry, and enormous complexity in our financial, consumer, and internal markets. Governance has emerged as an organizing force below the Boardroom because every employee, customer, business partner, and associate is a potential source of innovation and innovation thrives when the light of information is spread evenly across an organization and everyone can appreciate its radiance.
Net: We are experiencing the beginnings of great changes in the modern corporation, and Governance (Data, IT, SOA, Whatever) below the board is an early manifestation of this emerging trend. It is good that others are recognizing Governance as a legitimate management discipline and writing about it, but the approach described in Van Grembergen's paper is rooted in industrial models of organization that are already giving way as corporations and nations adjust to the Information Age that is already upon us.
I gave a speech at the TeleStrategies' ISS World 2007 Conference (http://www.telestrategies.com/ISS_WASH/index.htm) in Alexandria, VA yesterday. The conference topic was Data Fusion - the use of data mining technologies for law enforcement and anti-terrorism. I spoke at a similar conference two years ago and was looking forward to meeting with that group again, but this crowd was far different and the industry has matured rapidly. Two years ago, I confronted a room of 800 regional law enforcement officials from 48 data fusion centers in the US. And I was on the agenda presenting Data Privacy. It was awkward at first, but after a few minutes we had fun together talking about the push and pull of government data mining and the protection of privacy and civil liberties. It was a group of concerned citizens trying to harness new technologies to make law enforcement more efficient, but each also had their own individual concerns about how their work might endanger US privacy rights. So we found common ground, and both this presenter and the audience learned something from the exchange.
This year, I confronted a small conference room with 45 people from military contractors, DIA, CIA, DHS, and a bunch of Israelis who were pretty reticent about what they did for a living. My topic was Data Governance, and aside from some technical questions from a guy working for Raytheon, no one else in the room seemed terribly interested.
In the expo hall, I discovered why. Congress and Washington's privacy elite might think that debating warrantless wiretaps and FISA Court obfuscations is vital to preserving data privacy, but what I saw in the expo room in Alexandria persuaded me that the discussion is public posturing at best, or a charade for the ignorant at worst. The privacy cat is out of the bag, and the data fusion industry has found many market-oriented, privatized, and convenient workarounds to do what they think needs getting done with very little judicial, congressional, or constitutional oversight.
Case in point: I met a company called Spectronic. They make a communication interception technology that uses cell phone triangulation technology first developed by carriers for mobile 911 service. They sell it to public and private security services for communication monitoring during events. I can appreciate the utility of this technology.
The impetus for developing it lies in the fact that working with courts and telecom carriers is a tad slow and inconvenient where criminals and terrorists are concerned. So instead of relying on our beloved privacy-preserving telecom carriers to provide triangulation and tracking of suspect cell phones at an event, law enforcement can entirely avoid that unpleasant process. They can just purchase a few of these oven-shaped boxes from Spectronic, deploy them along the perimeter of any event, and instantly watch everyone's cell phone voice, data, email, location, etc., dragnet style.
Very efficient, and this was just one booth in a hall full of spy toys and spies...
Privacy Professionals take note. The rapacious marketplace, burgeoning Homeland Security budgets, and the privatization of government are making our efforts vain at best.
As I left the Spectronic booth the very nice sales rep shrugged her shoulders and told me "it's the world we live in..."
I replied "no, it's the world we create."
On September 18, 2007 the US Federal Reserve cut the Federal Funds rate by half a percent in response to the looming sub-prime loan scandal. The markets had lost confidence and banks were holding debt they could not sell. Write-offs ensued, and the market forecast looked questionable at best.
At the time, this rate cut was seen as a dramatic response to worsening market conditions and proof that the Fed would act aggressively to protect the economy from the housing bubble. In the next two months, the Fed intervened again to cut rates 0.25% in October and 0.25% again in December. Each rate cut was seen as a prudent response to market conditions.
In January 2008, just a few weeks after the last rate cut, the Fed had to intervene again with a very sudden 1.25% cumulative rate cut to stem an Asian-driven equity market sell-off following more sub-prime write-offs and loss disclosures. In just five months, the Federal Reserve had to intervene five times with a combined interest rate cut of 2.25%, following 17 quarter-point rate increases over the prior two years.
This was an incredible see-saw of macro-economic policy - gradual rate increases were followed immediately by sudden rate cuts. In hindsight, the half-point cut in September 2007 was not very dramatic in comparison to the 1.75% in cuts that followed in the next four months. No one then could have foreseen the volatility in the markets that was to come, or could they have?
Why is it that the US Federal Reserve rate policy was reactive to market volatility? Why didn't their monetary policy, which had run up rates from 1% in June 2003 to 5.25% in June 2006, anticipate the looming housing bubble and bank losses that would surely ensue? Hadn't Alan Greenspan warned of this outcome in 2005? Didn't we all know the housing joyride would end at some point?
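The see-saw of rate moves described above is easy to tally. A minimal sketch, using only the figures given in the text (the variable names are my own):

```python
# Sanity-check the rate moves described above. Figures are as given in the
# text; the January 2008 entry combines two cuts totaling 1.25%.
cuts = [0.50, 0.25, 0.25, 1.25]  # Sep 2007, Oct 2007, Dec 2007, Jan 2008
total_cut = sum(cuts)            # combined cuts over five months

hikes = 17 * 0.25                # 17 quarter-point increases
peak_rate = 1.00 + hikes         # from 1% (June 2003) to the June 2006 peak

print(total_cut, peak_rate)      # 2.25 and 5.25, matching the narrative
```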
Today, we can see banking and financial market data that shows the risk trends in our rear-view mirror. Unfortunately, no one has a mirror that forecasts the future, but they could if capitalized risk data were collected on a systemic basis by banks and shared with the Federal Reserve. The Federal Reserve does an excellent job of studying catastrophic risks and running sophisticated macroeconomic loss models on everything from terrorist attacks to coastal hurricanes. The Fed uses this catastrophic loss data to capitalize insurance loss reserves for the US economy - i.e., they print more money when very bad things happen.
The insurance reserves got tapped after 9/11 and Hurricane Katrina, when the Fed injected huge amounts of liquidity into the economy to stabilize markets and restore confidence. Of course, the timing of catastrophic events can't be forecast, but the monetary response can be estimated based on a variety of risk factors. The Fed constantly analyzes and war-games these risk factors, and the success of Fed liquidity and monetary responses to 9/11 and Katrina attests to the diligence of their planning and the value of risk-based forecasting models.
What does this have to do with the sub-prime loan meltdown you ask? Well, if the Fed had non-catastrophic risk-data forecasting models they could possibly pre-empt loss events with macroeconomic policy tools that could even out some of the worst aspects of the business cycle. Unfortunately, that kind of non-catastrophic risk-data has to come from banks, who until recently were totally incapable of providing that kind of data, let alone using it themselves for their own risk-based policy-making.
That's changing. In the last two years banks around the world have been working to assess and capitalize market, credit, and operational risks as part of the Basel II compliance process. That data isn't normalized across banks, and there are wide disparities in how risks are assessed, calculated, and capitalized from bank to bank, country to country. But the raw data, and the beginnings of the know-how, are, for the first time in history, there. And that data and know-how can be leveraged to provide new macroeconomic tools for Central Bank policymakers around the world.
What's needed are standards in risk assessment, classification, calculation, and the reporting of capitalized risk data from US banks to the Federal Reserve. This may take some years yet to accomplish, but the time is right to begin discussing these issues. As US banks reach Basel II compliance they will be in a position to leverage risk data for their own self-insurance against non-catastrophic losses, and if they were willing to share their capitalized risk data they could help the Federal Reserve reduce market volatility and improve macroeconomic performance for everyone.
Here's a case where regulatory compliance really can improve business performance.
On February 27-29, I hosted the 15th meeting of the Data Governance Council at the Hotel Wales in New York City. 31 people registered to attend this meeting, including 16 IBMers and representatives from JPMC, Bank of Tokyo/Mitsubishi, Bank of Montreal, Key Bank, State Street, MasterCard, American Express, OpenPages, Axentis, Varonis, and Vericept.
On the first day, we had excellent keynote presentations from Garrick Utley, President of the Levin Institute, and Will Pelgrin, Director of the NYS Cybercrime Taskforce. We also had some good roundtable discussions on common challenges in Data Governance related to Sub-prime, Basel II, and other issues. On the second day, we continued discussing common challenges and reviewed IBM Data Governance Solutions with regards to Policy and Process Management, Data Modeling and Development, MDM, Metadata and Data Quality Management. On the last day, we left the agenda and had a long discussion on the future of the Council. Cal Braunstein rounded out the event with an excellent closing keynote on the risks to and from Data, and the risks to organizations from data we can't trust.
We spent a lot of time talking about Globalization and its effects on competition, regulation, cybercrime, and risk. Globalization is having a corrosive effect on trust in many organizations. Pressure from regulations requiring oversight and reporting of employee use of IT increases distrust at all levels. Cybercrime and the increasing financial value of data challenge everyone with offers and scams that make it hard to trust information. These factors are creating internal crises in trust and confidence. The manipulation and monitoring of information technology by people over other people threatens the quality and value of decision-making at a time when global competition brutally punishes bad decisions.
The Globalization of threats, risk, regulation, and competition will immediately force organizational decision-making inward, towards hierarchical models of decision-making, even as the globalization of markets, labor and resource allocation forces more horizontal changes in culture, lifestyle, and freedom.
This Council has existed for three years, and many members, by virtue of their participation, have achieved more mature levels of Data Governance. They have cross-organizational governance models, better transparency, and better decision-making. Many newer members are just now exploring organizational models, business vs IT participation, the nature of Stewardship, and the complexities of overcoming organizational stovepipes.
Here are my notes and observations from this landmark meeting:
1. Data Governance Market Maturity: Data Governance as a market is maturing from the Innovator phase, where a few leading companies worked together to blaze a trail for others to follow, to the Early Adopter phase. We are clearly seeing some leading companies succeed with Data Governance, thanks in part to the Data Governance Maturity Model, and many, many more now coming into this market looking to build on the success and experience of the innovators.
For those of us who were pioneers, this is a time of change, and we must adapt to a new market constituency requiring education and solutions, with somewhat less tolerance for discovery and invention. The Data Governance Starter's Guide should be updated as an educational onboarding tutorial for new companies seeking Data Governance success. For vendors, this is a time to study solution packaging and focus on the support needs of the stewardship community. Stewardship is a profession still in its infancy, and it requires practitioner tools, education, and community forums to exchange practices and success stories.
We should all be proud that our contributions have moved the market to this new phase, and the Council needs to change to grow with the market.
2. IBM Data Governance Solutions: IBM has come a long way in its Data Governance Solution capabilities since 2006, which was the last time we had a major showcase of technologies on the Council Agenda. Most of our solutions - Compliance Warehouse, Integrated Data Management, MDM and Industry Models, Data Quality and Metadata tools - were very well received. But this Council has succeeded exactly because it is not a normal IBM Customer Advisory Board, where normal meetings are dominated by IBM solution exhibitions. Rather, it has succeeded as a unique forum for practitioner exchanges, and it must remain this way to continue.
Future meetings will be shorter, practitioner driven, and IBM will find additional venues to present Data Governance solutions.
3. Globalization: At Mohonk in 2004, at the inaugural Data Governance Summit, I presented some ideas about how information technology would transform the modern corporation, and how integral Data Governance would be to that process. I was heavily influenced by Tom Malone and his book The Future of Work, and also by the history of industrial regulation at the dawn of the 20th Century.
In NY, we re-examined some of these topics through presentations from Garrick Utley, Will Pelgrin, and Cal Braunstein, and I think we need to continue examining how the global pressures on information technology, regulation, cybercrime, risk, and transparency will impact Data Governance and organizational behavior. Many companies that have embraced Data Governance have stopped short of embracing cross-organizational governance bodies with real authority. Most don't know which models to follow, which examples of success to emulate, or how it should work.
In my travels I've seen many governance models in corporate and national entities that offer some hope to modern organizations, and I think we ought to be the Council that inventories these models, compares their pros and cons, and presents alternatives to hierarchical organization.
4. Data Risk Standards: In the Shang Dynasty in China, rulers practiced risk-based decision making by consulting an oracle, who dropped an ox hip bone on the floor and deciphered the direction of the crack in the bone as indicative of divine truth. If the crack pointed up, you had good favor for your decision; down, well, you had better ask again. People consulted the oracle on every kind of decision - dental surgery, marital options, taxation, or war - and they would drop 6-9 ox bones and average the results, thinking that more data would provide more accurate results. Every question to the oracle was journalized, and outcomes were constantly compared to the ox-bone forecasts. Records of these inquiries survive today, providing the oldest known risk forecasting models. Three thousand years ago, this was the first form of risk-based decision making, and while it may seem primitive to us, it was at least systemic, which is more than we can say about ERM practices today.
Enterprise Risk Management today is still a voodoo art practiced by a secret society of Risk Managers in a language few understand. It is expensive, bespoke, non-standard, and under-utilized. Market, Credit, or Operational Risk consequences are not understood by the vast majority of employees who make enterprise decisions, because none of them have access to even ox bones today, let alone risk-based forecasting models that allow decision makers to compare options, forecast outcomes, and compare results to the forecasts.
To get to that state, where ERM is a common discipline that every employee can use for enlightened decision-making, new Data Risk standards are needed, to make ERM simpler, cheaper, and more systemically repeatable, and that is another contribution this Council can make. We will next meet on June 26th at the Federal Reserve in Washington, DC to explore that opportunity in depth.
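To make the contrast concrete, here is a toy sketch of what risk-based option comparison could look like in the hands of an ordinary decision maker. The options, probabilities, and dollar figures are invented for illustration; this is not an ERM standard or any particular tool:

```python
# Toy risk-based option comparison (invented figures, illustrative only).
# Expected value = expected gain minus (probability of loss * loss given event).
options = {
    "launch_now":   {"p_loss": 0.30, "loss": 1_000_000, "gain": 500_000},
    "launch_later": {"p_loss": 0.10, "loss": 1_000_000, "gain": 350_000},
}

def expected_value(opt):
    """Risk-adjusted value of an option: gain discounted by expected loss."""
    return opt["gain"] - opt["p_loss"] * opt["loss"]

# The option with the higher risk-adjusted value wins, even though its
# headline gain is smaller - the point of risk-based decision making.
best = max(options, key=lambda k: expected_value(options[k]))
print(best, expected_value(options[best]))
```

Recording each forecast alongside its eventual outcome, as the oracle-bone journals did, is what would turn this from a one-off calculation into a learning model.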
What was evident at this meeting is that Data Governance challenges have changed in three years. We are still at the cusp of changes in the way modern, post-industrial, organizations are governed. Even the most mature members of the Data Governance Council have not substantially changed the way their organizations perform decision-making. It is still top-down, barely delegated, with little or no trust extending from the top to the bottom of an organization. Many governance bodies or teams have little or no direct decision-making authority - neither funding mandates nor project veto powers. The light of information still shines brightest from the bottom-up, with those at the top getting the best view of the light and those at the bottom simply blinded by it.
We need new models of organizational governance, new data standards in ERM, and renewed investment in risk-based decision making at all enterprise levels. This remains the challenge of Data Governance in the early adopter market evolution.
In my notes from our last Data Governance Council Meeting, I commented on the continuing confusion in many organizations on where Governance below the board fits and how it works. I continue to hear questions about "Business" and "IT", how they should get along and contribute to Data Governance.
In my own presentations on Data Governance, I often talk about the "Six Questions Every Company Should Ask about Data Governance" and #1 is always
"Do we have a Government?"
And, I often go on to show a top-down hierarchical Governance model and talk about the need for functional and operational charters to define governing powers. But, to be self-critical, I haven't helped define the questions that are central to governance; i.e., where does it fit and how does it integrate within the organization.
A few months ago a colleague in Germany sent me an interesting white paper on Holacracy. It was long and boring-looking, so I let it sit on my desk for three months before reading it seriously. The paper, and the methodology it describes, are the work of a small company called Ternary Software, and Holacracy can best be defined as a more formalized method of cross-matrix communication and decision-making.
There are many aspects of this model that are, in my opinion, hopelessly idealistic in large organizations. But there are also some extremely useful ideas that I do think can help large and small organizations better integrate Data, IT, and even SOA Governance in the context of the culture and expectations of "The Business." There are three ideas in this model that I think are especially important to Data Governance:
- Double Linking
- Integrative Decision-Making
- Dynamic Steering
A few excerpts from one of the papers to whet your appetite:
"Double-linking - Circles are not fully autonomous; each circle is linked to the circles above and below it by at least two people who participate in the decision-making of both circles. One of these two people (typically the manager of the business unit or department, called the “lead link”) is appointed from above. The lead link is responsible for representing the needs of the higher-level circle, and is accountable for the lower-level circle’s results. The other person, called the “representative link,” represents the needs of the lower-level circle at the higher-circle meetings.
In this way, Holacracy achieves bi-directional process control by ensuring that each circle’s decision-making process takes into consideration the needs of its linked (higher-level and lower-level) circles."
"Integrative decision-making - This principle begins with the premise that all perspectives hold some intrinsic value, some piece of the truth. Or as integral philosopher Ken Wilber puts it, “No one is smart enough to be 100 percent wrong.”
Holacracy’s goal is to integrate perspectives as quickly and effectively as possible. Integrative decision-making is designed to prevent people or teams from overriding an important perspective simply because they don’t see its value. Robertson offers the example of when he nearly crashed his airplane because he ignored the information on the low-voltage indicator. “All the other gauges looked fine to me, so I thought it wasn’t important,” he said. He was nearly dead wrong.
This pattern shows up all the time in modern organizations. With everything moving so quickly, it's easy to neglect the perspectives of others. Yet, like the voltage indicator, one of those perspectives might hold the key to a more complete picture of reality - one that could lead to a more effective and powerful decision, perhaps a crash-preventing decision."
"Dynamic steering - The idea that any decision can be revisited at any time is central to dynamic steering, a practice designed to help organizations learn and adapt quickly in a complex, rapidly changing environment. Essentially, dynamic steering is an agile way of guiding the organization toward its goals by holding an aim in mind, staying attuned to emerging reality and making frequent course corrections along the way."
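To make the double-linking idea concrete, here is a minimal sketch of my own (the names `Circle`, `lead_link`, and `representative_link` are just illustrative, not anything from Holacracy's formal specification) showing how the structure works: every sub-circle shares exactly two members with its parent, one appointed from above and one chosen from below, so neither circle can deliberate without a voice from the other.

```python
class Circle:
    """An organizational circle in a double-linked hierarchy."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.members = set()
        self.sub_circles = []

    def add_sub_circle(self, name, lead_link, representative_link):
        """Create a sub-circle double-linked to this circle.

        The lead link is appointed by the parent circle and carries its
        needs downward; the representative link is chosen by the
        sub-circle and carries its needs upward. Both sit in both circles.
        """
        sub = Circle(name, parent=self)
        for person in (lead_link, representative_link):
            self.members.add(person)
            sub.members.add(person)
        self.sub_circles.append(sub)
        return sub

    def shared_members(self, other):
        """The people who participate in both circles' decision-making."""
        return self.members & other.members


# Example: a company circle double-linked to an Operations sub-circle.
company = Circle("General Company Circle")
ops = company.add_sub_circle(
    "Operations", lead_link="Pat", representative_link="Lee"
)
print(sorted(company.shared_members(ops)))  # ['Lee', 'Pat']
```

The point of the sketch is simply that the "link" is a shared member, not a reporting line: remove either linking person and one circle loses its guaranteed voice in the other's decisions.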
I had to read this paper a couple of times to get over my own skepticism. Some of Holacracy reads as new-age loony, and I can imagine some razor-elbowed Wall Street executives using this paper to wipe up dog poop on the sidewalk. But read it again, change the names of what this stuff is called, and imagine trying some of it, slowly, where you work. I think some of these ideas have the potential to turn the Governance part of DG into a more productive process.
Denmark is always at the vanguard of new trends and if you are into free music Denmark is the place to be, though you won't have to wait all that long for this trend to cross the Atlantic. In the battle to win broadband customers, TDC (formerly TeleDanmark), has signed deals with major recording companies to provide free access to MP3 files over its network. In a way, all that TDC is doing is paying the music companies a volume royalty on their music portfolios - just like radio stations do today to BMI and ASCAP, as intermediaries to the recording industry.
But of course, by selling music for broadcast royalties, TDC is not equating broadband to radio. Rather, they are shrewdly taking advantage of a reality the music industry is only now acknowledging - the price people are willing to pay for music data today is next to nil when it is infinitely redundant, of consistent quality, and immediately available.
If you can read Danish, have a look, though I doubt you will have to wait too terribly long to discover an ISP in your neighborhood making the same announcement in your native language...
The IMF put out the Global Financial Stability Report last week and it contains a very accurate and sobering description of the systemic failures involved in the Subprime Financial Crisis. It has an institutional focus, and makes some solid observations and recommendations.
The entire report is worth a read, but the Executive Summary contains most of the key points if you just want the meat of the matter:
I will summarize the findings and recommendations that have Data Governance implications:
"The events of the past six months have demonstrated the fragility of the global financial system and raised fundamental questions about the effectiveness of the response by private and public sector institutions. While events are still unfolding, the April 2008 Global Financial Stability Report (GFSR) assesses the vulnerabilities that the system is facing and offers tentative conclusions and policy lessons.
Some key themes that emerge from this analysis include:
• There was a collective failure to appreciate the extent of leverage taken on by a wide range of institutions—banks, monoline insurers, government-sponsored entities, hedge funds—and the associated risks of a disorderly unwinding.
• Private sector risk management, disclosure, financial sector supervision, and regulation all lagged behind the rapid innovation and shifts in business models, leaving scope for excessive risk-taking, weak underwriting, maturity mismatches, and asset price inflation."
What follows are a number of short- and medium-term recommendations relevant to the current episode. Several other groups and fora—such as the Financial Stability Forum, the Joint Forum, and the Basel Committee on Banking Supervision—are concurrently developing their own detailed standards and guidance, much of which is likely to address practical issues at a deeper level than the recommendations proposed below.
In the short term...
The immediate challenge is to reduce the duration and severity of the crisis. Actions that focus on reducing uncertainty and strengthening confidence in mature market financial systems should be the first priority. Some steps can be accomplished by the private sector without the need for formal regulation. Others, where the public-good nature of the problem precludes a purely private solution, will require official sector involvement.
Areas in which the private sector could usefully contribute are:
• Disclosure. Providing timely and consistent reporting of exposures and valuation methods to the public, particularly for structured credit products and other illiquid assets, will help alleviate uncertainties about regulated financial institutions’ positions.
• Overall risk management. Institutions could usefully disclose broad strategies that aim to correct the risk management failings that may have contributed to losses and liquidity difficulties. Governance structures and the integration of the management of different types of risk across the institution need to be improved. Counterparty risk management has also resurfaced as an issue to address. The progress made over the last decade will need to be re-examined, and gaps that are still present (perhaps inadequate information or risk management structures) will need to be closed.
• Consistency of treatment. Along with auditors, supervisors can encourage transparency and ensure the consistency of approach for difficult-to-value securities so that accounting and valuation discrepancies across global financial institutions are minimized. Supervisors should be able to evaluate the robustness of the models used by regulated entities to value securities. Some latitude in the strict application of fair value accounting during stressful events may need to be more formally recognized.
• More intense supervision. Supervisors will need to better assess capital adequacy related to risks that may not be covered in Pillar 1 of the Basel II framework. More attention could be paid to ensuring that banks have an appropriate risk management system (including for market and liquidity risks) and a strong internal governance structure. When supervisors are not satisfied that risk is being appropriately managed or that adequate contingency plans are in place, they should be able to insist on greater capital and liquidity buffers.
In the medium term...
More fundamental changes are needed over the medium term. Policymakers should avoid a “rush to regulate,” especially in ways that unduly stifle innovation or that could exacerbate the effects of the current credit squeeze. Moreover, the Basel II capital accord, if implemented rigorously, already provides scope for improvements in the banking area. Nonetheless, there are areas that need further scrutiny, especially as regards structured products and treatment of off-balance-sheet entities, and thus further adjustments to frameworks are needed.
The private sector could usefully move in the following directions:
• Standardization of some components of structured finance products. This could help increase market participants’ understanding of risks, facilitate the development of a secondary market with more liquidity, and help the comparability of valuation. Standardization could also facilitate the development of a clearinghouse that would mutualize counterparty risks associated with these types of over-the-counter products.
• Transparency at origination and subsequently. Investors will be better able to assess the risk of securitized products if they receive more timely, comprehensible, and adequate information about the underlying assets and the sensitivity of valuation to various assumptions.
• Reform of rating systems. A differentiated rating scale for structured credit products was recommended in the April 2006 GFSR. Also, additional information on the vulnerability of structured credit products to downgrades would need to accompany the new scale for it to be meaningful. This step may require a reassessment of the regulatory and supervisory treatment of rated securities.
• Transparency and disclosure. Originators should disclose to their investors relevant aggregate information on key risks in off-balance-sheet entities on a timely and regular basis. These should include the reliance by institutions on credit risk mitigation instruments such as insurance, and the degree to which the risks reside with the sponsor, particularly in cases of distress. More generally, convergence of disclosure practices (e.g., timing and content) internationally should be considered by standard setters and regulators.
• Tighten oversight of mortgage originators. In the United States, broadening 2006 and 2007 bank guidance notes on good lending practices to cover nonbank mortgage originators should be considered. The efficiency of coordination across banking regulators would also be enhanced if the fragmentation across the various regulatory bodies were addressed. Consideration could be given to devising mechanisms that would leave originators with a financial stake in the loans they originate."
New standards and banking practices will clearly be needed going forward. But we already have most of the regulations we need to mitigate most of the risks identified in the report. Indeed, one of the great ironies of the crisis is how little banks used their own fraud and risk management systems to catch underwriting errors and omissions in loan origination applications, house assessments, risk capitalization, and so on.
I suspect that the IMF's warning on regulation will not be heeded in Washington, though I do hope regulators will listen to the seasoned advice of some Data Governance veterans, because this is a crisis with so many Data Governance challenges.