On September 18, 2007, the US Federal Reserve cut the federal funds rate by half a percentage point in response to the looming sub-prime loan crisis. The markets had lost confidence, and banks were holding debt they could not sell. Write-offs ensued, and the market forecast looked questionable at best.
At the time, this rate cut was seen as a dramatic response to worsening market conditions and proof that the Fed would act aggressively to protect the economy from the housing bubble. Over the next two months, the Fed intervened again, cutting rates 0.25% in October and another 0.25% in December. Each rate cut was seen as a prudent response to market conditions.
In January 2008, just a few weeks after the last rate cut, the Fed had to intervene again with sudden cuts totaling 1.25% to stem an Asian-driven equity market sell-off following more sub-prime write-offs and loss disclosures. In just five months, the Federal Reserve had intervened five times with combined interest rate cuts of 2.25%, following 17 quarter-point rate increases over the preceding two years.
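A quick tally confirms these figures; note that January's 1.25% came in two moves, the emergency inter-meeting cut of 0.75% on January 22 and the scheduled 0.50% cut on January 30, 2008:

```python
# Tally of the 2007-08 easing cycle described above, in percentage points.
cuts = {
    "Sep 18, 2007": 0.50,   # the first half-point cut
    "Oct 31, 2007": 0.25,
    "Dec 11, 2007": 0.25,
    "Jan 22, 2008": 0.75,   # emergency inter-meeting cut
    "Jan 30, 2008": 0.50,   # scheduled follow-up cut
}
total_cut = sum(cuts.values())
print(len(cuts), "interventions totaling", total_cut)  # 5 interventions totaling 2.25

# The preceding tightening: 17 quarter-point hikes from a 1% starting rate.
print(1.00 + 17 * 0.25)  # 5.25
```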
This was an incredible see-saw of macro-economic policy - gradual rate increases were followed immediately by sudden rate cuts. In hindsight, the half-point cut in September 2007 was not very dramatic in comparison to the 1.75% in cuts that followed over the next four months. No one then could have foreseen the volatility in the markets that was to come. Or could they have?
Why was US Federal Reserve rate policy reactive to market volatility? Why didn't its monetary policy, which had run up rates from 1% in June 2004 to 5.25% in June 2006, anticipate the looming housing bubble and the bank losses that would surely ensue? Hadn't Alan Greenspan warned of this outcome in 2005? Didn't we all know the housing joyride would end at some point?
Today, we can see banking and financial market data that shows the risk trends in our rear-view mirror. Unfortunately, no one has a mirror that forecasts the future - but they could, if capitalized risk data were collected on a systematic basis by banks and shared with the Federal Reserve. The Federal Reserve does an excellent job of studying catastrophic risks and running sophisticated macroeconomic loss models on everything from terrorist attacks to coastal hurricanes. The Fed uses this catastrophic loss data to capitalize insurance loss reserves for the US economy - i.e., they print more money when very bad things happen.
The insurance reserves got tapped after 9/11 and Hurricane Katrina, when the Fed injected huge amounts of liquidity into the economy to stabilize markets and restore confidence. Of course, the timing of catastrophic events can't be forecast, but the monetary response can be estimated based on a variety of risk factors. The Fed constantly analyzes and war-games these risk factors, and the success of its liquidity and monetary responses to 9/11 and Katrina attests to the diligence of its planning and the value of risk-based forecasting models.
What does this have to do with the sub-prime loan meltdown, you ask? Well, if the Fed had non-catastrophic risk-data forecasting models, it could possibly pre-empt loss events with macroeconomic policy tools and even out some of the worst aspects of the business cycle. Unfortunately, that kind of non-catastrophic risk data has to come from banks, which until recently were incapable of providing it, let alone using it themselves for their own risk-based policy-making.
That's changing. In the last two years, banks around the world have been working to assess and capitalize market, credit, and operational risks as part of the Basel II compliance process. That data isn't normalized across banks, and there are wide disparities in how risks are assessed, calculated, and capitalized from bank to bank and country to country. But the raw data, and the beginnings of the know-how, are there for the first time in history. And that data and know-how can be leveraged to provide new macroeconomic tools for central bank policymakers around the world.
What's needed are standards in risk assessment, classification, and calculation, and the reporting of capitalized risk data from US banks to the Federal Reserve. This may take some years yet to accomplish, but the time is right to begin discussing these issues. As US banks reach Basel II compliance, they will be in a position to leverage risk data for their own self-insurance against non-catastrophic losses, and if they were willing to share their capitalized risk data, they could help the Federal Reserve reduce market volatility and improve macroeconomic performance for everyone.
Here's a case where regulatory compliance really can improve business performance.
Adler on Data Governance
DataGovernor 120000GKJR 1,709 Views
I gave a speech at TeleStrategies' ISS World 2007 Conference (http://www.telestrategies.com/ISS_WASH/index.htm) in Alexandria, VA yesterday. The conference topic was Data Fusion - the use of data-mining technologies for law enforcement and anti-terrorism. I spoke at a similar conference two years ago and was looking forward to meeting with that group again, but this crowd was far different, and the industry has matured rapidly. Two years ago, I faced a room of 800 regional law enforcement officials from 48 data fusion centers in the US, and I was on the agenda presenting Data Privacy. It was awkward at first, but after a few minutes we had fun together talking about the push and pull of government data mining and the protection of privacy and civil liberties. It was a group of concerned citizens trying to harness new technologies to make law enforcement more efficient, but each also had individual concerns about how their work might endanger US privacy rights. So we found common ground, and both this presenter and the audience learned something from the exchange.
This year, I faced a small conference room with 45 people from military contractors, DIA, CIA, DHS, and a bunch of Israelis who were pretty reticent about what they did for a living. My topic was Data Governance, and aside from some technical questions from a guy working for Raytheon, no one else in the room seemed terribly interested.
In the expo hall, I discovered why. Congress and Washington's privacy elite might think that debating warrantless wiretaps and FISA Court obfuscations is vital to preserving data privacy, but what I saw in the expo room in Alexandria persuaded me that the discussion is public posturing at best, or a charade for the ignorant at worst. The privacy cat is out of the bag, and the data fusion industry has found many market-oriented, privatized, and convenient workarounds to do what it thinks needs getting done, with very little judicial, congressional, or constitutional oversight.
Case in point: I met a company called Spectronic. They make a communications-interception product that uses the cell-phone triangulation technology first developed by carriers for mobile 911 service. They sell it to public and private security services for communications monitoring during events. I can appreciate the utility of this technology.
The impetus for developing it is that working with courts and telecom carriers is a tad slow and inconvenient where criminals and terrorists are concerned. So instead of relying on our beloved privacy-preserving telecom carriers to provide triangulation and tracking of suspect cell phones at an event, law enforcement can avoid that unpleasant process entirely. They can just purchase a few of these oven-shaped boxes from Spectronic, deploy them along the perimeter of any event, and instantly watch everyone's cell phone voice, data, email, location, etc., dragnet style.
Very efficient, and this was just one booth in a hall full of spy toys and spies...
Privacy professionals, take note. The rapacious marketplace, burgeoning Homeland Security budgets, and the privatization of government are making our efforts vainglorious at best.
As I left the Spectronic booth the very nice sales rep shrugged her shoulders and told me "it's the world we live in..."
I replied, "No, it's the world we create."
Recently a colleague sent me a very interesting article written by Wim Van Grembergen of the IT Governance Institute, entitled "The Balanced Scorecard and IT Governance." You can find this paper in PDF format here: http://studies.hec.fr/object/SEC/file/A/WPNAQMCMKBJNBLEYHCGTKWNNXVBNNMFG/balscorecard&IT%20governance.pdf
I recommend reading it because it provides a high-level introduction to the topic of Governance from a very IT-centric perspective. What follows is my own critique of the paper's thesis and why I think it points IT in the wrong direction.
Overall, I think this is an interesting paper with some inaccurate interpretations. It is more of a Balanced Survey than a Balanced Scorecard, and it assumes a very hierarchical, industrial organizational structure in Governance that I think is at odds with the way organizations really function in the post-industrial Information Economy.
In the industrial model, the flashlight of corporate information shines up. Those closest to the bulb are blinded and only those at the top can see the light at its widest aperture. That is a model for control, not innovation. Control was incredibly important in the industrial age because production cycles were long and stability of resource and labor supply was critical. That was an age of Monopoly and Economy of Scale.
Today, we live in an age of intensifying global competition, shrinking product lifecycles, very low barriers to market entry, and enormous complexity in our financial, consumer, and internal markets. Governance has emerged as an organizing force below the Boardroom because every employee, customer, business partner, and associate is a potential source of innovation, and innovation thrives when the light of information is spread evenly across an organization and everyone can appreciate its radiance.
Net: We are experiencing the beginnings of great changes in the modern corporation, and Governance (Data, IT, SOA, whatever) below the board is an early manifestation of this emerging trend. It is good that others are recognizing Governance as a legitimate management discipline and writing about it, but the approach described in Van Grembergen's paper is rooted in industrial models of organization that are already giving way as corporations and nations adjust to the Information Age that is upon us.
In April 2007, the International Monetary Fund revised its Code of Good Practices on Fiscal Transparency, a set of recommended practices for governments around the world. The four pillars of the Code are:
- clarity of roles and responsibilities,
- open budget processes,
- public availability of information, and
- assurances of integrity and data quality.
While the Code was written for governments, Fiscal Transparency has many market benefits to businesses too. Companies building Data Governance Boards might want to review this Code of Practice as guidance for constructing functional and operational principles in their Charter. It shouldn't be lifted literally, but there are many good ideas here that can be applied.
Everywhere I go, I meet companies building Data Governance Boards, assessing their situations, creating strategies, and looking for solutions. Data Governance today is the most prevalent form of Governance below the Board of Directors, and the only form that often brings together IT and Business leaders in one continuous dialog. Three years ago, when we first adopted the title "Data Governance" to describe the pending convergence of Risk Management, Security, Data Quality, ILM, and Compliance, no one understood what the two words put together meant. I often heard "Data What?" in response to my presentations at conferences. Today, however, there is no question that the conjunction of "Data" and "Governance" defines an exciting new marketplace of common challenges, new ideas, and exciting solution opportunities.
But the future of Data Governance depends more on the vitality of the political institutions now being formed than on technology. Data is the easy part; it is governing that is hard. In the coming years, many solution architects and IT consultants will focus on quick data solutions, and most Governing Boards will flounder without technology support that creates an institutional framework for Governance.
We in the IT industry will focus 80% of our solution sales on tools to govern data, but it is data to help humans govern that is the far more pressing need. In every industry (especially the public sector) we need better Governance solutions that help human beings analyze massive amounts of operational information, assess the quality and value of information, and use Risk Calculation to forecast options and make decisions.
What are needed are new solutions to help organizations, large and small, govern more efficiently. Our businesses and our governments need these tools, and technology has an important role to play in helping organizations transform from industrial to information models of production and value creation.
Those solutions should help democratize the governance process, create new levels of organizational transparency, help forecast and model potential outcomes, capture and communicate key policy decisions and compare them to results.
This week, I attended the Global Forum 2007 in Venice, Italy. I am a member of the Global Forum Steering Committee, and through the Data Governance Council, IBM was a sponsor of this year's event. The meeting took place on a private island off the Grand Canal in Venice. The island had been home to a monastery, which is now used by the Giorgio Cini Institute for Music and Art. The meeting rooms were spectacular, with scores of Veronese paintings adorning the walls, columned cloisters, and magnificent paneled rooms. Participants included the Mayor of Venice, the former Prime Minister of France, the Chancellor of Geneva, commissioners from the FTC and FCC, presidents of universities, and 5 members of the IBM Data Governance Council...
This was my third Global Forum event. I chaired a panel on Data Governance, and gave a brief presentation on global competition, innovation, and governance. Both were extremely well received, with many commenting that our panel on Data Governance was the most substantive and interesting of the conference. I owe a special thanks to Ed Keck, Richard Livesly, Cengiz Barlas, Paul Welti, and Jacques Bus for their fantastic presentations. On the first night of the event we had a private chamber concert at the Venice Opera House, a beautiful gilded hall. Following the concert (Haydn and Tchaikovsky), there was a dinner in a private dining room with a pianist. It was lovely and inspiring. I dined with the very charming CIO of San Francisco and the Deputy Mayor of Paris.
At the end of each year's Global Forum, I am reminded of how complex and difficult such a conference is to organize. It is not just the fantastic venues or the beautiful entertainment and dinners. Most importantly, it is the arrangement of all the various interests and specialties that such a global network brings together in one spot for two days. This is no small feat, and I have learned from the event's very special host, Sylvianne Toporkoff, that networking is an art and she is its absolute master.
This year's Global Forum was a triumph. It reached across the international divide and brought together more leaders from Asia and the US, business and government, than any past event I have attended. But what I also saw this year was more transatlantic tension, misunderstanding, and competition than before. For this German/Danish-American, those are troubling trends indeed. They simmer below the surface and come out in subtle phrases and indirect cuts. But they are there, and they threaten many things we all believe in. Next year I hope the Global Forum will take these issues on thematically, because it is only through direct discussion that substantive understanding can be reached. The truth is, we need more global forums, as the world is growing ever more competitive.
On November 2nd, I attended a law school advisory board meeting in Koblenz. The chair of our board is a senior executive of UBS Germany, and another board member is the head of M&A for Deutsche Bank. While the topic of international banking was quite far from our agenda, we quickly spilled over into discussions about risk, globalization, loss, and reputation. The sub-prime credit squeeze is affecting financial institutions worldwide, and what I discovered is that frustration in Europe on this topic runs high, as many feel hostage to economic policy made far from these shores. Economic interdependence is not new. Every day, stock markets rise and fall based on market sentiment in New York, Hong Kong, Frankfurt, London, and Shanghai. What makes this crisis different is that the debt purchased was well rated by US rating agencies and eagerly acquired by banks across the globe. Today, those debt instruments have few buyers, creating hoards of worthless debt that is being written off balance sheets worldwide, resulting in one of the first global financial crises of the 21st Century.
The recent Fed rate cut seemed to underscore the limits of unilateral action in a global economy. As capital moves with ease across borders, so too, in my opinion, will economic policy need to demonstrate more fluid international action. Of course, central banks already have some of these relationships for crisis management, but economic policy has a fiscal dimension, and G8 summits are too irregular for ongoing stewardship. Consistent economic growth on a global level may require new international governance structures combining central banks and governments, so that all levers of policy can be adjusted to better balance money supply, regulatory controls, taxes, and subsidies without compromising national competitiveness.
Such ideas may threaten isolationist neocons, and may take more crises to instigate, but we can't say we have a global economy if our policy tools are mostly national. Financial disruptions like those witnessed in the past months will continue, just as national liquidity and credit crises caused runs and convulsions at the dawn of the 20th century. Those crises resulted in the formation of the US Federal Reserve Bank, a central governance mechanism to control the money supply as a means of stewarding national economic stability.
We are just beginning to understand the role of Governance in world affairs.
I flew to Las Vegas on Sunday evening. It had been a few weeks since I had last boarded an airplane, and I was excited to go anywhere, even Las Vegas. The flight out was great, but the airport was like Shinjuku Station at 7am when I landed at 11pm. One-legged trolls must be hand-carrying the bags from the planes to baggage claim, because it took a nervous and smoky hour to reclaim my luggage. Welcome to Las Vegas.
I was there to participate in the IBM Information on Demand Conference. This event was so good I completely forgot how much I despise that sunny city. I really don't know why anyone needs to have a gambling mecca in a city with nice weather because the hotels make it practically impossible to even see the sun on most days.
That aside, IOD was terrific. The sessions were excellent. I spoke in so many different sessions on Data Governance that I barely had time to hear other speakers. My sessions were packed with passionate participants. I took home fistfuls of business cards and was deeply impressed with the knowledge and interest of my audiences. There were some, sure, who weren't doing Data Governance yet, but they were the minority, and the questions and interest demonstrated to me that Data Governance is now a market that is aware of itself. Things are happening independent of IBM. The demand is there across industries. It is ours to lose; customers and business partners are keenly interested in joining the Data Governance Council and learning about the Maturity Model.
For me, the event was a vindication of all the hard work the Council has put in over the past three years. My congratulations to the entire IBM IOD team who put on this excellent event.
I'm not normally a reader of USA Today. The colors and quick articles always feel superficial and empty to me. But I was caught in a hotel restaurant this evening, on the last night of a long road trip, and the only intellectual distraction was the pastel paper left in my hotel room. Some normal blather on the front page, and most of the paper was forgettable. But on the last page there was an opinion piece by Alan Webber which perfectly described my own fears and experiences traveling abroad, looking back at America through the eyes of our long-despairing friends in Europe. I commend this article to anyone who cares deeply about how far we have drifted from what we once were:
On September 25th, I hosted an ISACA e-Symposium on Data Governance. ISACA reports there were close to 3,000 people registered for the webinar. It was conducted live over IP and VoIP. I gave an introductory presentation on Data Governance, and we had excellent presentations from Bank of Montreal, Key Bank, and Discover Financial.
This call represents the first time the Data Governance Maturity Model has been shared with such a large audience. Participant feedback was excellent. We had many great questions on the line, and hundreds more were sent in via email.
Enclosed are some of those questions and my answers (names withheld to protect privacy):
Question: "In your opinion which one can be more cost effective and considered the best approach? A. Invest in the development of a Data Governance Programme as a separate entity. OR B. Leverage existing asset management processes, such as ITIL and ISO27001, to accommodate Data Governance?"
Answer: Data Governance is more than asset management, and one of the key problems we've been trying to solve is more political than technical - it's how to get many different people from different disciplines to work together and solve complex problems. Some of these problems involve managing data as an asset, but some also involve managing the risks to those assets, being rigorous about policy definition, developing stewardship programs, storage and discovery rules, metadata, and compliance.
I would always tell my customers that Data Governance is a new way of thinking about old problems and strategically should be integrated alongside existing models, like ITIL, that already work and are understood. But it isn't one or the other. It's both.
Question: "How do you impress top management with the importance of Data Governance?"
Answer: Data Governance is as much a political challenge as it is technical. Two things are going to get your program off the ground fast - acknowledged deficiencies across the organization that management can easily understand and quantify, or a determined sales program on your part to sell Data Governance as a solution that can solve many individual problems. In either case, start with executive interviews. The Data Governance Council Maturity Model can provide a framework for the interviews by segmenting issues and suggesting natural assessment questions. Whether you use the Maturity Model or develop your own questions, you have to do the legwork to discover their needs, classify the opportunities, and develop an internal sales strategy to rally your support.
Question: "data or information is intangible. Is there any specific model or method to quantify the value of data and information?"
Answer: The value of anything is determined by its price. There are many different ways of setting price. Markets set prices for stocks, bonds, and other financial instruments. Manufacturers set prices for cars, refrigerators, etc. Countries set prices on taxes, trade duties, and government services. IT can set prices on data, which is not really intangible. You pay for it all the time: when you buy a newspaper, rent a DVD, or purchase software; why, even your monthly broadband bill is a contract for data services. What we have yet to accomplish in internal IT is more specific mechanisms to establish the value of data based on its utility, its demand, and ultimately the price end-users will pay for it.
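To make that last point concrete, here is a deliberately toy sketch of pricing an internal dataset on utility and demand; the function, weights, and figures are all invented for illustration, not an established method:

```python
# Hypothetical sketch: price an internal dataset from demand (query volume),
# breadth of use (distinct consumers), and a 0-1 quality multiplier.
# All names, weights, and figures below are invented for illustration.

def price_dataset(monthly_queries: int, distinct_users: int,
                  quality_score: float, base_rate: float = 0.001) -> float:
    """Return a monthly internal charge for a dataset."""
    demand = monthly_queries * base_rate       # raw demand signal
    breadth = 1 + distinct_users / 100         # more consumers -> higher value
    return round(demand * breadth * quality_score, 2)

# A well-used, high-quality customer table vs. a stale reference file.
print(price_dataset(50_000, 40, 0.9))   # 63.0
print(price_dataset(2_000, 3, 0.4))     # 0.82
```

The point of the sketch is only that utility, demand, and quality can be combined into a price signal; any real scheme would calibrate those weights against what internal consumers actually pay.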
Question: "You mentioned that 10 companies have adopted the Maturity Model in some form. Can you identify any of them and speak to the results they've experienced?"
Answer: No, I can't. Those companies will identify themselves when they are ready to go public. But I do hope that the Maturity Model, through venues such as ISACA, can inspire a broader public discussion about Data Governance and successful implementations.
Question: "Love the notion of allocating costs for Content Level Agreements & Alternative Risk Transfer Agreements! Why is this seen as a separate focus rather than being totally integrated with IT Governance (e.g., it may be seen as extending some CobiT control objectives)?"
Answer: As I said in my presentation, IT today is run like a Command Economy, with projects centrally funded and managed and no real economic tools to modify user behavior regarding perceived value of IT or need to mitigate risk. Internal funding agreements like Content Level and Alternative Risk Transfer are new economic policy alternatives that business can use to price and sell data internally based on the business demand for quality, availability, integrity, and security, as well as "tax" business units for the losses they create. I hope businesses will begin to leverage economic tools like these to turn the IT department into a P&L Center, and represent the aggregate internal IT rates of return in the financial balance sheet.
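As a hedged illustration of the "tax" idea above, here is a toy chargeback calculation; the business units, loss events, and 10% rate are all invented for illustration:

```python
# Hypothetical sketch of an internal "loss tax": business units are charged
# back for the data-quality losses they generate. Units, loss figures, and
# the tax rate are invented for illustration.

loss_events = [
    ("marketing", 12_000),   # cost of a bad-address mailing
    ("sales",      3_500),   # duplicate-record rework
    ("marketing",  8_000),   # mispriced campaign from stale data
]
TAX_RATE = 0.10  # 10% surcharge on losses caused, funding remediation

charges = {}
for unit, loss in loss_events:
    charges[unit] = charges.get(unit, 0) + loss * TAX_RATE

print(charges)  # {'marketing': 2000.0, 'sales': 350.0}
```

Charges like these are what would let an IT department report internal rates of return the way a P&L center does.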
Question: It is considered best practice to hold end users or local managers responsible for data accuracy - is data governance an attempt to centralise this concept?
Answer: I think Data Governance is ultimately successful when it pushes organizational responsibility and policy obligations out from the center to every employee. Look, we buy gas, pizza, clothing, and other consumer goods every day and we don't need to consult with Congress or carry law books with us everywhere we go to conduct those transactions in a lawful way. We have, as a society, learned to conduct business in lawful ways that are for the most part free of vice, corruption, and crime. We call this civilization, the rule of law, etc, and these are examples of self-governance. We need to ultimately achieve the same degree of self-governance in our own organizations, employees who all understand their obligations to govern their use of data appropriately.
Are we there yet today? Not in all cases, and we still need central institutions to create policy and push compliance out to the organization. But this is the goal we should strive for - delegated responsibility and accountability.
Question: "have you defined "standard" quantitative measures to assess data governance maturity or data quality?"
Answer: The Data Governance Maturity Model does define five levels of DG maturity, and insofar as those levels can be seen as quantitative, the answer is yes. In the real world, it's not so simple. Maturity is relative to peers in an industry, and what is considered a mature state today at, say, level 2 might tomorrow be considered immature. Ultimately, it is for every company to determine on its own what the levels mean and what goals it needs to set to achieve maturity. We've discussed this many times in the Data Governance Council, especially on the topics of what is mature and how many categories in the Model everyone should use. In the end, we decided we should let the market decide, and the best thing we can do is collect implementation examples and share them with other practitioners, allowing everyone to pick and choose the categories and levels that best meet their needs and culture.
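Since the Model's contents aren't public, here is a purely hypothetical sketch of what level-based scoring against a peer benchmark can look like; the category names and scores are invented and are not the Council's:

```python
# Hypothetical sketch: score maturity per category on a 1-5 scale and
# compare against a peer benchmark. Categories and scores are invented
# for illustration; the Council's actual model is not reproduced here.

our_scores  = {"stewardship": 2, "data_quality": 3, "risk": 1, "policy": 2}
peer_median = {"stewardship": 3, "data_quality": 3, "risk": 2, "policy": 2}

# Gaps: categories where we trail the peer median, and by how many levels.
gaps = {c: peer_median[c] - s for c, s in our_scores.items()
        if peer_median[c] > s}
print(gaps)  # {'stewardship': 1, 'risk': 1}
```

Even a crude comparison like this makes the "maturity is relative to peers" point operational: the output is not a grade but a gap to close.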
Question: "Not really a question... It would seem that the processes that transform data into information (or information into organizational knowledge) must also fall under the control of general data governance, since it is possible to take perfectly sound data and transform it into bad information."
Answer: I get this question often. Many people think that data is in a database, and when human beings use it, it becomes Information. I personally think this applies industrial assembly-line metaphors of production to information. In some ways it is a rather vain metaphor, because we humans like to think we improve on data when we transform it in our brains into information; we are better, after all, than mere machines. But of course, humans also degrade information on a regular basis when we use it. So data can also become pollution when put into human production. We have enormous stockpiles of data pollution throughout the internet. :-)
In the end, I don't think these distinctions add much value to the challenge of Governance.
Data = Information. These are synonymous terms from a policy perspective, because ultimately the data/information has to be stored someplace. And the policies we write are intended to govern how human beings, and computers as their tools, control that data/information wherever it is stored - in a database, on a web page, in a spreadsheet, in a video, on a printed page, or retained in a person's memory. Policy should apply, and stewards should enforce it, regardless of storage medium. What we should be more concerned about is metadata to describe more distinct attributes of data/information, like its quality, integrity, reliability, business uses, and past modifications. With these tools we can better apply policy to data wherever it resides, however it has been improved or degraded by humans and machines.
Question: " You have indicated that there are two avenues to pursue to obtain compliance, reward vs punishment. Which process have you found most effective or a combination of both for global enterprise?"
Answer: I don't think I called them reward vs. punishment. I think I said that a governing power has a few fundamental policy instruments: to make things cheaper, legal, or easier to do, or to make them more expensive, illegal, and painful to do. Both levers have pros and cons, and both have different effects on human behavior in different circumstances. I don't advocate one over the other. Human beings have to choose their policy tools and how each best fits their policy goals. Like our own Congress, trial, error, and evolutionary improvement are still the only model we can deploy to guide policy. In the future, however, I do hope we can develop better technology tools to help policy makers analyze different variables, model potential outcomes, determine the best policy mechanisms for each challenge, and measure results against forecasts.
Question: "Is data classification across the organization a key element for Data Governance?"
Answer: Emphatically YES! Most Data Classification is a blunt Security-based tool. We call data Top Secret, Secret, Classified, Public, etc, never indicating much about its business uses, quality, integrity, storage location, etc. We need business glossaries to understand business definitions and we need to link these to technical metadata to enable policymakers to search for policies, data assets, and exposures across our enterprise like we today search for news, ideas, and communication on internet search engines. We need a broader view of metadata and Data Classification and while business people may never fully understand this area of IT, we need to develop better tools to enable them to use it without having to understand it.
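As a small, hypothetical sketch of the linkage described above (all terms, stores, and attributes are invented), a business glossary entry can point at technical metadata so that a policy search surfaces exposures:

```python
# Hypothetical sketch: link a business glossary term to technical metadata
# so a policymaker can search by business meaning. All terms, stores, and
# attributes below are invented for illustration.

business_glossary = {
    "customer_ssn": {
        "definition": "US Social Security Number of a customer",
        "sensitivity": "restricted",   # business-level classification
    },
}

technical_metadata = {
    "customer_ssn": [
        {"store": "CRM_DB.customers.ssn",  "quality": 0.98, "encrypted": True},
        {"store": "exports/custlist.xlsx", "quality": 0.70, "encrypted": False},
    ],
}

def find_exposures(term: str) -> list:
    """Return physical locations of a restricted term that lack encryption."""
    if business_glossary[term]["sensitivity"] != "restricted":
        return []
    return [m["store"] for m in technical_metadata[term]
            if not m["encrypted"]]

print(find_exposures("customer_ssn"))  # ['exports/custlist.xlsx']
```

The search-engine analogy in the answer is exactly this lookup, generalized: business meaning on one side, physical stores and their attributes on the other.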
Question: "You cited USA laws and regulations. What about leveraging on different areas (Europe, Asia) where you have different ones for multinational public companies ? Besides what about financial risks, like different currencies and related fluctuation in outsourcing, offshore, etc.?"
Answer: You are right, all these are equally important issues. We are probably just mid-way through an IT regulatory cycle that began in earnest with the EU Data Protection Directive of 1995 and the HIPAA Act of 1996. SOX, Basel II, PCI, SB1386, and so many more regulations are changing the nature of IT development and deployment. Just as at the dawn of the 20th Century, when governments around the world passed industrial regulations, so too today are our countries grappling with the best way to regulate the impact of IT on our societies. I do wish that countries would make technology a cabinet-level policy position, because we need better IT advice in public policy-making.
Question: "What measures are put in place to encourage data governance and privacy law compliance in Africa?"
Answer: Good question. I don't know. But I will look into that and write about it in a later blog post.
Question: "Could you talk more about selling this approach to clients? What method do you use to persuade them not only to the general concepts, but also to really invest in going down this path?"
Answer: Most of the clients we deal with are already sold on the need for Data Governance. Three years ago, when we started the Data Governance Council, the number of believers was very small. That's why we organized the Data Governance Council - to gather together the innovators and early adopters and build a community that could learn from each other and synthesize that knowledge into methods the broader marketplace could adopt. The Data Governance Maturity Model is the product of this process, and I would encourage every company interested in Data Governance to explore its potential. While I can't publish its contents here, I will tell you that it is extremely detailed - 11 categories, with many sub-categories, all with 5 levels of maturity. It is an excellent tool to model a Data Governance program and benchmark internal practices against levels of maturity created by industry peers.
Question: "Could you provide a link to the Data Governance Council?"
Question: "Any example companies that have implemented an ART approach to charging user departments for risky IT behavior and how has that gone?"
Answer: Any large bank complying with Basel II and using the Advanced method of operational risk calculation already has the methods in place to create an internal market for Alternative Risk Transfer. It could even, potentially, set up its own Self-Insured Retention to "pay" out internal losses based on the "premiums" collected from its organizational stakeholders. In reality, every company already self-insures against its own IT and operational losses. The problem is that these losses are often not recorded in a systematic way, the information is not analyzed to detect loss patterns, and few organizations have the actuarial mechanisms to leverage their loss data to forecast future exposures. But all of this is business as usual for any large E&O insurer or Basel II conforming bank.
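The mechanics described above - recording losses systematically and turning loss history into internal "premiums" - can be sketched as a toy expected-loss calculation. This is a hypothetical illustration, not any bank's actual Basel II model; the department names, loss figures, and loading factor are all invented:

```python
from dataclasses import dataclass

@dataclass
class LossRecord:
    department: str  # internal stakeholder that incurred the loss
    amount: float    # recorded loss in dollars

def annual_premiums(losses, years, loading=1.25):
    """Charge each department an internal 'premium' proportional to its
    expected annual loss (recorded losses / observation years), plus a
    loading factor to fund the self-insured retention."""
    totals = {}
    for rec in losses:
        totals[rec.department] = totals.get(rec.department, 0.0) + rec.amount
    return {dept: (total / years) * loading for dept, total in totals.items()}

# Hypothetical three-year loss history for two departments.
history = [
    LossRecord("retail", 120_000.0),
    LossRecord("retail", 60_000.0),
    LossRecord("trading", 300_000.0),
]
premiums = annual_premiums(history, years=3)
```

A real operational-risk program would fit frequency and severity distributions to the loss data rather than take a simple average, but the incentive structure is the same: departments that generate losses pay higher premiums.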
Question: "How do you justify penalizing the business for data incidents when the common perception is that the IT department is responsible for taking care of data?"
Answer: Anyone carrying a Blackberry with customer data should be responsible for taking care of data. Data doesn't just live on a green-screen connected to a massive mainframe any longer. It's mobile, it's everywhere. And every employee is creating it and exposing it to value and harm. That's why Governance is a group activity involving stakeholders from IT and Business. Everyone is responsible, and therefore you need everyone involved.
Question: "How is the model accessible? Is it possible to buy/download it somehow?"
Answer: Not yet. We'll have to look into that.
Question: "Which organizational model is best suited for Data Governance?"
Answer: In the short run, it's the one that best fits your organizational culture. In the long run, in globally integrated enterprises with employees in every timezone, working from home or on the road, I think we will need more distributed organizational models and I look forward to inventing that next.
Question: "Would COBIT be an appropriate reference to implement data governance in terms of how to?"
Answer: COBIT would be an excellent reference if that's what your company is already using. We have so many alphabet standards today that don't talk to each other. When you implement Data Governance in your company, try to bridge reference standards just as you try to bridge organizational stovepipes. Both have the same effect: they divide and separate people, when what you need is to bring people together.
Question: "With regard to using ART, how do you avoid the pitfalls of departments getting into 'fingerpointing' arguments with one another, where more resources are spent on blaming each other for the cause of the data integrity/quality issue rather than actually addressing the root cause?"
Answer: Let each department determine its own root causes for loss. What you care about is levying the financial premium for the loss. The payment itself is an incentive to fix the problem.
Question: "Have you distinguished the difference between data and information in the studies you have conducted? Data becomes information when it is synthesized or crunched in a system and then reported as information. ....Data in...Information Out... Where is the starting point of governing data and when do other IT governance models take over? When data becomes information? Thank you."
Answer: We've discussed this distinction many times in the Data Governance Council and we've always agreed that Data and Information are synonymous. The way you phrased your question, however, makes me realize that you are applying an industrial production metaphor to data/information usage. It's like raw materials entering an assembly line with finished product popping out the back.
But people and IT systems don't use data/information in this way. If you take a data element out of a database, crunch it in a spreadsheet, send it to colleagues for interpretation, and turn it into a PowerPoint deck, this "information" is still data stored in a spreadsheet cell, email, or presentation chart. It is structured or unstructured information. From an asset and liability perspective, the values may change, and therefore we may qualify the asset with new metadata, but the policies we write to govern human usage of stored data/information depend more on storage content and usage context than on whether we describe it up front as data or information.
So, my personal view is that the data vs information distinction doesn't add any value to the challenge of governance. It's what is in the container and the intent of the user that are more important to Data Governance.
Question: "With content level agreements does data confidentiality have any role with the objectives?"
Answer: Yes. If the sensitivity of data has a higher business utility then an end-user is likely to pay more for it. The extra premium for the higher sensitivity would pay for the additional security needed to protect the data in the agreement. This is how you can get end-users to pay for, and appreciate fully, the value of data and security.
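As a minimal sketch of how sensitivity could feed into the price of a content-level agreement - the tier names and multipliers below are invented purely for illustration:

```python
# Hypothetical sensitivity tiers; the surcharge on higher tiers funds
# the additional security controls the agreement requires.
SENSITIVITY_MULTIPLIER = {
    "public": 1.0,
    "internal": 1.5,
    "confidential": 2.5,
}

def data_premium(base_price: float, sensitivity: str) -> float:
    """End-users pay more for more sensitive data, making the cost of
    protection visible in the price of the agreement."""
    return base_price * SENSITIVITY_MULTIPLIER[sensitivity]

# A confidential data feed priced off a $100 base rate.
price = data_premium(100.0, "confidential")
```

The point of the sketch is the incentive design: when the premium is proportional to sensitivity, the end-user who demands sensitive data is also the one funding its protection.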
Question: "In point 2, how do we assess our situation? In benchmarking, how can governance take decisions in a fluctuating legal environment, since an organization is affected by the global regulatory environment?"
Answer: Assessments are a snapshot of your organization in time. Don't let the snapshots get old and faded. Make self-assessment a normal part of every new business process, and re-assess yourself on macro topics on a regular basis. In this way, you can stay on top of ever changing global business regulations and requirements.
Question: "Do you see Data Governance as a process that creates a burden on existing resources, or an investment in the future? This may sound like a silly question, but a lot of organisations are reluctant to change and see Data Governance as an additional cost on people's time."
Answer: Every organization is already Governing Data:
A. They don't know it, or
B. They are doing it badly.
The burden is already there, uncounted. Count up how much it's costing not to know how to Govern Data effectively, and you will make your business case for change. Bring the change on slowly, integrate it into governance models already under way, and you will achieve a higher comfort factor with your changes.
Question: "Is there a checklist available for DG self assessment to identify gaps and also for implementing them?"
Answer: The Data Governance Maturity Model provides that kind of self-assessment checklist, and it is available to members of the Data Governance Council. Information on how to become a Council member is available on this website: www.ibm.com/itsolutions/datagovernance.
Question: "Sorry, but I missed the explanation of the constituents of the members of the Data Governance Council, and who's sponsoring it?"
Answer: IBM sponsors and runs the Data Governance Council, and membership information can be found on the website posted above.
Question: "Does this governance structure and process require full-time staff to implement, monitor, and measure success? If so, how many FTEs would be recommended for an organization of 10,000 employees?"
Answer: Yes. Many organizations today are investing in various Stewardship programs to provide full-time staff to implement governing policies and monitor results. These are your organizational doers, and while they can be part-time in early DG pilots to get your program off the ground, a DG program will require full-time Stewards to be effective. Start small and grow fast. Let your stewardship numbers be proportional to the value they can create. Measure that value through data quality, process efficiency, and risk mitigation metrics. Report it often.
Question: "Please explain Value Creation with reference to data governance?"
Answer: Value Creation is several things:
A. A measure of the value created through the use and enhancement of data for your business bottom line, and
B. The yardstick of performance of your Data Governance program overall.
We are Governing Data to create more value and we want to measure and report it on a frequent basis. I hear many people get caught up in complex qualitative measures of data, metadata, and value. Keep it simple. Measure productivity, labor saved, more efficient business processes, higher customer satisfaction, increases in revenue, reduced risk. These things are quantitatively measurable. If you don't know how to measure them, ask your CFO for guidance.
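The simple, quantitative approach recommended above can be expressed as a one-line value calculation; all inputs are hypothetical figures your CFO would help you estimate:

```python
def governance_value(hours_saved, hourly_rate, revenue_increase, losses_avoided):
    """Sum the quantitative benefits worth reporting: labor savings,
    revenue lift, and mitigated risk, all expressed in dollars."""
    return hours_saved * hourly_rate + revenue_increase + losses_avoided

# Hypothetical quarterly report: 1,000 labor hours saved at $50/hour,
# $20,000 in new revenue attributed to better data, $30,000 in avoided losses.
quarterly_value = governance_value(1000, 50, 20000, 30000)
```

The design choice is deliberate: every term is a dollar figure a finance team can audit, rather than a qualitative data-quality score.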
Question: "Is Data Governance in any way connected to corporate, enterprise, and IT governance?"
Answer: Yes, Data and IT Governance describe new forms of corporate governance below the board level. All these governing bodies should use similar policy processes, have the same kinds of roles and responsibilities, and have well defined agenda and reporting rules with common charters that contain similar language. What you want is a system of governance in which the people may change but the powers remain the same. If you are creating governing structures that all have different charters, roles, and structures, you are creating complexity and your governance programs will fail.
Question: "IT Governance vs. Data Governance ... do you consider them the same thing?"
Answer: Same answer as per above. I don't consider them the same thing, but I do consider them different parts of a similar problem. IT is no longer a back-office function with no front-office dependencies. In many companies, IT is the front display window, the main method for interacting with customers, the brand a customer sees when they first contact the organization. Governing the human use of IT assets has become a central challenge in many organizations, and IT and Data Governance are different approaches to common challenges.
I would always tell my customers that Data Governance is a new way of thinking about old problems and strategically should be integrated alongside existing models, like ITIL, that already work and are understood. But it isn't one or the other. It's both.
Question: "Can you recommend tools that may be available in the market for data governance assessments / maturity modeling?"
Answer: IBM provides data governance consulting and assessment services and a wide range of software tools.
More information can be found here: http://www-306.ibm.com/software/tivoli/governance/servicemanagement/data-governance.html
Welcome to my Blog. I resisted writing a blog at IBM for many years. I have a short attention span, and just couldn't conceive that I would have anything interesting to write about for more than a few sentences a month. You may still not find what I write about to be that interesting, but as I get older my attention span seems to be growing so I am going to give this new medium a chance.
I have to begin my blog about Data Governance with a Short History of Data Governance at IBM. It started in April 2004.
Customers were talking about new requirements and trends in the marketplace that involved Security, Privacy, Compliance, Data Management, Policy, Audit, and Organizational Structures all at once. I saw large banks and credit card issuers nervous about increasing rates of internet fraud and identity theft in their merchant supply chains, facing millions of potential exposures across their merchant networks.
There were brokerages concerned about data transformation for offshore application testing, and insurance companies aggregating all lines of business to create common customer records requiring new forms of data protection and access control. I heard confusion regarding global regulatory demands, cross-border data flows, institutional stovepipes, and growing cybercrime and identity theft. Data has always been treated like a redundant and valueless commodity, but it must have a value if governments are regulating its use and criminals are trying to steal it. It must be both an asset and a liability, though I don't think anyone in 2004 fully understood what that meant.
Through the spring, I visited many customers, and asked many questions. In June, I found myself in front of a CISO on Wall Street presenting the IBM Enterprise Privacy Architecture. I had about 30 charts prepared, and we had just updated EPA to version 2.0 so I was excited to share our latest ideas. At about the second chart, the CISO stopped me and said, "this is all very nice, but personal information is just one of our data types. We need a data architecture that embeds policy into business processes for all our data types. We want to discover the existing, unwritten policies that are part of our business culture that you'll learn about if you ride the elevators here for a day, as much as we want our employees to follow the new written policies we try to deploy every day. How do we do that?"
I didn't know the answer, but I did know that the stories and vignettes I was hearing pointed at new kinds of challenges that privacy, security, or data architectures alone would not solve. I went back to IBM to ask if we had any solutions for these problems, and heard from various camps that we had parts of these items covered. Peering beneath the assurances, I still saw gaps in understanding and capability, and volunteered to host an event to bring the two sides together - customers with new problems that had no name, and IBMers with old solutions that didn't fully apply.
In July 2004, I announced an IBM Security & Privacy Leadership Forum on Data Governance to be hosted at the Mohonk Mountain House in New Paltz, NY on October 6-8. The date was ideal, just in time for autumn foliage. The agenda involved three days of interactive dialog with customers on hard issues with no easy solutions. I didn't want a marketing and sales event. I wanted a Socratic dialog, an exploration of challenges, to define the outlines of a new marketplace. I wanted to bring IBMers and customers together in the same room to explore challenges, build a new partnership with our customers, and lead IBM in an outside-in design movement to align our capabilities to real customer requirements. Mohonk is a special location.
Built in 1878, the old Catskills resort sits at the lip of a lake carved out of a crater at the top of a mountain amidst 5,000 acres of wilderness. There are no TVs in the rooms, cell phone reception is bad, and at that time there was no internet connectivity. The air is sweet, the views breathtaking, and the atmosphere quickly removes you from the reality of your work environment. It was the perfect location, and 120 people attended the 3-day event, including 60 representatives from companies across the world, 20 business partners, and 40 IBMers.
The original agenda file is too large to post, so I am posting some of the presentations instead. We divided the agenda into three workshops on Policy, Content, and Infrastructure. On the first day, we invited Customers to describe their challenges in these categories. On the second and third days, we broke up into workshops and explored the issues. The Infrastructure discussions were dominated by business partners and technical issues. The challenges and solutions were the most concrete and least revelatory. The Policy discussions started well but quickly devolved into a debate about whether policies were made of rules or rules were made of policies. There was no resolution, and yet there was also no letting up in the debate. It raged for two days, and continued in email thereafter. The Content breakout took place in an attic conference room with bad acoustics. But the dialog was fantastic. We explored new data architectures, challenges with legacy document formats, storage, archiving, discovery, and reporting. The discussions were interactive and intellectual, and all the participants came home with new ideas and insights.
In the end, I think we had the wrong categories on the agenda. I had asked all the discussion moderators to bring issues and questions on their charts; many of them did, but many didn't. You can't really control every aspect of a presentation, or an event. But it didn't matter, because the most valuable part of the entire event was the social interaction of customers from many industries and geographies recognizing that their problems were not unique, that they were connected in a common fabric that demonstrated new market requirements that every IBMer in the room heard loud and clear. And everyone accepted that the new name for these challenges was Data Governance.
Following the event, I received many letters of thanks from customers, partners, and IBMers. We clearly connected with many, and some customers asked if we could continue the dialog. They recognized that Data Governance was a common challenge that would require much more discussion and understanding. With that invitation, I formed the IBM Data Governance Council in November of 2004 as a committee of industry peers dedicated to exploring common challenges and solutions in Data Governance. Key Bank, Merrill Lynch, Danske Bank, Bell Canada, and Deutsche Bank were the first members.
By the time we had our first Council meeting at the Ritz-Carlton Amelia Island in February 2005, we had 20 members. Since then, the Council has grown to about 55 members, and we've explored many issues and achieved many milestones. We meet four times a year, normally three times in North America and once in Europe. Some participants have attended every meeting. The meetings last 2 days, and over the course of the last three years we have built enormous social capital and trust in the Council that has enabled us to collaborate across organizational stovepipes, among and between competitors, in ways that set an example of good governance.
In my next blogs, I will describe the road we took, the meeting contents, deliverables, and lessons learned.