It is that simple.
You have supply chains that deliver toys from manufacturers in China to sit under Christmas trees in Canada, oil and gas from Russia to factories and homes in Germany, and diamonds from mines in Namibia to jewelers in New York.
Real-world supply chains keep the global Industrial Economy running.
Alongside, you have Information Supply Chains that deliver crop yields to traders on the Chicago Mercantile Exchange, raw video footage from journalists in Afghanistan to news desks in London, Paris, and Atlanta, and sales performance reports from branch offices in Omaha to main offices in Arkansas.
Around the world, Information Supply Chains drive the Knowledge Economy.
They need to be Smart - Instrumented, Monitored, Measured, and Coordinated. And we need to be aware of how they are designed, what flows through them, and how we can improve them.
Without awareness, Governance itself can never be very Smart. It is that simple.
Amazon has some Information Governance problems.
A week ago, I placed a large order of Nerf Guns that Amazon keeps refusing to process. My kids love these things and I guess some adults I know kind of like them too. We're all heading out to my sister's house in Point Reyes for Christmas this year and a combined Family Reunion. Both my sisters will be there with 7 kids in a medium-sized house for four days and the best we could all come up with to keep them occupied was felt-warfare among the tall grasses of the Inverness wetlands.
If only Amazon would cooperate.
I have no desire to carry ten Nerf weapons on trans-continental jets. I can just see explaining to turgid DHS officials why a family of four needs automatic Nerf cannons with heat-seeking Velcro missiles. So I prefer to order them online and let FedEx make the arms shipments discreetly.
But my order is stuck in Amazon credit card limbo. It seems that the last time I bought something and shipped it to my sister instead of my home address I used a credit card which expired in May. Problem is, Amazon somehow associates that credit card with my sister's mailing address. I've deleted it in my online account, and I buy things from them all the time with the current card, but Amazon hasn't purged this relationship.
From an Information Governance perspective, what kind of problem is this? It is of course a Data Quality issue, but normal DQ tools might have a hard time with rule matching in this case. My gut is that Amazon just doesn't sweep and purge their accounts for outdated credit cards. It's pretty frustrating as a consumer, especially during these busy days. Some records management would solve that problem, but by now the point is moot for me. I just don't have the time or patience to bother fixing their sloppy Information Governance issues.
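The sweep-and-purge rule that seems to be missing is easy to sketch. Here is a minimal, hypothetical example (the record layout and field names are my own inventions, not Amazon's): a periodic batch pass drops expired cards, and the stale address associations hanging off them go with them.

```python
from datetime import date

# Hypothetical account records: (card_id, expiry_year, expiry_month, linked_address).
# The layout is invented for illustration only.
cards = [
    ("card-001", 2008, 5, "sister-address"),   # expired in May
    ("card-002", 2010, 12, "home-address"),    # current card
]

def sweep_expired(cards, today):
    """Keep only cards that are still valid; expired cards (and the
    stale address associations hanging off them) are purged."""
    valid = []
    for card_id, year, month, address in cards:
        # A card is usable through the last day of its expiry month.
        if (year, month) >= (today.year, today.month):
            valid.append((card_id, year, month, address))
    return valid

active = sweep_expired(cards, date(2008, 12, 1))
# Only the current card survives the sweep; the expired card and its
# leftover link to the old shipping address are gone.
```

Run nightly, a rule this simple would have purged the expired card before it could block a new order.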
Fortunately, Walmart sells Nerf Guns too...
On February 27-29, I hosted the 15th meeting of the Data Governance Council at the Wales Hotel in New York City. 31 people registered to attend this meeting, including 16 IBMers and representatives from JPMC, Bank of Tokyo-Mitsubishi, Bank of Montreal, Key Bank, State Street, MasterCard, and American Express, as well as OpenPages, Axentis, Varonis, and Vericept.
On the first day, we had excellent keynote presentations from Garrick Utley, President of the Levin Institute, and Will Pelgrin, Director of the NYS Cybercrime Taskforce. We also had some good roundtable discussions on common challenges in Data Governance related to Sub-prime, Basel II, and other issues. On the second day, we continued discussing common challenges and reviewed IBM Data Governance Solutions with regards to Policy and Process Management, Data Modeling and Development, MDM, Metadata and Data Quality Management. On the last day, we left the agenda and had a long discussion on the future of the Council. Cal Braunstein rounded out the event with an excellent closing keynote on the risks to and from Data, and the risks to organizations from data we can't trust.
We spent a lot of time talking about Globalization and its effects on competition, regulation, cybercrime, and risk. Globalization is having a corrosive effect on trust in many organizations. Pressure from regulations requiring oversight and reporting of employee use of IT increases distrust at all levels. Cybercrime and the increasing financial value of data challenge everyone with offers and scams that make it hard to trust information. These factors are creating internal crises in trust and confidence. The manipulation and monitoring of information technology by people over other people threatens the quality and value of decision-making at a time when global competition brutally punishes bad decisions.
The Globalization of threats, risk, regulation, and competition will immediately force organizational decision-making inward, towards hierarchical models of decision-making, even as the globalization of markets, labor and resource allocation forces more horizontal changes in culture, lifestyle, and freedom.
This Council has existed for three years, and many members, by virtue of their participation, have achieved more mature levels of Data Governance. They have cross-organizational governance models, better transparency, and better decision-making. Many newer members are just now exploring organizational models, business vs. IT participation, the nature of Stewardship, and the complexities of overcoming organizational stovepipes.
Enclosed are my notes and observations from this landmark meeting:
1. Data Governance Market Maturity: Data Governance as a market is maturing from the Innovator phase, where a few leading companies worked together to blaze a trail for others to follow, to the Early Adopter phase. We are clearly seeing some leading companies succeed with Data Governance, thanks in part to the Data Governance Maturity Model, and many, many more now coming into this market looking to build on the success and experience of the innovators.
For those of us who were the pioneers, this is a time of change, and we must adapt to a new market constituency requiring education and solutions, with somewhat less tolerance for discovery and invention. The Data Governance Starter's Guide should be updated as an educational onboarding tutorial for new companies seeking Data Governance success. For vendors, this is a time to study solution packaging and focus on the support needs of the stewardship community. Stewardship is a profession still in its infancy, and it requires practitioner tools, education, and community forums to exchange practices and success stories.
We should all be proud that our contributions have moved the market to this new phase, and the Council needs to change to grow with the Market.
2. IBM Data Governance Solutions: IBM has come a long way in its Data Governance Solution capabilities since 2006, which was the last time we had a major showcase of technologies on the Council Agenda. Most of our solutions - Compliance Warehouse, Integrated Data Management, MDM and Industry Models, Data Quality and Metadata tools - were very well received. But this Council has succeeded exactly because it is not a normal IBM Customer Advisory Board, where normal meetings are dominated by IBM solution exhibitions. Rather, it has succeeded as a unique forum for practitioner exchanges, and it must remain this way to continue.
Future meetings will be shorter, practitioner driven, and IBM will find additional venues to present Data Governance solutions.
3. Globalization: At Mohonk in 2004, at the inaugural Data Governance Summit, I presented some ideas about how information technology would transform the modern corporation, and how integral Data Governance would be to that process. I was heavily influenced by Tom Malone and his book The Future of Work, and also by the history of industrial regulation at the dawn of the 20th Century.
In NY, we re-examined some of these topics through presentations from Garrick Utley, Will Pelgrin, and Cal Braunstein, and I think we need to continue examining how the global pressures on information technology, regulation, cybercrime, risk, and transparency will impact Data Governance and organizational behavior. Many companies that have embraced Data Governance have stopped short of embracing x-organizational governance bodies with real authority. Most don't know which models to follow, which examples of success to emulate, or how it should work.
In my travels I've seen many governance models in corporate and national entities that offer some hope to modern organizations, and I think we ought to be the Council that inventories these models, compares their pros and cons, and presents alternatives to hierarchical organization.
4. Data Risk Standards: In the Shang Dynasty in China, rulers practiced risk-based decision making by consulting an Oracle, who dropped an ox hip bone on the floor and deciphered the direction of the crack in the bone as indicative of divine truth. If the crack pointed up, you had good favor for your decision; if it pointed down, well, you had better ask again. People consulted the Oracle on every kind of decision - dental surgery, marital options, taxation, or war - and they would drop 6-9 ox bones and average the results, thinking that more data would provide more accurate results. Every question to the Oracle was journalized, and outcomes were constantly compared to the ox-bone forecasts. Records of these inquiries survive today, providing the oldest known risk forecasting models. Three thousand years ago, this was the first form of risk-based decision making, and while it may seem primitive to us, it was at least systematic, which is more than we can say about ERM practices today.
Enterprise Risk Management today is still a voodoo art practiced by a secret society of Risk Managers in a language few understand. It is expensive, bespoke, non-standard, and under-utilized. Market, Credit, and Operational Risk consequences are not understood by the vast majority of employees who make enterprise decisions, because none of them have access to even ox bones today, let alone risk-based forecasting models that allow decision makers to compare options, forecast outcomes, and compare results to the forecasts.
To get to that state, where ERM is a common discipline that every employee can use for enlightened decision-making, new Data Risk standards are needed, to make ERM simpler, cheaper, and more systemically repeatable, and that is another contribution this Council can make. We will next meet on June 26th at the Federal Reserve in Washington, DC to explore that opportunity in depth.
What was evident at this meeting is that Data Governance challenges have changed in three years. We are still at the cusp of changes in the way modern, post-industrial, organizations are governed. Even the most mature members of the Data Governance Council have not substantially changed the way their organizations perform decision-making. It is still top-down, barely delegated, with little or no trust extending from the top to the bottom of an organization. Many governance bodies or teams have little or no direct decision-making authority - neither funding mandates nor project veto powers. The light of information still shines brightest from the bottom-up, with those at the top getting the best view of the light and those at the bottom simply blinded by it.
We need new models of organizational governance, new data standards in ERM, and renewed investment in risk-based decision making at all enterprise levels. This remains the challenge of Data Governance in the early adopter market evolution.
On February 26-27, I hosted an XBRL Risk Taxonomy Forum in NY at The Levin Institute in which we explored the concepts of operational, market, and credit risk. Through interactive discussions, we looked at how those concepts could be articulated in an XBRL Taxonomy and what benefits regulatory authorities and market participants could derive from new key risk indicator monitoring. We looked at the ORX example of Operational Risk loss event reporting and saw how 50+ existing banks are sharing operational loss data to better trend individual losses and learn x-industry loss patterns.
And on the last day, we explored positional reporting as a key risk indicator of market crowding and bubble formation. One outcome of the meeting was a call for a followup meeting to review the ORX example in greater depth and explore both existing risk reports and sources of positional data.
On April 23, we will meet again at the Levin Institute to focus more deeply on the ORX data model, an examination of existing regulatory reporting, and positional reporting options from Swift and DTCC.
The work will be done in English – no XML – to make it easy for everyone to participate. Our goal is to answer some fundamental questions:
1. Is the ORX data model sufficient for Operational Risk reporting on a national level?
2. What is the right business model for Operational Risk reporting and who should maintain the taxonomy?
3. What kinds of key risk indicator data are already collected by financial regulators that are either not used on a systemic basis or not shared across the government?
4. What is the most efficient method for collecting end of day/week positional data?
- from market participants directly?
- via clearing and settlement firms?
5. What should be the role of a semantic repository in the construction of risk reporting taxonomies?
6. How should the regulatory authorities build and maintain regulatory taxonomies?
7. How should the world maintain semantic consistency between many regulatory taxonomies?
8. What should a 21st Century Regulatory Information Architecture look like?
We can't possibly answer all of these questions in one day, but we can begin an informed dialog and encourage global participation. No one else is addressing these issues, and I think we can make a difference by doing so.
I look forward to seeing you on April 23rd.
https://www.ibm.com/developerworks/blogs/resources/adler/IBM%20Data%20Governance%20Risk%20Taxonomy%20Meeting.pdf
Data=Information=Knowledge. Or so we would like to say. In theory, data is unorganized information, and knowledge is information put to use by human beings. But theories are for academics. And this theory is super convenient if semantic consistency is important. There are Data Architects who only think about data in databases, Information and Content Architects who only work with unstructured repositories, and even Knowledge Architects who I suppose work with information taken out of human brains and put into... structured or unstructured repositories on computers...
In real life, in real companies, these are artificial distinctions. Organizations want to control data/information supply chains because they are full of quality control problems, security vulnerabilities, compliance challenges, and operational exposures. Those risks imperil decision-making, increase operational costs, and reduce revenue opportunities. Quality control and risk mitigation are challenges for every data type.
Five years ago, "Data Governance" seemed like a great name for a new discipline to help transform organizational behavior from vertical to horizontal; because information is transformational. What we meant then and mean now is not just about "Data" in the purest structured sense. We mean Data in the most plural and unlimited sense. People want to govern other people's use of all kinds of information in every form.
No data stovepipes please! We need Data Governance Solutions for all human uses of information regardless of their form or structure, use or abuse.
Anyone who tells you different is just so 20th Century...
Two years ago, I met Helmut Willke, the author of Smart Governance: Governing the Global Knowledge Society, at a hotel cafe near the great cathedral of Cologne. Professor Willke is a sociologist who teaches Global Governance at the Zeppelin University in Friedrichshafen, Germany. Late in 2009 I became interested in Governance as a system of decision-making, and Professor Willke had written an excellent book exploring this topic. While the Professor is German, he writes extremely well in English, and his book is very well written and insightful. Like a lot of philosophical texts, it is not an easy read. Dense descriptions, long sentences, and theory backed by ample example make it a book you have to read at least twice to fully comprehend.
I was in Cologne in late February 2010 to meet the CIO of the City and attend Rosenmontag at City Hall. I had already seen several days of Karneval, with the endless parades, costumes, and candy strewn about the streets. For five or six days in February, the staid and reserved city of Cologne becomes an endless drunken party, attracting visitors from all over the world who wear outrageous costumes, march in parades on incredible floats, and throw candy to the bystanders. It’s unlike any parade I have ever seen. Quite amazing.
It had snowed a lot that year. It was white from Brussels to Berlin, and Cologne was still covered by eight inches. The square in front of the Dom was clear, and I had spent the morning before our meeting visiting the Roman museum across the square. Cologne is an ancient Roman city, and the ruins are collected in a fantastic museum right next to the Dom. Of course there are columns and pediments, but also beautiful mosaic floors, jewellery, stained glass, and decorative arts. There is a model of the Roman city, and you can see how the Germans built the city on the same street grid, with walls built on top of the Roman walls. Of course, much of this was destroyed by Allied bombs in WWII, but some remnants remain.
Looking back at Roman colonial rule of Cologne was an excellent introduction to the systemic ideas of Governance that Professor Willke and I discussed over coffee that afternoon. He is not a tall man, mostly grey, late-50’s I would say, with bright blue eyes. He makes an immediate impression, and is passionate about his book. I had used the book as the text for a class I taught on Data Governance at the Bucerius Law School in Hamburg that January. My students did not entirely appreciate the dense prose and abstract ideas, but through class conversation we did ultimately appreciate the idea that Governance is a system of decision-making that could be described and modelled. And we used Social Networking metaphors to explore the idea of policy-making, human behaviours in a system of Governance, and how to model potential outcomes. Of course there is political science, which describes political models of Governance – Democracy, Dictatorship, Monarchy, etc. – but what is unique and important about Professor Willke’s book is the application of systems theory to Governance.
We had some coffee and talked mostly about how the Professor wrote the book and why. As I had in 2007-8, the Professor had used the Global Credit Crisis as a use case to describe failures in Governance. I had covered this topic from a Data Governance perspective, arguing that hundreds of incremental failures in business processes and data quality had produced a domino effect that plunged the global economy into Depression. He covered the topic from a decision-making perspective, and while we approached this topic from different directions we arrived at similar conclusions – policy-makers can’t possibly make the best decisions without understanding the consequences of those decisions on incredibly complex and interconnected global systems. And those consequences are impossible to understand without new information systems that render the complexity with software and illustrate how the policies will be accepted and resisted.
In my class at Bucerius, my students complained that the Professor had not done enough to provide solutions to the problems he had identified, or that his solutions were too abstract. I presented these criticisms to him at our meeting, and he responded that it was not possible to offer concrete solutions because every systemic problem needs to be modelled to understand the variables and outcomes – that there is no one-size-fits-all. At the time, I thought this was a dodge. It took me a few more years to understand that he was right. There are no Governance Solutions that can auto-magically produce the best outcomes for every decision. But it is possible for policy-makers to use systems theory and software to construct decision-making models that can plot many of the actors, objects, variables, and potential outcomes to understand the impact of policies on complex systems made up of hundreds, thousands, and even millions of human beings with unique interests.
After my course, I synthesised concepts from the book with ideas from my students to create the Six Steps to Smart Governance.
It’s not meant to be a Framework. Frameworks and models are nice tools to help people feel more secure about challenges they seek to overcome, but they are not useful in making better decisions. The Six Steps are meant to be a structure for decision-making that one would apply iteratively, in which each of the six steps would involve different data points and variables. Of course, it is highly summarised, flavoured with marketing. And I would say in hindsight, it’s not really useful as a practical or operational tool. It’s really just a theory, a simplification of the better documented ideas Professor Willke writes about in his book.
And I think we can do better. In the IBM Data Governance Council we will soon begin to explore dynamic simulation models that go far beyond the Six Steps to Smart Governance, and I recommend reading both the white paper and Professor Willke’s book:
Smart Governance: Governing the Global Knowledge Society
Today, thanks to really powerful simulation software, we can create dynamic models that help demonstrate the impact of policy on people, processes, and technology. The Data Governance Simulation Project will revolutionise the field of Data Governance by applying theory, software, and observed practices to an interactive model that will yield powerful insights into Data Governance Value Creation and Risk Mitigation.
A lot of people ask me, “how do I show the value of metadata?” Some say, “how do I make the business case for Data Governance?” Consultants and Gurus will have a framework or process to offer you, a get-started guide with use-case examples, graphics, and legends about their successes. But these myths won’t help you, because your challenges are unique. Your politics are special, and your people are not machines. Best practices are useful examples of glorified solutions that are very hard to replicate outside the lab. And as many are already finding out, people resist policies they don’t think apply to them, and it’s really tricky to understand how to change organisational behaviours on an on-going basis without policies that dynamically change with new conditions.
Data Governance is, by nature, a systemic challenge, and you can’t solve systemic problems without systemic solutions. Projects and teams that expect quick hits and 90-day results are the reason you have systemic Data Governance problems in the first place. But it is possible to create software models that allow you to plot the goals, metrics, policies, communications, outcomes, variables, and modifiers and evaluate the impact of new policies and controls on your environment.
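To make the idea concrete, here is a toy sketch of such a model, reduced to a deliberately simple form: agents adopt a policy under peer influence, damped by a fixed resistance factor. Every parameter name and value here is an illustrative assumption of mine, not part of any actual IBM simulation project.

```python
import random

def simulate_policy(n_agents=1000, influence=0.05, resistance=0.3,
                    rounds=20, seed=42):
    """Toy adoption model: each round, a non-adopting agent adopts the
    policy with probability equal to a small baseline influence plus the
    current adoption rate damped by a resistance factor."""
    rng = random.Random(seed)  # seeded so the run is repeatable
    adopted = [False] * n_agents
    adopted[0] = True  # seed the system with a single early adopter
    history = []
    for _ in range(rounds):
        rate = sum(adopted) / n_agents
        p = influence + rate * (1 - resistance)  # adoption pressure this round
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p:
                adopted[i] = True
        history.append(sum(adopted) / n_agents)
    return history

curve = simulate_policy()
# Adoption compounds round over round as peer influence grows; raising
# `resistance` flattens the curve, modelling organisational pushback.
```

Even a sketch this crude lets you ask the Smart Governance question: what happens to the outcome when you turn the resistance dial before you deploy the policy, rather than after.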
And that’s the lesson of Smart Governance: you can model complex environments through Simulation and make better decisions. To learn more about using Simulations to make better decisions, take a look at the IBM Smarter Cities Demo. In that demo, the complex interactions of human beings living in a city are compared to the goals of human policies, the metrics measured by interactions, and potential outcomes.
Many of our organisations are as complex as small cities. Policy and Politics share the same ancient Greek root word – polis. A polis is a city, which itself is an aggregation of human beings who require Governance to arbitrate their diverse interests and achieve better outcomes for all. Today, we can simulate those interactions and help Policy makers profile the impact of their policies before they are deployed. It’s a kind of Visual Risk Calculation.
If you would like to participate in the Data Governance Simulation project, please read the Six Steps to Smart Governance White Paper and the book by Professor Willke, and join the IBM Data Governance Council by executing this membership agreement. Only members of the Council will be able to participate in this exercise, and you don’t want to miss this, because it will fundamentally change Data Governance.
Please join us for an international crowdsourcing experience!
In May 2006, the IBM Data Governance Council used poster board and sticky notes in an oak-paneled room in the Chateau Frontenac in Quebec City to create the categories, elements, and levels in the first version of the Maturity Model. About 35 people participated in that process in Quebec, and perhaps another 50 more in subsequent meetings.
On September 14-16 2010, the Council will use social networking crowdsourcing technology to include a global community in a discussion about the Maturity Model - Live!
Suggestions and comments from practitioners all around the world will be relayed to the participants in the room.
Of course, this venue is awesome, and there is no substitute for live, face to face, communication. But if you can't travel to Tamaya, and spend three fabulous days with The Council in the Desert, you can still tune into the action by going to infogovcommunity.com.
In the room or in Rangoon, you can watch the ideas flow and chime in live or tune in later and add your views.
Either way, what you contribute will impact the community and change the Maturity Model. Synchronous or Asynchronous, this meeting is the beginning of a global dialog on Data Governance Maturity.
What we do in the room will make a difference. And what you contribute from your own room will make a difference.
Please join us in Tamaya or online at www.infogovcommunity.com to capture the best ideas from the Global Information Governance Community, contributed for the Community and published in an open-sourced IBM Data Governance Council Maturity Model.
This is how we innovate!
Steven B. Adler
IBM Data Governance Council
Today, we the global Information Governance Community are announcing that we are publishing the Data Governance Council Maturity Model under an open-source license (for non-commercial purposes) on a website called www.infogovcommunity.com.
The purpose of the website and the publication is to invite the world to participate in a crowdsourcing project that will involve thousands of Information Governance practitioners from around the world in updating the Maturity Model and broadening the definition of Information Governance.
The site is powered by Chaordix, a fantastic company to work with. We've been working together in a two-month beta test of crowdsourcing in which the Council reviewed the site and submitted ideas each week which Chaordix took and implemented. What you see today is a product of Community interaction and technology.
Take a tour. On this site, you can interact with peers from around the world in the time and timezone most convenient to you. You can use the Maturity Model to self-assess your organization's capabilities, work on topics to define Best Practices, and establish your credentials as a leader in the growing international market known as Information Governance.
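A self-assessment of that kind comes down to a very small calculation. The sketch below uses hypothetical category names and scores (the real Maturity Model defines its own categories, elements, and levels); it simply averages the levels and flags the weakest categories to work on next.

```python
# Hypothetical self-assessment: the category names and scores below are
# illustrative stand-ins, not the Council's actual Maturity Model content.
# Levels run 1 (Initial) through 5 (Optimizing), CMM-style.
assessment = {
    "Organizational Awareness": 3,
    "Stewardship": 2,
    "Policy": 3,
    "Data Quality Management": 4,
    "Metadata Management": 2,
}

def maturity_summary(scores):
    """Return the average maturity level and the categories sitting at
    the lowest level, i.e. the gaps to prioritize next."""
    avg = sum(scores.values()) / len(scores)
    weakest = min(scores.values())
    gaps = sorted(name for name, level in scores.items() if level == weakest)
    return round(avg, 1), gaps

avg, gaps = maturity_summary(assessment)
# avg -> 2.8; gaps -> ["Metadata Management", "Stewardship"]
```

The value of doing this in a community is the comparison: your averaged levels against the crowd's, category by category.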
Check out the leaderboard, where the best and brightest can see how their ideas are recognized by the community, or the blog where longer ideas are published to inspire insight and discussion. Infogovcommunity.com brings together Information Governance and Social Networking to inspire innovation for the common good.
The site is brought to you by IBM but supported by the Community for the Community with a self-funding subscription model. Starting on September 1st, Community members will pay $299/year for individual membership and $699/year for corporate membership to cover the yearly costs of maintaining the site.
On Tuesday, I gave a keynote presentation at SIMposium 2010 in Atlanta, Georgia. It was on the last day of a conference at 8:15am. On the best of days, I'm not a great morning person. The last day of a conference is not normally the best of days for a presentation. Normally, at least half the participants are in taxis on the way to the airport and the other half are often exhausted from the content and discussions on the earlier days. When I was first asked to speak, I was not inclined to do it. Keynote or not, 8:15 on the last day felt like a bad proposition.
I could not have been more wrong. First, the room, and it was a huge ballroom, was full with about 300 people. Second, they were awake, animated, and fantastic to talk to. We had a great conversation together, and I completely enjoyed the interaction.
Third, they were not the normal Data Governance crowd. In fact, when I asked how many had Data Governance programs at the start of my presentation not one hand went up. This is the kind of group I love talking to and they are the ones we most need to reach.
SIMposium, thank you for an excellent experience. Many have since requested my presentation and here it is in Flash format. Just click on the link below and it will launch in your browser.
SIMposium 2010: Change is Not Just a Word
This morning, General Motors announced that it would no longer advertise its cars on Facebook. This announcement comes a day before the Facebook IPO, and casts a shadow on the business model of Facebook. GM said that they will continue to support their page and user community on Facebook, but that ads just weren't effective in helping consumers to make car buying decisions. Ford jumped on this announcement to say they would continue to buy ads on Facebook and that Social Media requires a consistent commitment to innovation and community development.
Maybe. But I think GM's decision does illustrate a key problem for Facebook and Twitter - the revenue model. Social Media grew up without dependencies on ad-based revenue. On Facebook, you aren't a customer. You are a product, and it's your likes, dislikes, friends, photos, videos, and content that generate value. Selling products to products via advertising is hard. Members don't use Social Media to go shopping. There's no commerce platform there. They use it to be social. There are so many other outlets that are more effective for advertising than Social Media.
So how should Facebook and Twitter make money? My idea: make it collective. The value is in the data.
1. Make terms and conditions explicit that every member owns their own data via copyright. This does two positive things.
A. It indemnifies Facebook and Twitter for the crazy, infringing, and potentially libelous posts of their members by allowing them to claim that they are conduits of content rather than publishers or distributors.
B. Copyright establishes the rights to royalties for content created and posted on their networks, which enables the next step.
2. Allow members to opt-in to Big Data analysis by Social Media partners and intermediaries.
3. Charge Social Media for Big Data Searches by data volume.
4. Pay members royalties every time their data is used in Big Data Searches.
This simple model creates powerful incentives that transform user members from products into mutual social network content providers with an economic interest in posting content that will be used in Big Data searches. It establishes data property rights that insulate Facebook and Twitter from vouching for the content on their networks. Members will also discover that providing high quality data that companies want to search for means more royalties, and so the system will produce better behaviors. And it creates a 2-tier royalty distribution model that will also pay Facebook and Twitter handsome revenue that will change online advertising and make every other content aggregator change too.
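The arithmetic of the model is straightforward. Here is a minimal sketch of the two-tier settlement in steps 3 and 4 above, with the per-gigabyte fee, the platform's share, and the usage counts all invented for illustration.

```python
# Illustrative split of a Big Data search fee under the proposed model.
# The fee, the platform's share, and the per-member usage counts are
# assumptions for this sketch, not figures from Facebook or Twitter.
def settle_search_fees(fee_per_gb, gb_searched, platform_share, usage_by_member):
    """Charge the searcher by data volume, keep the platform's cut, and
    split the remainder as royalties in proportion to how often each
    member's content appeared in the search."""
    revenue = fee_per_gb * gb_searched
    platform_cut = revenue * platform_share
    royalty_pool = revenue - platform_cut
    total_uses = sum(usage_by_member.values())
    royalties = {
        member: royalty_pool * uses / total_uses
        for member, uses in usage_by_member.items()
    }
    return platform_cut, royalties

cut, royalties = settle_search_fees(
    fee_per_gb=10.0, gb_searched=100,   # a $1,000 search charge
    platform_share=0.30,                # 2-tier model: platform keeps 30%
    usage_by_member={"alice": 3, "bob": 1},
)
# cut -> 300.0; alice earns 525.0, bob earns 175.0
```

The incentive falls straight out of the proportional split: the more often a member's content is matched, the larger their slice of every royalty pool.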
Of course, Facebook and Twitter will have to sort out who's a person and who's a bot, and will have to provide content creation tutorials to help users/customers create content that has value by sharing the top 100 Big Data queries and sample results.
But this Business Model has something for everyone and is a true win-win. It benefits customers by establishing data property rights and royalties for content. It benefits organizations that want to do Big Data searches by providing ever richer data streams of high quality and availability. And it benefits Facebook, Twitter, and their investors by providing an enormous profit-making engine selling Data.
The Data is the Value. The more there is, the more valuable it becomes. Pay your customers to create higher quality data and charge your partners to use it. It's a simple Business Model.
Dick Costolo - @dickc - and Mark Zuckerberg - @finkd - are you listening?
Fog. I thought we were in the clouds as the plane wheels hit the ground like a fighter jet landing on a carrier deck. Visibility was maybe three feet, and the fog was so dense the plane parked on the tarmac and we were brought to the terminals in bright yellow buses. Kastrup is so efficient. Clean walnut parquet greets you as you climb up floors to reach the neatly organized passport control, where kind border control guards actually smile when you arrive at the window. In JFK they growl at you and treat you like a criminal begging for mercy to enter a dingy airport that feels more like a mid-50's bowling alley. In Copenhagen, the baggage is at the carousel when you arrive and the airport feels like a luxury shopping arcade. Mercedes taxis whisk you into the city on a sleepy Sunday when most of the city is just having brunch.
My hotel room isn't ready when I arrive, but I'm happy to have some hours to relax in Vesterbro and wander the empty streets as the fog burns off into early autumn sun-drenched splendor. The grass is green, the trees are yellow and red, the sky is bright blue. It takes me two hours to adjust and remember the life I led when I called this city home for 5 years in the 1990s. Bicycles whiz by on their own lanes next to the sidewalks. Late 19th Century apartment buildings hide hip modern interiors. Small, heavily taxed cars conceal a standard of living that is the envy of most other nations. What a remarkable governance experiment. High personal income taxes (top rate is 52%), VAT (25%), car tax (220%), and all manner of other taxes are balanced by very low corporate tax rates (26%) and a free labor market, yielding universal healthcare, excellent pensions, and free education through PhD. This country is a net oil exporter, thanks to lucrative North Sea oil platforms, yet produces 60% of its energy needs from wind, solar, and geothermal.
While America watched its bridges and roads deteriorate, Denmark built huge public works projects extending road and rail bridges to Sweden, Germany, and from Jutland to Zealand. They unified their rail system in Copenhagen, and deployed high speed rail to Hamburg and Stockholm. They made Kastrup into the logistical hub of Scandinavia, linking the Nordic countries to the EU mainland. It is a remarkable little country, and this week the weather is also wonderful.
I'm here to speak at a conference - IBM Software Group Day. I'm in a Global Services track and have 35 minutes to go through some dense Data Governance content. The conference site is a mile from my hotel and I love the walk through Vesterbro, along many sleazy streets west of the Main Train Station that today feel quite a bit better than they did a decade ago when I lived nearby. The conference venue is an old slaughterhouse, now filled with 1200 IBM customers, and some fantastic art works on the walls. The conference organization is fantastic, and everything seems to run as efficiently as the rest of Denmark. My session is just after lunch, and my slides suffer some strange PowerPoint virus which mixes them up just short of delivery. But the audience is wonderful and we have a great time going through the discussion. Somehow I finish on time, which is rare, and get some great questions afterward.
Enclosed is what I presented. It's similar to the SIMposium 2010 deck with two new use cases. They worked well in Copenhagen and I have plans for something even better at IOD: Copenhagen SWG Day Presentation
The rest of the week is full of customer meetings, but every day I'm here I'm reminded of the life I once lived in Denmark and the part of me that lies dormant when I'm not here. It's a side effect of international travel that you learn not only great things about the places you visit, but also things about yourself that are only evident when you are there again.
I'm writing this blog entry in my hotel room on the 14th floor of the Grand Hyatt in Jakarta, Indonesia. Traffic screams by the massive fountain circle outside in a constant torrent of horns. I've been here all of two days. Met a customer in town this morning, and yesterday we drove three hours to meet a customer in Western Java. I've seen rice paddies, jungle, mountains, tea plantations, small villages and ways of life unchanged for centuries, glittering shopping malls with every brand available, fantastic office towers, and levels of luxury unembarrassed by poverty in every street. It is at once fascinatingly familiar and different at every corner.
This year, I've visited customers in Jakarta, Manila, Tampa, Columbus, Johannesburg, Dallas, Hamburg, Warsaw, San Francisco, New York, Brussels, and Cologne. And everywhere I go I hear the same stories, the same issues, the same needs.
Data Governance is a global market. Everyone is doing it.
Tomorrow I fly to Bangkok, where Red Shirts have held a government hostage for six weeks. On the edge of a knife, a nation split Red and Yellow, and I'm hosting a Data Governance Workshop for 2 dozen customers.
The market need is hotter than Red.
If your company doesn't have a program working today, it's a competitive disadvantage.
Don't wait. Just do it.
Frameworks freeze you in the past by forcing you to interpret the present through rigid formulas, interpretations, and even misconstructions. In 2007, the IBM Data Governance Council finished its Data Governance Maturity Model. Looking at all its imitations in the market, one could conclude that it has been remarkably successful.
However, as a benchmark of relative organizational maturity - and not just data management processes - I think its time has passed and I'm working on new ideas.
Last week, I became a victim of toxic content. It can happen so fast, without warning. My sister, a trusted source, forwarded two photos that purported to show the Air France flight breaking in half before it fell from the sky into the Atlantic off the coast of Brazil. There was a caption that said the photos had been taken by a passenger, and while the camera had been destroyed in the crash the memory stick was recovered. Even the photographer's name had been discovered by tracing the serial number of the camera. One photo showed passengers with air masks on, a gaping hole in the mid section of the plane and the tail section falling away. The second photo showed a man being sucked out into the open hole.
They were immediately shocking photos, all the more so to me because two of my students from my Data Governance course at the Bucerius Law School died on that flight. Alexander Crolow and Julia Schmidt were two bright young students from Germany and Brazil who had traveled to Brazil to tell Julia's parents of their plan to marry and were returning to Germany that night to tell Alex's parents. An event like the Air France crash is transformative when you know someone who was on it.
But alas, the photos were fake. They were taken from the TV show Lost and sent around the world in an email. Bolivian TV even showed them on the air before they discovered the fakery. But by then the damage had been done. For so many people around the world wondering how their loved ones perished in that plane, the photos offered a chilling illustration. We should have recognized the forgery at the outset, since the plane crashed at night and the photos showed bright daylight through the hole. But critical thinking disappears quickly when you are emotionally involved. And of course, on the internet any trusted source can inadvertently be a conduit for toxic content. Thus knowing the source of your content is not enough to establish trusted information. You need to verify by corroborating the content with another source to establish veracity.
In the 21st Century everyone has to be a journalist.
I've written in the past about the loan origination underwriting failures that are at the heart of the current credit crisis. Market failures in Mortgage Backed Securities, Collateralized Debt Obligations, and Credit Default Swaps can all trace their lineage to high default and foreclosure rates resulting from those underwriting failures. In a piece I wrote in early 2008, I argued that simple changes in underwriting standards could have prevented the market meltdown.
I've also written about the relative efficiency of the Danish Mortgage Model, and yesterday I heard an in-depth comparative presentation on that Model that I have to relate because it totally changed my point of view on the Danish Model. Up to now, I had seen the Danish Model as a business platform for mortgage processing. What I saw yesterday is a consumer solution with enormous political appeal.
The meeting was at the American Enterprise Institute in Washington, DC and the speaker was Alan Boyce, CEO of Absalon, the organization that exported the Danish Mortgage Model to Mexico. Alan presented the Danish Model in the context of what the Danes call "The Principle of Balance."
The Principle of Balance enables borrowers to refinance their mortgages when housing prices go up AND sell their mortgage bonds at current market prices when housing prices go down to preserve their equity. In the United States, borrowers can refinance when rates decline and housing prices rise, but they have to suffer negative equity when housing prices decline. Housing prices often decline in a recession, and negative equity restrains labor mobility by nailing home-owners to their existing homes until prices rise and they can sell without a loss.
In Denmark, when recessions hit and housing prices fall, borrowers can sell their straight securitized bonds in a secondary bond market and refinance their mortgage at the current market price for their home. This flexibility protects consumers from negative equity and empowers workers with greater labor mobility.
From Alan's charts, here is how the current system in the US works:
If interest rates decline:
- Home prices go up
- The homeowner can prepay the existing mortgage by refinancing at the new lower rate
- This allows for equity withdrawal

If interest rates go up:
- Home prices go down
- The value of the mortgage (in an MBS) drops for the holder of the mortgage
- Even though the value of the mortgage has dropped, the homeowner still owes "par" - the face value of the mortgage. He cannot prepay the existing mortgage at the price the mortgage is selling for in the market
- ~$5 trillion is currently owed by homeowners of non-agency mortgages. These mortgages are valued by the market at $3.5 trillion
- In some of the hardest hit regions in the country, home owners have lost their jobs and have negative equity in their homes, and they can't do anything about it
Using the Principle of Balance, here is how it would work:
If interest rates decline:
- The system operates the same
- Home prices increase and people can refinance and take equity out

If interest rates increase (I think this chart summarizes it best):
- Home prices go down
- Assuming creditworthiness, a homeowner can prepay by purchasing back his or her mortgage at the current discounted price
- This maintains equity in the home
- The key is a new, standardized mortgage
This model doesn't perfectly preserve home equity as home owners will suffer some loss when housing prices decline, but the loss is substantially mitigated and this system offers individual freedom and choice. It is actually far more market oriented than the current US model.
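A worked example may make the equity arithmetic above concrete. The home value, loan size, and bond price below are invented illustrative numbers, not figures from Alan's charts:

```python
# Illustrative sketch: how buying back the mortgage at its market price
# (the Danish "Principle of Balance") mitigates equity loss when home
# prices fall, compared with owing par in the US model.
# All figures are assumptions made up for this example.

def equity_us(home_value, mortgage_face):
    # US model: the homeowner always owes the face ("par") value.
    return home_value - mortgage_face

def equity_danish(home_value, mortgage_face, bond_price):
    # Danish model: the homeowner can buy back the mortgage bond
    # at its current (discounted) market price and refinance.
    return home_value - mortgage_face * bond_price

home_value = 400_000     # was 500,000 before a 20% price decline
mortgage_face = 400_000  # original loan amount
bond_price = 0.85        # rates rose, so the bond trades below par

print(equity_us(home_value, mortgage_face))                  # no equity left
print(equity_danish(home_value, mortgage_face, bond_price))  # roughly 60,000 preserved
```

The homeowner still lost part of the original 100,000 down payment, which matches the point above: the loss isn't eliminated, only substantially mitigated.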
In the US, we currently suffer 10% default and foreclosure rates, and there are an additional 15-20% who suffer negative equity in their homes but are not at risk of foreclosure. People in foreclosure can't take advantage of a new Principle of Balance Mortgage system, but the government can offer programs to restructure their mortgages at market value. Those with negative equity could be encouraged to migrate to a new Principle of Balance mortgage model.
This is an idea that has enormous benefits all around. It can help the Obama Administration reprice existing toxic assets. It can help provide more market-flexibility to home-owners. And it can repair confidence in the American mortgage market among investors world wide.
Who would have thought that market-oriented reforms would come from such a "socialistic" country like Denmark!?
I encourage everyone to read Alan Boyce's presentations and white papers. It is one of the most intelligent and easy to implement regulatory reforms I have seen in many years.
His full presentation: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_1.pdf
His short white paper: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_3.pdf
Over the years, the IBM Data Governance Council has had many international meetings:
- 2005 - Kronborg Castle, Helsingoer Denmark
- 2006 - Chateau Frontenac, Quebec City, Canada
- - Bucerius Law School, Hamburg, Germany
- - Hotel de Ville, Paris, France
- 2007 - Isola di Giorgio Maggiore, Venice, Italy
- 2008 - Kuala Lumpur, Malaysia
- - Zappeion Palace, Athens, Greece
- 2009 - L'Hermitage, Franschhoek, South Africa
On January 21-22, 2010, the IBM Data Governance Council will be starting a chapter in Poland by meeting in Warsaw.
Around the world, Data Governance is in hot demand.
Data Governance Programs are popping up all over the globe. It isn't hard to get one started anymore. But it is hard to be good at it and to make it last. In fact, I see more programs taking one step forward and two steps back – narrowing focus to demonstrate results and falling in line with other IT projects – rather than charting a clear path towards larger transformation.
But let's be clear – Data Governance is about Business Transformation. We can't change organizational behavior to take data seriously if we can't change how we work.
We in the Data Governance Council have a vision that Data Governance is a coordination of people collaborating on common goals and purposes – to use data as an asset. That vision requires that piecemeal project management of data issues must evolve into systemic governance structures and methods, whose goals and purposes themselves transcend the people, applications, and interactions.
Until last year, we didn't fully know how to close the gap between where we are today and where we'd all like to go. But today we see the way forward, and the Data Governance Council is embarking on a bold new program to develop Predictive Governance: systemic ways of describing our world and modeling potential interactions to understand what works and how to improve it.
Traditional scientific analysis says that to understand a problem you have to take apart the issue and decompose it into all its components and sub-components and find the root cause.
But this assumes there is always just one root cause and one thing to blame:
“Data Quality in our branch operations is atrocious, so we have to fix our incentive structure.”
“Our network was hacked and our customer data was exposed, so fire the CISO.”
It's almost irresistible to search for scapegoats to common problems using simple cause and effect analysis.
People rarely imagine that individual data quality problems are symptomatic of larger systemic challenges in the information supply chains we have created over decades to handle information flows from source to target; and no CEO expects that network hacks are the result of systemic weaknesses in IT systems that are themselves a reflection of organizational culture and priorities.
It's hard to accept that people created the systems that enable Poor Data Quality, Global Jurisdictional Jungles, Metadata misunderstanding, Lax Security, Privacy Invasions, and Big Data Mischief. No one deliberately creates these problems. No one wants them to continue. But they do continue nonetheless, because people really don't understand the elements and interdependencies of the systems they have created.
The point of Predictive Governance is that we work in large ecosystems and we must work to understand them. If we can't describe our ecosystems, we can't rise above the superstitions and organizational behaviors that constantly hold us back.
This event will explore the ideas and methods behind Predictive Governance, new Enterprise Data Governance Solutions that integrate multiple business and IT domains, and Internet Jurisdiction and Multi-Stakeholder Governance in the context of global regulatory confusion as an archetype of Predictive Governance Challenges.
These are big problems and we are working on big solutions.
See the agenda. Read our blogs. Understand our mission. Be prepared to interact.
This is a thought leadership forum for change. Join us and make a difference.
This event is open to all who wish to join the IBM Data Governance Council. Register to attend here: http://dgcouncil.eventbrite.com/
This morning, EU Regulators announced that they propose to create a Risk Board to monitor financial market performance and systemic risk indicators among the 27 member nations in the European Union. I've advocated a Council approach to risk-based decision-making since the beginning of this year, and I think the EU proposal is a good idea in concept. Unfortunately, in Europe it seems decision-making takes a large number of people, because the European proposal would have 63 people participating on the Risk Board. A deliberative body with 63 people is not a "Board" - it is a legislature. To complicate matters, "only" 32 members of this board would have voting rights. Unfortunately, the only power they can vote on is a warning to member states that some part of their market performance contains systemic risk. How they plan to determine that threat and get everyone to agree on what it means in any reasonable amount of time is not clear. My guess is that this is a proposal to set up an intra-governmental think-tank that will study issues, write economic reports that no one reads, and only threaten to issue warnings, because a vote on a warning will never happen.
Note to Obama Administration: If you want to create a Systemic Risk Regulatory Structure that is guaranteed to fail due to political indecision and lack of authority, copy the EU model.
Manila is an ancient Spanish colonial city with American influences and a culture all its own on the rim of Asia. It takes several visits to appreciate that despite appearances and a host of American shops, businesses, and call centers, Manila is not a larger Honolulu, and the Philippine people are not just nicer Hawaiians. The culture, like the heat, is soft and pervasive and gently unique. The foreign influences, like the rain during the early June rainy season, hide behind clouds.
Two weeks ago I made my third trip to Manila, and hosted a Data Governance Council Maturity Model workshop in a modern hotel conference room for 25 customers spread across 10 round tables. In my 8-hour presentation, I integrated the Maturity Model into the Six Steps to Smart Governance using both OpenOffice and the IBM Application Roadmap Tool (ART). Customers used laptops with the ART tool running to score their respective levels of maturity, and I explained how the Maturity Model provides benchmarks to assess current and desired states of maturity from which the Six Steps can be used to govern the use of data in a more scientific and repeatable way.
I've given these two presentations often, mostly in shorter conference presentations, but at least 12 times a year if not more. I constantly update my presentation with current examples and anecdotes to keep the material fresh but also to keep myself fresh and avoid the self-boredom of redundancy. But to each new audience, the material is fresh and I'm always amazed at how the Maturity Model transforms conversations from abstract theory to relevant practice.
I present five to seven charts, then go to the ART tool and we run through three to six sub-categories of the model: Organizational Structures/Summary, Data Quality/Processes, Stewardship/Accountability, Risk Management/Accountability. During these phases I read the content for each level of maturity and simulate current and desired states by moving the slider bars over. Most of the audience hears my words and ignores my gestures. They are engulfed in a personal assessment of their own Data Governance maturity. Huddled over the laptops, they discuss their perceptions of the model levels, argue about what the terms mean, and relate the observed behaviors of 50 companies in North America and Europe to their own habits.
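For readers who haven't seen the ART tool, here is a toy sketch of the kind of current-versus-desired scoring those slider bars capture. The category names come from the workshop; the 1-5 scale and the gap ranking are simplified stand-ins for the actual Maturity Model scoring, not its real mechanics:

```python
# Toy current-vs-desired maturity scoring, in the spirit of the ART
# tool's slider bars. Categories follow the workshop; the 1-5 levels
# and gap ranking are illustrative assumptions, not the real model.

categories = {
    "Organizational Structures": (1, 3),      # (current, desired)
    "Data Quality / Processes": (2, 4),
    "Stewardship / Accountability": (1, 4),
    "Risk Management / Accountability": (2, 3),
}

def maturity_gaps(scores):
    """Return (category, desired - current) pairs, largest gaps first -
    a simple way to prioritize where governance work should start."""
    gaps = {name: desired - current for name, (current, desired) in scores.items()}
    return sorted(gaps.items(), key=lambda kv: -kv[1])

for name, gap in maturity_gaps(categories):
    print(f"{name}: gap {gap}")
```

Ranking the gaps is one crude way to turn the kind of table discussion described above into an agreed starting point for change.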
It is fascinating to watch! They don't want to move forward to new categories, as each level brings forward painful memories of immature practices, problems long festering that need change, and the realization that they too are immature and can change with the help of an external assessment.
Four years after its creation by a group of 50 visionary Data Governance Council members, the Maturity Model still inspires and provides fresh evidence of its value and relevance. It excites audiences all across the world, and as a benchmarking tool there is no comparison. Every time I do this I wonder to myself how this material can excite as it does. But it is the common awareness of ad-hoc, episodic, IT adventures, crises, and budget constrained fixes over decades that motivates people to realize that their situations are not unique and that only systemic solutions will work.
After all these years, Data Governance is a real global market and the real work to make it a success just now begins.
Thank you Manila.
ComplianceWeek covered the XBRL Risk Taxonomy Forum Meeting in NY last week with an excellent article enclosed here.
It is a longer article, but this is from the front page:

Using XBRL to Attack Systemic Risk
By Todd Neff — April 7, 2009
Already hard at work making Securities and Exchange Commission filings interactive, XBRL technology now finds itself at the heart of plans to save the U.S. financial system from future calamity.
A group of risk-management leaders in the financial industry has begun studying how XBRL might bring clarity and transparency to the murky world of financial risks, much the same way Corporate America has just begun using XBRL to bring more clarity to financial statements.
While any such system is a long way off, proponents say the technology is tailor-made to help regulators (and investors) root out hidden threats to corporate balance sheets before they, well, break the bank. XBRL could, for example, let a regulator peer through a bad debt line item and see the individual loans feeding it; that task would take hours of spreadsheet diving today.
But XBRL could also do much more. Steven Adler, director of IBM Data Governance Solutions, says the computer language provides a standard vehicle for regulators to track not only weeks-old summary data, but also financial positions accruing across many banks and market segments. That would shed more light on systemic risks—which, left unchecked, can bring financial calamity of the sort we’re witnessing today.
Any potent XBRL-based scheme to report risks, however, would require the reporting of daily financial positions, a major shift in how trading firms, hedge funds, and investment banks do business. To that end, Adler’s IBM Data Governance Council is spearheading a movement that would change how investment banks and hedge funds interact with regulators.
“At this point, everybody is aware change is coming,” Adler says. “And parties would rather be in the room together talking about common solutions.”
A speech Federal Reserve Chairman Ben Bernanke delivered last month shows him to be in agreement. Bernanke advocated taking a “macro-prudential” approach to risks that are “cross-cutting,” affecting many firms and markets or concentrating in unhealthy ways. It would involve “monitoring large or rapidly increasing exposures—such as to sub-prime mortgages—across firms and markets.”
You can read the full article here.