On February 27-29, I hosted the 15th meeting of the Data Governance Council at the Wales Hotel in New York City. 31 people registered to attend this meeting, including 16 IBMers and representatives from JPMC, Bank of Tokyo-Mitsubishi, Bank of Montreal, KeyBank, State Street, MasterCard, American Express, OpenPages, Axentis, Varonis, and Vericept.
On the first day, we had excellent keynote presentations from Garrick Utley, President of the Levin Institute, and Will Pelgrin, Director of the NYS Cybercrime Taskforce. We also had some good roundtable discussions on common challenges in Data Governance related to Sub-prime, Basel II, and other issues. On the second day, we continued discussing common challenges and reviewed IBM Data Governance Solutions with regards to Policy and Process Management, Data Modeling and Development, MDM, Metadata and Data Quality Management. On the last day, we left the agenda and had a long discussion on the future of the Council. Cal Braunstein rounded out the event with an excellent closing keynote on the risks to and from Data, and the risks to organizations from data we can't trust.
We spent a lot of time talking about Globalization and its effects on competition, regulation, cybercrime, and risk. Globalization is having a corrosive effect on trust in many organizations. Pressure from regulations requiring oversight and reporting of employee use of IT increases distrust at all levels. Cybercrime and the increasing financial value of data challenge everyone with offers and scams that make it hard to trust information. These factors are creating internal crises in trust and confidence. The manipulation and monitoring of information technology by people over other people threatens the quality and value of decision-making at a time when global competition brutally punishes bad decisions.
The Globalization of threats, risk, regulation, and competition will immediately force organizational decision-making inward, towards hierarchical models of decision-making, even as the globalization of markets, labor and resource allocation forces more horizontal changes in culture, lifestyle, and freedom.
This Council has existed for three years, and many members, by virtue of their participation, have achieved more mature levels of Data Governance. They have cross-organizational governance models, better transparency, and better decision-making. Many newer members are just now exploring organizational models, business vs. IT participation, the nature of Stewardship, and the complexities of overcoming organizational stovepipes.
Enclosed are my notes and observations from this landmark meeting:
1. Data Governance Market Maturity: Data Governance as a market is maturing from the Innovator phase, where a few leading companies worked together to blaze a trail for others to follow, to the early adopter phase. We are clearly seeing some leading companies succeed with Data Governance, thanks in part to the Data Governance Maturity Model, and many more now coming into this market looking to build on the success and experience of the innovators.
For us pioneers, this is a time of change, and we must adapt to a new market constituency requiring education and solutions with somewhat less tolerance for discovery and invention. The Data Governance Starter's Guide should be updated as an educational onboarding tutorial for new companies seeking Data Governance success. For vendors, this is a time to study solution packaging and focus on the support needs of the stewardship community. Stewardship is a profession still in its infancy, and it requires practitioner tools, education, and community forums to exchange practices and success stories.
We should all be proud that our contributions have moved the market to this new phase, and the Council needs to change to grow with the Market.
2. IBM Data Governance Solutions: IBM has come a long way in its Data Governance Solution capabilities since 2006, which was the last time we had a major showcase of technologies on the Council Agenda. Most of our solutions - Compliance Warehouse, Integrated Data Management, MDM and Industry Models, Data Quality and Metadata tools - were very well received. But this Council has succeeded exactly because it is not a normal IBM Customer Advisory Board, where normal meetings are dominated by IBM solution exhibitions. Rather, it has succeeded as a unique forum for practitioner exchanges, and it must remain this way to continue.
Future meetings will be shorter, practitioner driven, and IBM will find additional venues to present Data Governance solutions.
3. Globalization: At Mohonk in 2004, at the inaugural Data Governance Summit, I presented some ideas about how information technology would transform the modern corporation, and how integral Data Governance would be to that process. I was heavily influenced by Tom Malone and his book The Future of Work, and also by the history of industrial regulation at the dawn of the 20th Century.
In NY, we re-examined some of these topics through presentations from Garrick Utley, Will Pelgrin, and Cal Braunstein, and I think we need to continue examining how the global pressures on information technology, regulation, cybercrime, risk, and transparency will impact Data Governance and organizational behavior. Many companies that have embraced Data Governance have stopped short of embracing cross-organizational governance bodies with real authority. Most don't know which models to follow, which examples of success to emulate, or how such governance should work.
In my travels I've seen many governance models in corporate and national entities that offer some hope to modern organizations, and I think we ought to be the Council that inventories these models, compares their pros and cons, and presents alternatives to hierarchical organization.
4. Data Risk Standards: In Shang Dynasty China, rulers practiced risk-based decision making by consulting an Oracle, who dropped an ox hip bone on the floor and deciphered the direction of the crack in the bone as indicative of divine truth. If the crack pointed up, you had good favor for your decision; if down, well, you had better ask again. People consulted the Oracle on every kind of decision - dental surgery, marital options, taxation, or war - and they would drop 6-9 ox bones and average the results, thinking that more data would provide more accurate results. Every question to the Oracle was journalized, and outcomes were constantly compared to the ox-bone forecasts. Records of these inquiries survive today, providing the oldest known risk forecasting models. Three thousand years ago, this was the first form of risk-based decision making, and while it may seem primitive to us, it was at least systematic, which is more than we can say about ERM practices today.
Enterprise Risk Management today is still a voodoo art practiced by a secret society of Risk Managers in a language few understand. It is expensive, bespoke, non-standard, and under-utilized. Market, Credit, and Operational Risk consequences are not understood by the vast majority of employees who make enterprise decisions, because none of them have access to even ox bones today, let alone risk-based forecasting models that allow decision makers to compare options, forecast outcomes, and compare results to the forecasts.
To get to that state, where ERM is a common discipline that every employee can use for enlightened decision-making, new Data Risk standards are needed, to make ERM simpler, cheaper, and more systemically repeatable, and that is another contribution this Council can make. We will next meet on June 26th at the Federal Reserve in Washington, DC to explore that opportunity in depth.
What was evident at this meeting is that Data Governance challenges have changed in three years. We are still at the cusp of changes in the way modern, post-industrial, organizations are governed. Even the most mature members of the Data Governance Council have not substantially changed the way their organizations perform decision-making. It is still top-down, barely delegated, with little or no trust extending from the top to the bottom of an organization. Many governance bodies or teams have little or no direct decision-making authority - neither funding mandates nor project veto powers. The light of information still shines brightest from the bottom-up, with those at the top getting the best view of the light and those at the bottom simply blinded by it.
We need new models of organizational governance, new data standards in ERM, and renewed investment in risk-based decision making at all enterprise levels. This remains the challenge of Data Governance in the early adopter market evolution.
While academics contort over the rise of successful bank lobbying on Capitol Hill, Jack Reed has introduced the Rating Accountability and Transparency Enhancement (RATE) Act of 2009, which "would provide new oversight and transparency rules for Credit Rating Agencies." This is a serious bill with excellent ideas that will do more to correct one area of abuse in the credit crisis than many other current proposals. Credit Rating should be transparent so that market participants can validate rating methods and the SEC can provide oversight and audit over problems and failures.
RATE includes further strengthening of existing regulatory structures, with new authorities provided to the SEC. But the important component here is new rating disclosure requirements which would make the methods credit rating agencies use to rate bonds, MBS, CDOs, and other derivatives transparent and auditable. I also like the proposal for a new independent Compliance Officer, which is a power long overdue in ALL corporations.

SUMMARY: The Rating Accountability and Transparency Enhancement (RATE) Act of 2009 (http://reed.senate.gov/newsroom/details.cfm?id=313172) strengthens the Securities and Exchange Commission's (SEC) oversight of Nationally Recognized Statistical Rating Organizations (NRSROs) through enhanced disclosure and improved oversight of conflicts of interest, and makes credit rating firms more accountable through greater legal liability.

Accountability of NRSROs
• Makes NRSROs liable when it can be proved that they knowingly failed to review factual elements for determining a rating based on their methodology, or failed to reasonably verify that factual information.
• Requires the SEC to explore alternative means of NRSRO compensation, and requires a Government Accountability Office study on payment methods, in order to create incentives for greater accuracy.
• Establishes an office in the SEC to coordinate activities for regulating NRSROs.
• Requires the SEC to ensure that NRSRO methodologies follow internal NRSRO guidelines and requirements for accuracy and freedom from conflicts of interest.

Due Diligence Certification
• Requires certification if due diligence services are used to ensure that appropriate and comprehensive information was received by the NRSRO for an accurate rating.
• Requires NRSROs to notify users when model or methodology changes occur that could impact the rating, and to apply the changes to the rating.
• Requires the SEC to establish a form for NRSROs to provide disclosures on ratings, including methodological assumptions, fees collected from the issuer, and factors that could change the rating.
• Requires NRSROs to provide rating performance information, such as information on the frequency of rating changes over time.

Conflicts of Interest
• Requires NRSROs to have an independent compliance officer to manage conflicts of interest and independently review policies and procedures governing ratings so they are free from conflicts.
• Requires the SEC to regularly review NRSRO conflict of interest guidelines.
• Includes a look-back provision requiring that if an NRSRO employee later becomes employed by an issuer, the NRSRO must review any ratings that the employee participated in over the previous year to identify and remedy any conflicts of interest; and provides for SEC reviews of NRSRO look-back policies and their implementation.
I see this bill as another indication that financial regulatory reform will fix underlaps and gaps in existing authority rather than build a new systemic risk regulatory institution.
Every year, I teach a course at the Bucerius Law School in Hamburg, Germany. The Bucerius is the only private law school in Germany, and together with the WHU - the only private business school in Germany - we offer a Masters in International Business and Law, and it's in this program that I teach.
Each year we have 55 students from around the world who participate in an intensive program for graduated lawyers and MBAs. They take six modules of courses and do a spring internship at a company in Europe or North America. I taught at NYU for a couple of years, but enjoy the international atmosphere at Bucerius better. It's a fantastic program.
Teaching is a bi-directional activity. My classes are always in workshop form. I bring some expertise to the class, and my students challenge my ideas and together we learn.
This year was special. In past years, the course was called "Data Governance." In January, I changed the title to "Smart Governance," and the change was more than cosmetic. The material was entirely new. I fashioned the course around a book called Smart Governance, written by Helmut Willke. Helmut writes some very important ideas about how the knowledge society is changing national sovereignty and the rise of expert NGOs. Helmut's definition of Governance is one that I often quote: "The communication activity of coordinating human beings to achieve common goals through collaboration." It's a brilliant sentence that succinctly captures so much. I extrapolated many of his theoretical ideas into what I think is a more practical Governance System that is the core of the Six Steps to Smart Governance. I used concrete examples of each step in class to contrast with the more theoretical use cases in the book. And I had the students prepare homework assignments geared around individual chapters in the book, asking them to write about their own experiences in this context.
This worked up to a point for the first two days. There were 12 students in the class meeting in the second cold week in January. Hamburg was shrouded in snow. The reading material is dense and difficult, but the class lectures were fun and interactive. On the third day of class, which was a Saturday morning, we were talking about the difference between a Governance System and the policies one would wish to implement. A Governance System is a scientific process by which people try to set goals, define measurement metrics, make policy decisions, communicate the policies, audit the outcomes, and seek to continuously improve.
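To make the idea concrete in code rather than metaphor, that cycle can be sketched as a simple loop. This is purely my own illustration - the function, variable names, and numbers are hypothetical, not anything we used in class:

```python
# Illustrative sketch of a Governance System as a closed loop:
# set a goal, measure, make a policy decision, record (audit) the
# outcome, and continuously adjust the goal based on what happened.

def governance_cycle(goal, actual_results, audit_log):
    """One pass of the loop: compare outcomes to the goal and adapt."""
    measured = sum(actual_results) / len(actual_results)  # define a metric
    policy = "reduce" if measured > goal else "hold"      # make a policy decision
    audit_log.append((goal, measured, policy))            # audit the outcome
    # continuous improvement: move the goal toward what was achieved
    new_goal = (goal + measured) / 2
    return new_goal, policy

log = []
goal = 100.0
for results in [[120, 130], [110, 105], [95, 90]]:
    goal, policy = governance_cycle(goal, results, log)
```

The point of the sketch is the shape, not the arithmetic: policy is an output of measurement against goals, and the audit trail feeds the next iteration.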
Even with my best examples, the students found this concept of a Governance System hard to grasp. I searched for metaphors and examples, and could find nothing familiar they could latch on to. So I turned the example inward and transformed the class relationship from teacher/student into a Governance Council and asked the students to help each other to govern the course structure for the rest of the class.
We started with the grading process and the work assignments. I had planned on giving homework every night, for four nights, each being worth 20% of the grade and a short final paper also worth 20%. Students don't like homework, and frankly teachers don't like grading that much either. We had already had two assignments, each worth 20%, and I opened the discussion up as to what work to complete and how much each should be worth for the rest of class. I was a bit nervous about this surrender of power, but what ensued next was absolutely magical.
We discussed various options for about an hour, and in the end the students agreed to do no more homework assignments, a final paper worth 40%, and a participation grade worth 20%. It was not a radical solution, but the students had chosen it. It was their grading structure, and they felt empowered and energized that they made the decision. For me, it was a terrific solution as well because with the students feeling like they owned the grading structure all the teacher/class power structure and tension evaporated. We were now peers collaborating on a common goal.
Next came a discussion on what to write about for the final paper. I had in past classes asked students to prepare papers describing hypothetical governance models for banks or other commercial entities. But in this class, the idea that students could shape their grading structure quickly transformed the discussion into one about how the students would like to also participate more effectively in the governance of the Bucerius itself.
On the first day of class, I had taken the class list and befriended each of the students on Facebook. At the time I was just learning about Facebook myself, and thought it would be interesting to link up with the students directly. I knew some high school teachers who did the same with their students in the USA and wanted to see what it would be like to do it myself. The Bucerius students were amazed that I had done this, and not only quickly befriended me but also told all their peers that Mr. Adler was on Facebook. What I saw over the next two days was that I was the only Professor or Lecturer at Bucerius who was on Facebook, though almost all the students were using Facebook for primary communication.
Bucerius, like a lot of universities worldwide, uses an IT system called CampusNet, which provides online classrooms where lecturers can deposit class materials and students can discuss them. It is slow, cumbersome, and hierarchical. The Bucerius students were using Facebook instead to organize themselves and discuss their classes. They did this for two reasons: 1) Facebook is democratic in structure, and 2) it's fast, easy to use, and everyone is on it.
This proved to be a great example of one of Helmut Willke's central ideas - how hierarchical systems produce apathy from below. Given an opportunity, people will almost always gravitate towards more democratic communication systems in which they can openly express their ideas and communicate with anyone whenever they choose. This is a very important consequence of the Information Revolution. People, all over the world, are using Facebook to communicate across gender, race, and international borders, about important ideas. Their participation is not dictated by hierarchy, government, or inherited status. They are a market of communication and ideas, and their ability to self-coordinate and collaborate has very important consequences in the world.
This new form of communication - Facebook - became the focus of our class discussion on what to write about in the Final Report. Students wanted to influence how the Bucerius was governed. They lamented the fact that no other lecturers were on Facebook, that most lecturers in fact had not attended any of the classes of their peers. They wanted better IT support, to retire Blackboard. But most importantly, the students did not feel consulted regarding their needs and their relationship to the University. And this was the most fascinating part. Their experience on Facebook, interacting in a marketplace of ideas, had changed their expectation of their own role as students at the University. They saw themselves not as recipients of education from learned university professors. Rather they saw themselves as skilled professional peers in a bi-directional relationship of learning and they wanted a voice at the table.
My students wrote four excellent papers, describing new Governance System proposals for the Bucerius Law School. These are important ideas. They illustrate changing expectations in previously hierarchical relationships. Every University and every Business around the world should take these ideas very seriously. They are the future.
I have permission to publish one of them, under a Creative Commons copyright, and am doing so with full attribution. This paper comes from:
Johannes Becher, Germany
Courtney Fischer, USA
Zitto Kabwe, Tanzania
I encourage everyone to read it: Governance System for Bucerius Law School.
They wrote a research paper. In it they proposed a system of governance that would take into account the measured needs of students each year and articulate those needs as policy. Will it change anything at Bucerius in terms of official Governance models? Perhaps not.
But what was written about is change that has already happened. Information technology like Facebook is providing an alternative mechanism for people to coordinate their activities to achieve common goals through collaboration. This mechanism has security flaws, IP issues, and can lead to significant opportunities for abuse. And despite all that, people will use it and their use will force all of us to change ever more rapidly just to keep up.
It is that simple.
You have supply chains that deliver toys from manufacturers in China to sit under Christmas trees in Canada, oil and gas from Russia to factories and homes in Germany, and diamonds from mines in Namibia to jewelers in New York.
Real World supply chains keep the global Industrial Economy running.
Alongside, you have Information Supply Chains that deliver crop yields to traders on the Chicago Mercantile Exchange, raw video footage from journalists in Afghanistan to news desks in London, Paris, and Atlanta, and sales performance reports from branch offices in Omaha to main offices in Arkansas.
Around the world, Information Supply Chains drive the Knowledge Economy.
They need to be Smart - Instrumented, Monitored, Measured, and Coordinated. And we need to be aware of how they are designed, what flows through them, and how we can improve them.
Without awareness, Governance itself can never be very Smart. It is that simple.
The two most historically important developments of the last two decades are the growth of global markets and the speed of information technology development. Markets and IT are transforming the world at a faster rate than any other developments in human history. And they are also challenging Governance Models in ways that are equally profound. Kings and Crowds fought it out politically at the dawn of the 20th Century when the ancient Russian, Chinese, Austrian, and Ottoman empires fell, and they are battling commercially today in many markets in which Crowds are winning again.
1. Product Development
I'm an audiophile. I buy expensive audio equipment in the hope of reproducing an emotional connection with music in my home that people feel when they attend a live concert. Being on a limited budget, I'm also a cheap audiophile. I like the best product for the lowest cost, which is one reason I applaud globalization. Over the last decade, high quality, low cost audiophile equipment has been coming out of China that rivals the best high cost gear manufactured in North America and Europe. Some companies have set up local design and Chinese manufacturing with online distribution that brings incredible bargains to mainstream US and European consumers. Two such companies are Oppo Digital and Emotiva.
Oppo makes DVD and Blu-ray players that are designed in San Francisco and manufactured in China. I've owned their products for several years and am always impressed with their price/performance ratio. But now I'm even more impressed with their product development process. In 2008, they announced the development of a new Blu-ray player, the BDP-83. These days, consumer electronics are more like computers than audio equipment, with complex Digital Signal Processors, graphics chips, and CPUs interacting in intricate designs. Oppo knew product development would be difficult and testing even more so. With the complexity of hardware and software in one appliance, it is really difficult for a small team of product designers and marketing professionals in San Francisco to test against every potential usage scenario. And when manufacturing is outsourced to China, it is even harder. Distance, language, and culture create barriers that make communication a new challenge.
In this environment, Oppo decided to outsource product testing to its customers by using a Crowdsourcing solution. Several hundred customers received pre-production units of the Blu-ray player and tested it in their homes. Their product feedback went to the design team, who translated feedback into design changes for the manufacturer. The Crowd was given the option to vote on final product readiness. The first vote sent the product back for more changes and fixes in late 2008, and the second vote in Spring 2009 released it for GA in June.
I bought the product in July 2009 and it is superb. I contrast this to Emotiva, which is also a small design team, based in Tennessee, that manufactures in China. They make outstanding AV amplifiers, speakers, processors, and other equipment. In 2007, Emotiva announced a new AV processor, the UMC-1, for delivery in 2008. That slipped to early 2009, when it was announced that the product would ship in June. In July, the company announced it had discovered bugs in the production units from China and would need a couple of months to fix them. By October, more than a couple of months had gone by and customers were fuming on the company's forums about the delays and the poor communication. In November, the company announced it would begin shipping to the pre-order list and many customers anticipated units before Thanksgiving. By early December, no units had shipped and the company had to start censoring its Forum because customer rants were getting abusive. The Emotiva CEO promised some customers would receive their units by Christmas, and when that didn't materialize many Forum members started talking about buying alternatives.
Last week, Emotiva finally began shipping a handful of units to pre-order customers without manuals. The first reviews appeared over the weekend and talked about stunning video quality but also a few audio and connectivity glitches. The CEO posted a very nice note on the Forum describing the company's pride in the product but also that a firmware release would soon be forthcoming.
So what this company did was use its customers for an unannounced Beta Testing program. They shipped their product very late to market, after a year of inconsistent market communication, with bugs they were probably aware of but couldn't fix without suffering more brand damage.
Contrast the two companies. Oppo used a market based Crowdsourcing mechanism to recruit customers to beta test the new product. The customers who participated in the testing provided open feedback which was visible to all members of the company forum. They fixed bugs quickly and used customers to determine when the product was ready for shipment. That process created customer loyalty and ensured a bug-free product that shipped only six months late. Emotiva used a hierarchical mechanism of in-house testing and opaque customer communication to ship a product more than 18 months late and filled with bugs that alienated customers and reduced brand loyalty.
Some people might say these companies have different approaches to product development or customer service. I abstract these situations as examples of governance models in complex social systems. Oppo used a market-based governance (coordination and cooperation) model and succeeded in satisfying the needs and interests of its market participants. Needs and wants are at a primary market level; feedback information about the product is at a secondary market level. Emotiva used a hierarchical governance model (command and control) and failed to satisfy secondary market interests in information and primary market needs for products.
This doesn't mean that market mechanisms always trump hierarchical control. But when a small number of people are trying to govern complex systems for consistent outcomes, a market-based model can be more efficient and produce better results.
2. Cost Containment.
My boss sent me a note over the weekend reminding me to use our AT&T Calling Card from land lines when I am travelling abroad. It seems my cell phone bill in November was higher than the accounting police think necessary. All calls above $100 qualify for an immediate audit. It's not clear from my bill if any of my calls were or could be audited, but my boss, who is altogether a terrific guy, wants me to avoid that root canal and work smart abroad. Being a Governance Guy, I do have to question the intelligence of a governance system that controls costs through managerial oversight of cell phone bills and automatic audits for $100 calls.
If there is already a trigger for automatic audit at the $100 per call threshold then someone has already noticed a pattern of calls that exceed $100. That kind of pattern calls for a policy change, but automatic audits require a fair degree of manual labor - both from my boss and the auditors. Wouldn't it be far Smarter to develop policies that cause the cell phone users themselves to police their own usage by giving them alternative means to reduce costs?
Some might argue that the warning note from my boss is a policy tool being used to change my behavior. But because the billing system is deliberately opaque in IBM, it isn't possible for me to evaluate the impact of each of my calls on the overall phone bill I incur each month. I can't see the incremental impact of my decisions as Risks to The System as a whole.
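To make the contrast concrete, here is a small illustrative sketch. Only the $100 threshold comes from my actual bill; the call amounts and function names are hypothetical, invented for this example:

```python
# Two hypothetical cost-containment policies, side by side.
# Policy A (hierarchical): flag any call over a threshold for manual
# audit, creating work for managers and auditors after the fact.
# Policy B (transparent): show the caller the running impact of each
# call, so users can police their own usage as costs accumulate.

AUDIT_THRESHOLD = 100.0  # dollars per call, per the bill described above

def flag_for_audit(calls):
    """Policy A: return the calls that trigger a manual audit."""
    return [cost for cost in calls if cost > AUDIT_THRESHOLD]

def running_impact(calls):
    """Policy B: cumulative cost visible to the caller after each call."""
    total, impact = 0.0, []
    for cost in calls:
        total += cost
        impact.append(total)
    return impact

november = [12.50, 104.00, 38.75, 210.30]   # made-up call costs
audits = flag_for_audit(november)            # two calls go to the auditors
visible = running_impact(november)           # the caller watches costs grow
```

Policy A produces audit work only after the money is spent; Policy B gives the decision maker the risk information at the moment of decision, which is the whole argument of this post.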
A more intelligent approach to cost containment in this case would be to toss the issue out to the Crowd of cell phone users in IBM and get them to come up with ideas to mitigate costs for each user. That process would include users in the decision-making process, getting them to brainstorm ways to reduce costs instead of treating them like cost creators in a hierarchical model to impose control.
Like, shouldn't IBM have Skype strategy for global travelers who make calls in cars and trains so that productivity isn't imperiled while costs are contained?
Crowds and Kings. What do you think? Post a comment and let me know.
Please join us for an international crowdsourcing experience!
In May 2006, the IBM Data Governance Council used poster board and sticky notes in an oak paneled room in the Chateau Frontenac in Quebec City to create the categories, elements, and levels in the first version of the Maturity Model. About 35 people participated in that process in Quebec, and perhaps another 50 more in subsequent meetings.
On September 14-16 2010, the Council will use social networking crowdsourcing technology to include a global community in a discussion about the Maturity Model - Live!
Suggestions and comments from practitioners all around the world will be relayed to the participants in the room.
Of course, this venue is awesome, and there is no substitute for live, face to face, communication. But if you can't travel to Tamaya, and spend three fabulous days with The Council in the Desert, you can still tune into the action by going to infogovcommunity.com.
In the room or in Rangoon, you can watch the ideas flow and chime in live or tune in later and add your views.
Either way, what you contribute will impact the community and change the Maturity Model. Synchronous or Asynchronous, this meeting is the beginning of a global dialog on Data Governance Maturity.
What we do in the room will make a difference. And what you contribute from your own room will make a difference.
Please join us in Tamaya or online at www.infogovcommunity.com to capture the best ideas from the Global Information Governance Community, contributed for the Community and published in an open-sourced IBM Data Governance Council Maturity Model.
This is how we innovate!
Steven B. Adler
IBM Data Governance Council
On Tuesday, I gave a keynote presentation at SIMposium 2010 in Atlanta, Georgia. It was on the last day of a conference at 8:15am. On the best of days, I'm not a great morning person. The last day of a conference is not normally the best of days for a presentation. Normally, at least half the participants are in taxis on the way to the airport and the other half are often exhausted from the content and discussions on the earlier days. When I was first asked to speak, I was not inclined to do it. Keynote or not, 8:15 on the last day felt like a bad proposition.
I could not have been more wrong. First, the room, a huge ballroom, was full, with about 300 people. Second, they were awake, animated, and fantastic to talk to. We had a great conversation together, and I completely enjoyed the interaction.
Third, they were not the normal Data Governance crowd. In fact, when I asked at the start of my presentation how many had Data Governance programs, not one hand went up. This is the kind of group I love talking to, and they are the ones we most need to reach.
SIMposium, thank you for an excellent experience. Many have since requested my presentation and here it is in Flash format. Just click on the link below and it will launch in your browser.
SIMposium 2010: Change is Not Just a Word
This morning, General Motors announced that it would no longer advertise its cars on Facebook. This announcement comes a day before the Facebook IPO, and casts a shadow on the business model of Facebook. GM said it would continue to support its page and user community on Facebook, but that ads just weren't effective in helping consumers make car-buying decisions. Ford jumped on this announcement to say it would continue to buy ads on Facebook and that Social Media requires a consistent commitment to innovation and community development.
Maybe. But I think GM's decision does illustrate a key problem for Facebook and Twitter - the revenue model. Social Media grew up without dependencies on ad-based revenue. On Facebook, you aren't a customer. You are a product, and it's your likes, dislikes, friends, photos, videos, and content that generate value. Selling products to products via advertising is hard. Members don't use Social Media to go shopping. There's no commerce platform there. They use it to be social. There are so many other outlets that are more effective for advertising than Social Media.
So how should Facebook and Twitter make money? My idea: make it collective. The value is in the data.
1. Make it explicit in the terms and conditions that every member owns their own data via copyright. This does two positive things.
A. It indemnifies Facebook and Twitter for the crazy, infringing, and potentially libelous posts of their members by allowing them to claim that they are conduits of content rather than publishers or distributors.
B. Copyright establishes the rights to royalties for content created and posted on their networks, which enables the next step.
2. Allow members to opt-in to Big Data analysis by Social Media partners and intermediaries.
3. Charge Social Media for Big Data Searches by data volume.
4. Pay members royalties every time their data is used in Big Data Searches.
This simple model creates powerful incentives that transform user members from products into mutual social network content providers with an economic interest in posting content that will be used in Big Data searches. It establishes data property rights that insulate Facebook and Twitter from vouching for the content on their networks. Members will also discover that providing high-quality data that companies want to search for means more royalties, and so the system will produce better behaviors. And it creates a 2-tier royalty distribution model that will also pay Facebook and Twitter handsome revenue that will change online advertising and make every other content aggregator change too.
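The royalty mechanics in steps 2 through 4 are easy to sketch. Here is a toy settlement calculation in Python; the per-record price, the 60/40 royalty split, and the member names are all made-up assumptions for illustration, since I haven't proposed specific rates:

```python
from collections import defaultdict

# Hypothetical rates -- nothing above specifies actual pricing.
PRICE_PER_RECORD = 0.01     # what a Big Data partner pays per record searched
MEMBER_ROYALTY_SHARE = 0.6  # fraction of each fee paid out to the content owner

def settle_search(records_used):
    """Settle one Big Data search.

    records_used maps member_id -> number of that member's records
    included in the search results.
    Returns (platform_revenue, payouts_per_member).
    """
    payouts = defaultdict(float)
    platform_revenue = 0.0
    for member, count in records_used.items():
        fee = count * PRICE_PER_RECORD                        # charged to the searching partner
        payouts[member] += fee * MEMBER_ROYALTY_SHARE         # member's copyright royalty
        platform_revenue += fee * (1 - MEMBER_ROYALTY_SHARE)  # platform's cut
    return platform_revenue, dict(payouts)

# One search that touched 1,000 of Alice's records and 500 of Bob's
revenue, payouts = settle_search({"alice": 1000, "bob": 500})
```

The point of the sketch is the incentive structure: the more searchable records a member contributes, the larger that member's payout, and the platform earns its margin on every record it sells access to.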
Of course, Facebook and Twitter will have to sort out who's a person and who's a bot, and will have to provide content creation tutorials to help users/customers create content that has value by sharing the top 100 Big Data queries and sample results.
But this Business Model has something for everyone and is a true win:win. It benefits customers by establishing data property rights and royalties for content. It benefits organizations who want to do Big Data searches by providing ever richer data streams of high quality and availability. And it benefits Facebook, Twitter, and their investors by providing an enormous profit making engine selling Data.
The Data is the Value. The more there is, the more valuable it becomes. Pay your customers to create higher quality data and charge your partners to use it. It's a simple Business Model.
Dick Costolo - @dickc - and Mark Zuckerberg - @finkd - are you listening?
Data=Information=Knowledge. Or so we would like to say. In theory, information is organized data, and knowledge is information put to use by human beings. But theories are for academics. And this theory is super convenient if semantic consistency is important. There are Data Architects who only think about data in databases, Information and Content Architects who only work with unstructured repositories, and even Knowledge Architects who I suppose work with information taken out of human brains and put into... structured or unstructured repositories on computers...
In real life, in real companies, these are artificial distinctions. Organizations want to control data/information supply chains because they are full of quality control problems, security vulnerabilities, compliance challenges, and operational exposures. Those risks imperil decision-making, increase operational costs, and reduce revenue opportunities. Quality control and risk mitigation are challenges for every data type.
Five years ago, "Data Governance" seemed like a great name for a new discipline to help transform organizational behavior from vertical to horizontal; because information is transformational. What we meant then and mean now is not just about "Data" in the purest structured sense. We mean Data in the most plural and unlimited sense. People want to govern other people's use of all kinds of information in every form.
No data stovepipes please! We need Data Governance Solutions for all human uses of information regardless of their form or structure, use or abuse.
Anyone who tells you different is just so 20th Century...
Two years ago, I met Helmut Willke, the author of Smart Governance: Governing the Global Knowledge Society, at a hotel cafe near the great cathedral of Cologne. Professor
Willke is a sociologist who teaches Global Governance at the Zeppelin
University in Friedrichshafen, Germany. Late in 2009 I became
interested in Governance as a system of decision-making and Professor
Willke had written an excellent book exploring this topic. While the
Professor is German, he writes extremely well in English, and his book is very well written and insightful. Like a lot of philosophical texts, it
is not an easy read. Dense descriptions, long sentences, and theory
backed by ample example make it a book you have to read at least twice
to fully comprehend.
I was in Cologne in late February 2010 to meet the CIO of the City and attend Rosenmontag at City Hall. I had already seen several days of Karneval, with the endless parades, costumes
and candy strewn about the streets. For five or six days in February,
the staid and reserved city of Cologne becomes an endless drunken party
attracting visitors from all over the world who wear outrageous costumes
and march in parades on incredible floats and throw candy to the
bystanders. It's unlike any parade I have ever seen. Quite amazing.
It had snowed a lot that year. It was white from Brussels to Berlin,
and Cologne was still covered by eight inches. The square in front of
the Dom was clear, and I had spent the morning before our meeting
visiting the Roman museum across the square. Cologne is an ancient
Roman city and the ruins are collected in a fantastic museum right next
to the Dom. Of course there are columns and pediments, but also beautiful mosaic floors, jewellery, stained glass,
and decorative arts. There is a model of the Roman city and you can
see how the Germans built the city on the same street grid with walls
built on top of the Roman walls. Of course, much of this was destroyed
by allied bombs in WWII, but some remnants remain.
Looking back at Roman colonial rule of Cologne was an excellent introduction to the systemic ideas of Governance that Professor Willke and I discussed over coffee that afternoon. He is not a tall man, mostly grey, late 50s I would say, with bright blue eyes. He makes an immediate
impression, and is passionate about his book. I had used the book as
text for a class I taught at the Bucerius Law School on Data Governance
in Hamburg that January. My students did not entirely appreciate the
dense prose and abstract ideas, but through class conversation we did
ultimately appreciate the idea that Governance is a system of
decision-making that could be described and modelled. And we used
Social Networking metaphors to explore the idea of policy-making, human
behaviours in a system of Governance, and how to model potential
outcomes. Of course there is political science, which describes
political models of Governance – Democracy, Dictatorship, Monarchy, etc –
but what is unique and important about Professor Willke’s book is the
application of systems theory to Governance.
We had some coffee and talked mostly about how the Professor wrote
the book and why. As I had in 2007-8, the Professor had used the Global
Credit Crisis as a use case to describe failures in Governance. I had
covered this topic from a Data Governance perspective, arguing that
hundreds of incremental failures in business processes and data quality
had produced a domino effect that plunged the global economy into
Depression. He covered the topic from a decision-making perspective,
and while we approached this topic from different directions we arrived
at similar conclusions – policy-makers can’t possibly make the best
decisions without understanding the consequences of those decisions on
incredibly complex and interconnected global systems. And those
consequences are impossible to understand without new information
systems that render the complexity with software and illustrate how the
policies will be accepted and resisted.
In my class at Bucerius, my students complained that the Professor
had not done enough to provide solutions to the problems he had
identified, or that his solutions were too abstract. I presented these
criticisms to him at our meeting and he responded that it was not
possible to offer concrete solutions because every systemic problem
needs to be modelled to understand the variables and outcomes – that
there is no one size fits all. At the time, I thought this was a
dodge. It took me a few more years to understand that he was right.
There are no Governance Solutions that can auto-magically produce the
best outcomes for every decision. But it is possible for policy-makers
to use systems theory and software to construct decision-making models
that can plot many of the actors, objects, variables, and potential
outcomes to understand the impact of policies on complex systems made up
of hundreds, thousands, and even millions of human beings with unique interests.
After my course, I synthesised concepts from the book with ideas from my students to create the Six Steps to Smart Governance.
It’s not meant to be a Framework. Frameworks and models are nice tools
to help people feel more secure about challenges they seek to overcome,
but they are not useful in making better decisions. The Six Steps are
meant to be a structure for decision-making that one would apply
iteratively; in which each of the six steps would involve different data
points and variables. Of course, it is highly summarised, flavoured
with marketing. And I would say in hindsight, it's not really useful as a
practical or operational tool. It’s really just a theory, a
simplification of the better documented ideas Professor Willke writes
about in his book.
And I think we can do better. In the IBM Data Governance Council we
will soon begin to explore dynamic simulation models that go far beyond
the Six Steps to Smart Governance, and I recommend reading both the white paper and Professor Willke’s book:
Smart Governance: Governing the Global Knowledge Society
Today, thanks to really powerful simulation software, we can create
dynamic models that help demonstrate the impact of policy on people,
processes, and technology. The Data Governance Simulation Project will
revolutionise the field of Data Governance by applying theory, software,
and observed practices to an interactive model that will yield powerful
insights into Data Governance Value Creation and Risk Mitigation.
A lot of people ask me, “how do I show the value of metadata?” Some
say, “how do I make the business case for Data Governance?” Consultants
and Gurus will have a framework or process to offer you, a get started
guide with use-case examples, graphics, and legends about their
successes. But these myths won’t help you, because your challenges are
unique. Your politics are special, and your people are not machines.
Best practices are useful examples of glorified solutions that are very
hard to replicate outside the lab. And as many are already finding out,
people resist policies they don't think apply to them and it's really
tricky to understand how to change organisational behaviours on an
on-going basis without policies that dynamically change with new requirements.
Data Governance is, by nature, a systemic challenge and you can’t
solve systemic problems without systemic solutions. Projects and teams
that expect quick hits and 90-day results are the reason you have systemic
Data Governance problems in the first place. But it is possible to
create software models that allow you to plot the goals, metrics,
policies, communications, outcomes, variables, and modifiers and
evaluate the impact of new policies and controls on your environment.
And that’s the lesson of Smart Governance: you can model complex
environments through Simulation and make better decisions. To learn
more about using Simulations to make better decisions, take a look at
the IBM Smarter Cities Demo.
In that demo, the complex interactions of human beings living in a city
are compared to the goals of human policies, the metrics measured by
interactions, and potential outcomes.
Many of our organisations are as complex as small cities. Policy and
Politics share the same ancient Greek root word – polis. The polis is a city, which itself is an aggregation of human beings who require
Governance to arbitrate their diverse interests and achieve better
outcomes for all. Today, we can simulate those interactions and help
Policy makers profile the impact of their policies before they are
deployed. It's a kind of Visual Risk Calculation.
If you would like to participate in the Data Governance Simulation
project, please read the Six Steps to Smart Governance White Paper, the book
by Professor Willke, and join the IBM Data Governance Council by executing this membership agreement.
Only members of the Council will be able to participate in this
exercise and you don’t want to miss this because it will fundamentally
change Data Governance.
Amazon has some Information Governance problems.
A week ago, I placed a large order of Nerf Guns that Amazon keeps refusing to process. My kids love these things and I guess some adults I know kind of like them too. We're all heading out to my sister's house in Point Reyes for Christmas this year and a combined Family Reunion. Both my sisters will be there with 7 kids in a medium-sized house for four days and the best we could all come up with to keep them occupied was felt-warfare among the tall grasses of the Inverness wetlands.
If only Amazon would cooperate.
I have no desire to carry ten Nerf weapons on trans-continental jets. I can see explaining to turgid DHS officials why a family of four needs automatic Nerf cannons with heat-seeking velcro missiles. So, I prefer to order them online and let FedEx make the arms shipments discreetly.
But my order is stuck in Amazon credit card limbo. It seems that the last time I bought something and shipped it to my sister instead of my home address I used a credit card which expired in May. Problem is, Amazon somehow associates that credit card with my sister's mailing address. I've deleted it in my online account, and I buy things from them all the time with the current card, but Amazon hasn't purged this relationship.
From an Information Governance perspective, what kind of problem is this? It is of course a Data Quality issue, but normal DQ tools might have a hard time with rule matching in this case. My gut is that Amazon just doesn't sweep and purge their accounts for outdated credit cards. It's pretty frustrating as a consumer, especially during these busy days. Some records management would solve that problem, but by now the point is moot for me. I just don't have the time or patience to bother fixing their sloppy Information Governance issues.
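The sweep-and-purge fix is simple in principle. Here is a minimal sketch in Python; the record layout, card numbers, and dates are invented for illustration, since I obviously don't know Amazon's real schema:

```python
from datetime import date

# Hypothetical account records: each stored card carries its expiry
# and the shipping addresses it was once associated with.
cards = [
    {"last4": "1234", "expires": date(2009, 5, 31), "addresses": ["sister"]},
    {"last4": "5678", "expires": date(2011, 12, 31), "addresses": ["home"]},
]

def sweep_expired(cards, today):
    """Records management sweep: drop card/address associations
    whose card has already expired."""
    return [c for c in cards if c["expires"] >= today]

cards = sweep_expired(cards, today=date(2009, 12, 1))
# Only the current card (and its address association) survives the sweep.
```

Run periodically, a sweep like this would have purged the stale card-to-address association before it could block a new order.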
Fortunately, Walmart sells Nerf Guns too...
Governance is a communication system for measuring complex needs, articulating a systemic response in Policy, and enforcing that policy. When I say it is a system, I mean that it is a social system abstracted from the people and psychologies that perform various Governance tasks. The people may come and go, but the system remains largely the same.
Smart Governance includes some additional dimensions that make the system evolutionary as well:
1. Dynamic methods for collecting and analyzing needs in an organization or society
2. Hierarchical, Market-based, or hybrid political models for integrating diverse points of view into the policy-making process
3. Diverse communication tools for integrating policies into a variety of business, IT, and social processes
4. Methods to measure policy outcomes, compare them to original needs, and re-define policy to meet new requirements
5. Solutions to measure systemic risks, capture mistakes and losses, and enhance organizational intelligence and Knowledge as a Shared Resource through constant systemic improvement.
The goal of the System is to meet the needs of The Customer, without regard to governing ideology, personal psychology, or vested systemic interests, as well as to continuously diagnose deficiencies in the Smart Governance System, collect organizational knowledge, and improve over time.
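Point 4 is the heart of the loop, and it can be sketched in a few lines of Python. This is a toy model, not a governance tool; the 80% policy effectiveness and the half-gap correction are arbitrary assumptions chosen only to show the measure-compare-redefine iteration converging:

```python
def govern(need, policy, measure_outcome, adjust, rounds=10):
    """Minimal sketch of the feedback loop in point 4: apply a policy,
    measure its outcome against the original need, and redefine it.

    measure_outcome(policy) -> observed result of the policy
    adjust(policy, gap)     -> redefined policy for the next round
    """
    history = []
    for _ in range(rounds):
        outcome = measure_outcome(policy)
        gap = need - outcome              # how far the outcome misses the need
        history.append((policy, outcome, gap))
        if abs(gap) < 1e-3:               # needs are met; stop redefining
            break
        policy = adjust(policy, gap)      # point 4: redefine the policy
    return policy, history

# Toy example: outcomes track the policy at 80% effectiveness,
# and each round we correct the policy by half the remaining gap.
policy, history = govern(
    need=100.0,
    policy=50.0,
    measure_outcome=lambda p: 0.8 * p,
    adjust=lambda p, gap: p + 0.5 * gap,
)
```

Each pass through the loop shrinks the gap between need and outcome, which is exactly what makes the system evolutionary rather than static.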
Smart Governance is a challenge for human as well as IT systems. We all understand politics. Few understand Governance Theory as a Sociological System abstracted from psychological and political practices. In this way, Governance Theory is a communication science more similar to computer science in its architecture and schematics.
The history of Governance is entwined with the history of governing ideology. But ideologies impose systemic order without regard to evolving customer needs or changing collective goals. In a Knowledge Society, governing system ideology is secondary to customer needs and collective goals. The constant pressure to improve outcomes in a globally competitive world, makes ideology a tool rather than a purpose of governance systems.
Hierarchical, Authoritarian, Democratic, Socialistic, and Market-based governing ideologies are all potentially useful systemic policy-making tools in different governing contexts when meeting customer and collective needs with systemic policy response. These ideologies, as communication methodologies, can be used interchangeably depending on the governing policy requirements.
In 1992, Francis Fukuyama wrote a famous book called The End of History and the Last Man, in which he forecast that the end of the Cold War would see Western Liberal Democracy become the predominant world governing system and that ideological struggle as a function of historical definition was dead. From a Governance Theory perspective, this thesis is hopelessly simplistic. Governing Ideology will cease to be a definition of history when companies, nation states and trans-national organizations liberate themselves from the confines of singular governing ideologies and tailor governing systemic tools (ideological communication instances) to meet ever changing policy needs of customer requirement and collective goals.
I'm writing this blog entry in my hotel room on the 14th floor of the Grand Hyatt in Jakarta, Indonesia. Traffic screams by the massive fountain circle outside in a constant torrent of horns. I've been here all of two days. Met a customer in town this morning, and yesterday we drove three hours to meet a customer in Western Java. I've seen rice paddies, jungle, mountains, tea plantations, small villages and ways of life unchanged for centuries, glittering shopping malls with every brand available, fantastic office towers, and levels of luxury unembarrassed by poverty in every street. It is at once fascinatingly familiar and different at every corner.
This year, I've visited customers in Jakarta, Manila, Tampa, Columbus, Johannesburg, Dallas, Hamburg, Warsaw, San Francisco, New York, Brussels, and Cologne. And everywhere I go I hear the same stories, the same issues, the same needs.
Data Governance is a global market. Everyone is doing it.
Tomorrow I fly to Bangkok, where Red Shirts have held a government hostage for six weeks. On the edge of a knife, a nation split Red and Yellow, and I'm hosting a Data Governance Workshop for 2 dozen customers.
The market need is hotter than Red.
If your company doesn't have a program working today, it's a competitive disadvantage.
Don't wait. Just do it.
Data Governance Programs are popping up all over the globe. It isn't hard to get one started anymore. But it is hard to be good at it and to make it last. In fact, I see more programs taking one step forward and two steps back – narrowing focus to demonstrate results and fall in line with other IT projects – rather than charting a clear path towards larger transformation.
But let's be clear – Data Governance is about Business Transformation. We can't change organizational behavior to take data seriously if we can't change how we work.
We in the Data Governance Council have a vision that Data Governance is a coordination of people collaborating on common goals and purposes – to use data as an asset. That vision requires that piecemeal project management of data issues must evolve into systemic governance structures and methods, whose goals and purposes themselves transcend the people, applications, and interactions.
Until last year, we didn't fully know how to close the gap between where we are today and where we'd all like to go. But today we see the way forward, and the Data Governance Council is embarking on a bold new program to develop Predictive Governance: systemic ways of describing our world and modeling potential interactions to understand what works and how to improve it.
Traditional scientific analysis says that to understand a problem you have to take apart the issue and decompose it into all its components and sub-components and find the root cause.
But this assumes there is always just one root cause and one thing to blame:
“Data Quality in our branch operations is atrocious, so we have to fix our incentive structure.”
“Our network was hacked and our customer data was exposed, so fire the CISO.”
It's almost irresistible to search for scapegoats for common problems using simple cause-and-effect analysis.
People rarely ever imagine that
Individual data quality problems are symptomatic of larger systemic challenges in the information supply chains we have created over decades to handle information flows from source to target;
and no CEO expects that network hacks are the result of systemic weaknesses in IT systems that are themselves a reflection of organizational culture and priorities.
It's hard to accept that people created the systems that enable Poor Data Quality, Global Jurisdictional Jungles, Metadata misunderstanding, Lax Security, Privacy Invasions, and Big Data Mischief. No one deliberately creates these problems. No one wants them to continue. But they do continue nonetheless because people really don't understand the elements and interdependencies of the systems they have created.
The point of Predictive Governance is that we work in large ecosystems and we must work to understand them. If we can't describe our ecosystems, we can't rise above the superstitions and organizational behaviors that constantly hold us back.
This event will explore the ideas and methods behind Predictive Governance, new Enterprise Data Governance Solutions that integrate multiple business and IT domains, and Internet Jurisdiction and Multi-Stakeholder Governance in the context of global regulatory confusion as an archetype of Predictive Governance Challenges.
These are big problems and we are working on big solutions.
See the agenda. Read our blogs. Understand our mission. Be prepared to interact.
This is a thought leadership forum for change. Join us and make a difference.
This event is open to all who wish to join the IBM Data Governance Council. Register to attend here: http://dgcouncil.eventbrite.com/
Last week, I became a victim of toxic content. It can happen so fast, without warning. My sister, a trusted source, forwarded two photos that purported to show the Air France flight breaking in half before it fell from the sky into the Atlantic off the coast of Brazil. There was a caption that said the photos had been taken by a passenger, and while the camera had been destroyed in the crash the memory stick was recovered. Even the photographer's name had been discovered by tracing the serial number of the camera. One photo showed passengers with air masks on, a gaping hole in the mid section of the plane and the tail section falling away. The second photo showed a man being sucked out into the open hole.
They were immediately shocking photos, all the more so to me because two of my students from my Data Governance course at the Bucerius Law School died on that flight. Alexander Crolow and Julia Schmidt were two bright young students from Germany and Brazil who had traveled to Brazil to tell Julia's parents of their plan to marry and were returning to Germany that night to tell Alex's parents. An event like the Air France crash is transformative when you know someone who was on it.
But alas, the photos were fake. They were taken from the TV show Lost and sent around the world in an email. Bolivian TV even showed them on the air before they discovered the fakery. But by then the damage had been done. For so many people around the world wondering how their loved ones perished in that plane, the photos offered chilling illustration. We should have recognized the forgery at the outset, since the plane crashed at night and the photos showed bright daylight through the hole. But critical thinking disappears quickly when you are emotionally involved. And of course on the internet any trusted source can inadvertently be a conduit for toxic content. Thus knowing the source of your content is not enough to establish trusted information. You need to verify by corroborating the content with another source to establish veracity.
In the 21st Century everyone has to be a journalist.
ComplianceWeek covered the XBRL Risk Taxonomy Forum Meeting in NY last week with an excellent article enclosed here.
It is a longer article, but this is from the front page: Using XBRL to Attack Systemic Risk
By Todd Neff — April 7, 2009
Already hard at work making Securities and Exchange Commission filings interactive, XBRL technology now finds itself at the heart of plans to save the U.S. financial system from future calamity.
A group of risk-management leaders in the financial industry has begun studying how XBRL might bring clarity and transparency to the murky world of financial risks, much the same way Corporate America has just begun using XBRL to bring more clarity to financial statements.
While any such system is a long way off, proponents say the technology is tailor-made to help regulators (and investors) root out hidden threats to corporate balance sheets before they, well, break the bank. XBRL could, for example, let a regulator peer through a bad debt line item and see the individual loans feeding it; that task would take hours of spreadsheet diving today.
But XBRL could also do much more. Steven Adler, director of IBM Data Governance Solutions, says the computer language provides a standard vehicle for regulators to track not only weeks-old summary data, but also financial positions accruing across many banks and market segments. That would shed more light on systemic risks—which, left unchecked, can bring financial calamity of the sort we’re witnessing today.
Any potent XBRL-based scheme to report risks, however, would require the reporting of daily financial positions, a major shift in how trading firms, hedge funds, and investment banks do business. To that end, Adler’s IBM Data Governance Council is spearheading a movement that would change how investment banks and hedge funds interact with regulators.
“At this point, everybody is aware change is coming,” Adler says. “And parties would rather be in the room together talking about common solutions.”
A speech Federal Reserve Chairman Ben Bernanke delivered last month shows him to be in agreement. Bernanke advocated taking a “macro-prudential” approach to risks that are “cross-cutting,” affecting many firms and markets or concentrating in unhealthy ways. It would involve “monitoring large or rapidly increasing exposures—such as to sub-prime mortgages—across firms and markets.”
You can read the full article here.
On September 14, David Bogoslaw published an article in BusinessWeek
entitled "How Banks Should Manage Risk." Rick Bookstaber and I are
quoted in this article because we first had an interview with David
following the XBRL Risk Taxonomy Meeting I hosted at the Levin
Institute in New York on May 13, and we had follow-up interviews two
weeks ago. As is the case in any press interviews, some of what you
say gets printed and a lot doesn't. In this case, I think much of the
substance of what I told David was out of scope for the BusinessWeek
audience and the goals of his article.
In terms of a banking audience, David gets it all right, and I agree with
Rick Bookstaber's comments too. But what the article omits is the fact
that from 1999 to July of 2008 the US Congress, the White House, FHA,
the SEC, and the US Federal Reserve all participated in an
industry-backed weakening of the financial regulatory framework that
was built in the 1930's. In 1999, The Financial Services Modernization
Act (named Gramm-Leach-Bliley, or GLBA for short, after its authors)
removed 70-year restrictions on bank, investment bank, and insurance
cross-ownership. At the same time, derivative market oversight was
specifically excluded from GLBA and financial markets were allowed to
create and trade complex derivative instruments without regulatory
reporting or control.
In 2001, President Bush exhorted
Americans to "go shopping" to support the US economy following 9/11 and
the Federal Reserve obliged by cutting interest rates down to 1% to
pump liquidity into the US market. In 2004, Congress lobbied Fannie
Mae and Freddie Mac to relax underwriting guidelines on home loans to
allow sub-prime borrowers to participate in "The American Dream," and
own a home, and FHA provided loans subsidies to make it easier. In
2006, Congress pressured the same GSE's to relax underwriting on Alt-A
mortgages, allowing self-employed individuals to declare their income
with a signed affidavit instead of documenting their income through tax
filings. As I've written in past blogs, that change gave license to
mortgage fraud across the country as Alt-A borrowers could make wild
income declarations without validation and that pumped tens of
thousands of fraudulent mortgages into the global financial system.
This change wasn't reversed until July 2008, when the Federal Reserve
finally changed Alt-A underwriting guidelines. The long tail of the bad
mortgages underwritten from 2006 to 2008 mean we will suffer
significant foreclosure rates well into 2011, extending the depth and
breadth of this recession.
2006 proved to be the top of the
Housing Market in terms of house valuations and bank fees generated
from loan securitization and derivative markup. The pile-on
legislation and market encouragement from Congress, the White House,
and the Federal Reserve came from industry pressure to keep the party
going as long as possible.
Yes, Banks took on too much risk
from 2001 to 2007. But the US Government encouraged and enabled
excessive risk taking during that period, and both need to be monitored
to prevent future crises. There is an inherent conflict of interest in expecting the government that enabled the current credit crisis to participate in the forecasting and prevention of the next one.
There is a history of financial
de-regulation followed by marked innovation and crash that goes back
100 years in the US. The innovation generates enormous wealth on Wall
Street and new tax revenues for Federal, State, and Local Governments.
The relationship between government enablement and financial innovation
was omitted in David's account and needs closer scrutiny, because
policy-makers and the public will need new information management tools
to understand the long-term impact of incremental policy decisions on
financial market performance if they are to regulate wisely in the future.
In the article, I recommended that the government create a new Regulatory
Information Architecture, modeled on the Information Sharing Councils
created by the Bush Administration for terrorism intelligence gathering
following the 9/11 Commission Report and the Intelligence Reform and
Terrorism Prevention Act (IRTPA) of 2004. But more is needed.
A year ago, I believed that new information technology and data
collection would enable the US Government to better analyze the
performance of financial markets and forecast potential bubbles and
crises. I'm sure that enhanced information sharing in the US
Government will enable better regulatory enforcement, but it's not
enough to prevent future crises. The public needs to play a role in
the oversight process because the Government has its own interests
which are not always perfectly aligned with those of the public.
Administrations change, and with those changes come new philosophies of
governing and regulation. In a democracy like ours, you always want to
enable some to examine and report information that others disregard.
Therefore, what's needed is more information
transparency about market holdings and the actions of market
participants so that anyone in any firm, university, or industry
watchdog can analyze nearly the same macro and micro economic data that
federal regulators observe and make their own forecasts.
Without public access to better market data, we are just enabling government to encourage risk taking more efficiently in the future.
You can read the BusinessWeek article here: http://www.businessweek.com/print/investor/content/sep2009/pi20090914_336015.htm
Frameworks freeze you in the past, by forcing you to interpret the present based on rigid formulas, interpretations, and even misconstructions. In 2007, the IBM Data Governance Council finished its Data Governance Maturity Model. Looking at all its imitations in the market, one could conclude that it has been remarkably successful.
However, as a benchmark of relative organizational maturity - and not just data management processes - I think its time has passed and I'm working on new ideas.
I've written in the past about the loan origination underwriting failures that are at the heart of the current credit crisis. Market failures in Mortgage-Backed Securities, Collateralized Debt Obligations, and Credit Default Swaps can all trace their lineage to the high default and foreclosure rates resulting from those underwriting failures. In a piece I wrote in early 2008, I argued that simple changes in underwriting standards could have prevented the market meltdown.
I've also written about the relative efficiency of the Danish Mortgage Model, and yesterday I heard an in-depth comparative presentation on that Model that I have to relate, because it totally changed my point of view on the Danish Model. Up to now, I had seen the Danish Model as a business platform for mortgage processing. What I saw yesterday is a consumer solution with enormous political appeal.
The meeting was at the American Enterprise Institute in Washington, DC and the speaker was Alan Boyce, CEO of Absalon, the organization that exported the Danish Mortgage Model to Mexico. Alan presented the Danish Model in the context of what the Danes call "The Principle of Balance."
The Principle of Balance enables borrowers to refinance their mortgages when housing prices go up AND sell their mortgage bonds at current market prices when housing prices go down to preserve their equity. In the United States, borrowers can refinance when rates decline and housing prices rise, but they have to suffer negative equity when housing prices decline. Housing prices often decline in a recession, and negative equity restrains labor mobility by nailing home-owners to their existing homes until prices rise and they can sell without a loss.
In Denmark, when recessions hit and housing prices fall, borrowers can sell their straight securitized bonds in a secondary bond market and refinance their mortgage at the current market price for their home. This flexibility protects consumers from negative equity and empowers workers with greater labor mobility.
From Alan's charts, here is how the current system in the US works:
If Interest Rates decline:
- Home prices go up
- Homeowner can prepay the existing mortgage by refinancing at the new lower rate
- Allows for equity withdrawal
If Interest Rates go up:
- Home prices go down
- Value of the mortgage (in an MBS) drops for the holder of the mortgage
- Even though the value of the mortgage has dropped, the homeowner still owes “par” – the face value of the mortgage. He cannot prepay the existing mortgage at the price the mortgage is selling for in the market
- ~$5 trillion is currently owed by homeowners on non-agency mortgages. These mortgages are valued by the market at $3.5 trillion.
- In some of the hardest hit regions in the country, home owners have lost their jobs and have negative equity in their homes, and they can't do anything about it.
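To make the par-versus-market gap concrete, here is a minimal sketch in Python using hypothetical numbers of my own: the 70-cents-on-the-dollar price mirrors the $3.5 trillion market value on $5 trillion owed, but the $300,000 balance is an illustration, not a figure from Alan's charts.

```python
# US model sketch: the homeowner owes par (face value) even when
# the market has marked the mortgage down. Numbers are hypothetical.

def us_model_payoff(par_balance, market_price_pct):
    """Return (what the homeowner owes, what the market values the loan at)."""
    # Under the current US system, early payoff is always at par,
    # regardless of where the mortgage trades inside an MBS.
    market_value = par_balance * market_price_pct
    return par_balance, market_value

owed, market = us_model_payoff(300_000, 0.70)
print(f"Homeowner owes:      ${owed:,.0f}")           # $300,000
print(f"Market value:        ${market:,.0f}")         # $210,000
print(f"Trapped difference:  ${owed - market:,.0f}")  # $90,000
```

In this example the homeowner is stuck owing $90,000 more than anyone in the market believes the mortgage is worth.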
Using the Principle of Balance, here is how it would work:
If Interest rates decline:
- The system operates the same
- Home prices increase and people can refinance and take equity out
If interest rates increase (I think Alan's chart summarizes it best):
- Home prices go down
- Assuming creditworthiness, a homeowner can prepay by purchasing back his or her mortgage at the current discounted price
- This maintains equity in the home
- The key is new, standardized mortgage bonds
This model doesn't perfectly preserve home equity as home owners will suffer some loss when housing prices decline, but the loss is substantially mitigated and this system offers individual freedom and choice. It is actually far more market oriented than the current US model.
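To see how much the buy-back option matters, here is a minimal Python sketch comparing the two payoffs when rates rise. All figures (the $300,000 balance, the bond trading at 80, the $270,000 house price) are hypothetical numbers of my own choosing, not from Alan's presentation.

```python
# Compare early-payoff cost under the US model vs the Danish
# Principle of Balance after interest rates rise. Hypothetical numbers.

def payoff_us(par_balance):
    # US: prepayment is always at par, the face value of the loan.
    return par_balance

def payoff_danish(par_balance, bond_price_pct):
    # Denmark: the borrower can retire the loan by buying back the
    # bond that funds it at its current (discounted) market price.
    return par_balance * bond_price_pct

par = 300_000          # original mortgage balance
bond_price = 0.80      # the funding bond trades at 80 after rates rise
house_value = 270_000  # the home's price has fallen below par

equity_us = house_value - payoff_us(par)                  # -$30,000
equity_dk = house_value - payoff_danish(par, bond_price)  # +$30,000

print(f"US model equity:     ${equity_us:,.0f}")
print(f"Danish model equity: ${equity_dk:,.0f}")
```

The same borrower who is $30,000 underwater in the US system walks away with $30,000 of equity under the Principle of Balance: the loss from falling home prices is mitigated, though not eliminated.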
In the US, we currently suffer 10% default and foreclosure rates, and there are an additional 15-20% who suffer negative equity in their homes but are not at risk of foreclosure. People in foreclosure can't take advantage of a new Principle of Balance Mortgage system, but the government can offer programs to restructure their mortgages at market value. Those with negative equity could be encouraged to migrate to a new Principle of Balance mortgage model.
This is an idea that has enormous benefits all around. It can help the Obama Administration reprice existing toxic assets. It can help provide more market-flexibility to home-owners. And it can repair confidence in the American mortgage market among investors world wide.
Who would have thought that market-oriented reforms would come from such a "socialistic" country as Denmark!?
I encourage everyone to read Alan Boyce's presentations and white papers. His proposal is one of the most intelligent and easy-to-implement regulatory reforms I have seen in many years.
His full presentation: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_1.pdf
His short white paper: https://www.ibm.com/developerworks/blogs/resources/adler/20090325_3.pdf
Over the years, the IBM Data Governance Council has had many international meetings:
- 2005 - Kronborg Castle, Helsingør, Denmark
- 2006 - Chateau Frontenac, Quebec City, Canada
- 2006 - Bucerius Law School, Hamburg, Germany
- 2006 - Hotel de Ville, Paris, France
- 2007 - Isola di San Giorgio Maggiore, Venice, Italy
- 2008 - Kuala Lumpur, Malaysia
- 2008 - Zappeion Palace, Athens, Greece
- 2009 - L'Hermitage, Franschhoek, South Africa
On January 21-22, 2010, the IBM Data Governance Council will be starting a chapter in Poland by meeting in Warsaw.
Around the world, Data Governance is in hot demand.