On September 20-21, IBM is hosting The Big Data Governance Summit at the Ritz-Carlton Bachelor Gulch in Vail, Colorado. Velocity, Volume, and Variety without Veracity create Vulnerability.
This event is about Metadata, Stewardship, Security, Privacy, Data
Quality, and Big Data. We can reach into the skies, pull in petabytes of
relational tables, Twitter feeds, video, audio, and documents, but it's
all garbage in and garbage out without Data Governance.
We know this, and it's our task to do something about it. We have to show
how it can be done – how anyone can build vibrant, dynamic, Big Data
Ecosystems that use common standards, ontologies, and methods to tag
huge volumes of data, index its value and context at high Velocity, and
search across its variety to discover trends with large clusters of
computational power that deliver high Veracity and low Vulnerability.
This is the promise of Big Data Solutions, uniting disparate data sources
across our organizations, our cities, and our planet; leveraging data
sets based on purpose specification; searching for meaning and value
with brute force speed.
I can see this promise. It's within our
grasp. We can bridge our stovepipes of data and non-standard behaviors
into lean, mean, transformation machines that yield incredible insights
and informational power.
But this promise is only in reach with
Data Governance Solutions to provide metadata tagging, standards,
ontologies, purpose-based access protocols, audits, security &
privacy, data quality, discrete retention rules, and new tools and
technologies to automate how we do it.
The purpose of this event
is to explore how we can bring these ideas forward to help the world
adopt Big Data Ecosystems more rapidly and more successfully.
We are meeting at the Ritz-Carlton Bachelor Gulch,
which is the wonderful venue where we first shared the IBM Data
Governance Council Maturity Model with the world in 2007. We will look
at real life examples of firms using Big Data, exploring ecosystems, and
developing standards to model and simulate them.
This meeting is hosted by the IBM Data Governance Council but it is open to all.
Join us as we move Big Data Governance forward.
On Saturday, I took the train from Brussels to Cologne. The train is one of those modern ICEs - sleek, clean, quiet, and fast. The terrain through Belgium is hilly and the tracks pass over rolling fields, deep ravines, and wooded glens. As we neared the German border, the landscape leveled out and the train picked up speed, reaching 200 km/h at one point. And as the small towns whisked by, I couldn't help thinking how magical it is to travel from Belgium to Germany by train with no border crossing and no passport control. It is so simple and easy, and without even a word you pass from one country to another.
This is a marvel of modern Europe, and it reminds me that the last 65 years have been the longest period of peace in Central European history. Europeans have somehow, perhaps accidentally, realized a reality about modern warfare that has so far escaped the United States of America - modern war is Dumb Governance. For during the same 65-year period the United States has been involved in five large-scale wars lasting over five years each, six smaller military adventures, and of course one very long Cold War.
If you read my last blog post, you will understand my statement and reasoning that modern war is Dumb Governance. To paraphrase Von Clausewitz, war is the extension of diplomacy by other means. That is, it is an articulation of national policy - the communication of it.
Now back up a minute. If we have a policy that is communicated, then according to the principles of Smart Governance it must also have had a decision-making process, some metrics and a business case, hopefully either sustainable or situational goals, and some measurable results that we should care to compare to the goals.
In the old days, back before the Industrial Revolution, it took eight people working on farms to support two people working in cities. That meant that you had to have a lot of arable land and unskilled labor to support those cosmopolitan types in cities who made all the decisions. War then was one means of acquiring more arable land for civilized expansion. If you conquered more territory through war, you could expect to feed more city dwellers who produced more income via trade and crafts, and that made your society wealthier.
In the early Industrial Age, this logic began to wane because industrial capacity isn't only dependent on land and labor. It's also dependent on capital, and capital tends to dry up when tanks cross borders. Of course, natural resources are also important to industrial economies. But warfare tends to be a fairly resource-intensive activity, so gains won on the battlefield can be difficult to hold and the net benefit of acquired resources can be undermined by the resource drain of battle.
In the Information Age, knowledge is power and both intellectual labor and capital flow so freely throughout the world that warfare gains on the battlefield don't provide sustainable balance sheet benefits. In fact, they are a net cost to any society waging war.
Think about it for a minute. On 9/11 the World Trade Center was destroyed by heinous terrorists based in Afghanistan. Immediately, the United States sent 30,000 troops to invade that country. The stated goal of this policy was to protect Americans from terrorism. The measured need for the policy was the attack on 9/11. The policy decision was made by the President of the United States with full support of Congress and the American people. The policy was communicated with 30,000 American troops and a good contingent from international allies.
And the outcome? Eight years later, we are still occupying one of the poorest countries in the world with over 60,000 troops. Afghanistan is not even a real nation in modern terms. It is a tribal collage of small warlord-controlled fiefdoms. Pakistan is barely a modern state, and Afghanistan is 40 years behind Pakistan. Kabul has 4 million people, and 95% of them have no running water in their homes. The GDP is only $12.8 billion. It has little agriculture, little industry, few natural resources, and no significant knowledge resources.
The war in Afghanistan has cost US taxpayers $172 billion to date. That is 13 times the GDP of the entire country. We are spending more each year to wage war than Afghanistan is even worth.
Compare the Outcome to the Goals. From an economic perspective, it's a huge loss.
War today is a net economic loss for any country that wages it. Resource control is simply not worth the costs. The Europeans have figured that out. If only the US could learn the same lesson... before we bankrupt our nation through warfare.
Winter and I have arrived in Warsaw. It is November 9, 2009, twenty years to the day since the Berlin Wall fell, and I am in a gorgeous Hilton Hotel in the city that still has the scars of the 20th Century written in its streets. The trees are bare here, and the temperature hovers around 5°C. A foggy rain shrouds the city. All around this hotel are the scarred foundations and empty lots from the Jewish Ghetto, destroyed in 1943 by the barbaric Nazis.
It's an eerie feeling, but this day is like any in Warsaw. My hotel is full of professional wrestlers and their groupies from a large match nearby. The lobby has men with necks the size of my waist. They are drinking at the bar, flirting with women drawn to the spectacle, and loudly proclaiming their happy personalities.
We remember this day as the end of tyranny in the East, the final chapter in a 50-year book of horror that began in 1939. Lucky are those alive today who don't have to remember.
Last week I hosted a Data Governance Executive Breakfast for 20 CIOs in Warsaw, Poland. It was my first trip to the Iron Curtain capital, and I expected a concrete grid of grim apartment complexes and monumental communist office architecture. Instead, I found a lovely city still working hard, and succeeding, at erasing 50 years of Nazi occupation, annihilation, and communist oppression. Warsaw today is a gem of a city, with warm and friendly people, beautiful architecture, an eager business atmosphere, and a deep, historically rich intellectual tradition.
My one day in Warsaw was graced with gorgeous weather and a terrific morning event that combined both Data Governance content and XBRL. My partner in the Breakfast presentation was Michal Pienchofsky from Business Reporting AG, a Data Governance Council Member specializing in XBRL consulting who is based in Warsaw. Michal gave a terrific presentation linking Data Governance goals and structures to XBRL taxonomies, regulatory compliance, and business optimization.
After the event, I met an old family friend who lives in Warsaw. Stacy is the father of my brother-in-law, and in the summer of 1944, at the age of 16, Stacy joined the Warsaw Uprising and fought against the Nazis. It was a valiant and tragic effort that for three months engaged German units in a bloody campaign to win back the Polish capital. The effort was largely unassisted by both the Americans and the Soviets - who were actually sitting outside the city some 11 miles away and waited for the Germans to mop up the resistance before liberating what was left of Warsaw - rubble - themselves.
This summer marked the 65th anniversary of the Warsaw Uprising, and Stacy took me on an uprising tour of Warsaw, showing me the manhole cover where he entered the sewer to cross the city underground and evade Nazi patrols, the intersection where his Gozdawa Battalion set up a barricade, the churches where Nazi tanks hid in waiting, and the many walls where bullet holes and plaques still mark the spots where thousands of Polish civilians were executed by the Nazis in reprisal for the uprising.
We visited the Uprising Museum, which is a fascinating and well-done museum documenting the events of the uprising. They have the B-24 that the Polish Government-in-exile used to send supplies to the resistance fighters, replicas of the sewer pipes that you can walk and crawl through to get an idea of what it was like - without the sewage - and many photos detailing the grim battle and the utter destruction of Warsaw afterwards. The Nazis leveled the city after the uprising was crushed as an example to any other nation that wanted to rise up against their tyrannical rule. Not one building, not even one facade, was left standing in the city.
The lovely inner city that one sees in Warsaw today was completely rebuilt by the Communists after the war. I've been to Prague many times, where it is often remarked that the old city was preserved after the war because the Communists didn't have the money to put up new buildings. I think Warsaw demonstrates the lie of that assumption. Communists obviously loved good architecture and cultural heritage as much as Capitalists do, because they did a marvelous job restoring Warsaw to some of its pre-war splendor. There are still many sites outside the inner city where scars from WWII are visible. I haven't seen that in other WWII-stricken cities, like Hamburg, which was 80% destroyed by Allied bombs in WWII. Just across the street from the Hilton Hotel where I stayed there were empty lots and the war ruins of buildings, which is quite amazing in the 21st Century.
But in the 20 years since the Iron Curtain came down, there have already been many modern changes to Warsaw, and I can well imagine that this city, with its great people, hunger for innovation, and rich traditions, will regain its former glory as a great city in the 21st Century.
I posted some photos I took while in Warsaw on Picasaweb. Have a look if you are interested:
It was a great trip business-wise and it certainly demonstrated the resilience of the human spirit even under the most barbaric forms of oppression.
Last week I was in Hamburg, Germany, teaching my Data Governance Course with my friend Christa Menke-Suedbeck at the Bucerius Law School (www.law-school.de). One evening, I went to the Deichtorhalle to see a lecture by one of my old Hamburg friends, Tom Holert. His lecture was part of a larger panel discussion on modern photography, focusing on the 19th Century artistic techniques photographers use to "stage" their photos, blurring the line between photojournalism and art.
Tom Holert is one of Germany's most prolific and well-respected art and music critics, and his presentation left me deeply concerned. Have a look at this photo that Tom presented. It is by Eric Baudelaire and it is called "The Dreadful Details."
At first glance, it looks like any other horrific photo from the Iraq War that we have all become uncomfortably comfortable viewing. These depictions of fear, death, power, and dread no longer shake the subconscious as they once did during Vietnam.
However, this one should, because it is entirely fake. The photo was taken on a Hollywood backlot and all the people in it are actors. It is a modern example of art staged to resemble reality. But you will think it is reality unless you are aware of this context - unless you know the provenance.
Yesterday, the NY Times reported that 16 retired US generals working as military analysts on Fox, ABC, NBC, CBS, and CNN had been secretly collaborating with the Pentagon to shape US public opinion - the most brazenly documented government propaganda I have seen in my lifetime. These military analysts acted as supposedly "impartial" experts on network TV but in reality were toeing the Pentagon's public line.
Again, staged reality. We all believe this stuff unless we are informed of the context, the provenance of this data.
Whether you support the war or not, you have to recognize that these distortions have an enormous cost. A recent Harvard University study put the direct and indirect costs of the Iraq War at $1 trillion, a figure I'm sure the Pentagon has already developed analyst talking points to refute.
Of course the loss of human life on all sides of the conflict outweighs the economic costs. But I would argue that the most enduring damage is to the global brand of the United States of America, whose public image around the world has been so tarnished.
In the Information Age, we are all victims of Toxic Content in our data streams. This Toxic Content makes the truth the most endangered data asset in the world and I fear that what we have lost so far we will struggle to regain against new data distortions that will make old lies seem like quaint nostalgia.
And if you think propaganda only invades public data streams, read my articles on Subprime...
In yesterday's Financial Times, Hank Paulson, the former Treasury Secretary, wrote an article entitled "Reform the Architecture of Regulation." In the article, Hank blames inadequate regulatory authority and overlapping jurisdictions for the failure to forecast and prevent the current credit crisis. He recommends an ideal regulatory infrastructure composed of three agencies: "one charged with maintaining market stability across the entire financial sector, one for supervising the soundness of those institutions with explicit government support, and one responsible for protecting consumers and investors."
Hank wants the Federal Reserve to have the systemic risk authority in the first case. He wants the Fed "to have access to information from a broader set of financial organizations, including hedge funds and systemically important payment systems. This authority should also have the power to intervene if it concluded that the financial system was at risk."
He goes on to say that both the Treasury and the Fed lacked appropriate powers to allow Lehman Brothers to wind down in an orderly way, and of course that might be true, but the Lehman failure was not caused by the lack of an orderly wind-down. The failure was due to Hank Paulson deciding to let Lehman Brothers fail in the first place. AIG, Fannie Mae, Freddie Mac, Bear Stearns, and Merrill Lynch were all too big to fail, but Lehman wasn't. When the government picks winners and losers in a time of national crisis, the public is the ultimate loser.
It must be nice to look back on past failures and propose future solutions, but Hank's analysis omits too much.
1. This crisis was preventable. The government had the data, and the Federal Reserve's own economists admit as much in a report written in October 2007:
"Were problems in the subprime mortgage market apparent before the actual crisis showed signs in 2007? Our answer is yes, at least by the end of 2005. Using the data available only at the end of 2005, we show that the monotonic degradation of the subprime market was already apparent. Loan quality had been worsening for five consecutive years at that point. Rapid appreciation in housing prices masked the deterioration in the subprime mortgage market and thus the true riskiness of subprime mortgage loans. When housing prices stopped climbing, the risk in the market became apparent."
2. The government is in fact awash with the right data to provide leading indicators about many aspects of the economy. Individual agencies collect that data and either do not understand it, do not compare it across companies or geographies, or do not disseminate it in an intelligent manner.
3. In some cases, the government in fact lacks important data that can yield indicators of systemic risk, but initiatives like the XBRL Risk Taxonomy can remedy these deficiencies quickly.
Changing the organizational structure of financial regulation will be disruptive and will not necessarily produce better results. Existing structures with new authorities can produce the same or better results.
But the government does need a Business Intelligence strategy to make better use of the information it already collects, and integrate new sources effectively. This isn't just about reporting standards. Every day, auditors request information from financial firms. Regulatory auditors are often camped in regulated firms for extended periods. They collect all kinds of important data about business practices, assets, liabilities, and losses. Where does this information go? Is anyone collecting it using standard practices and integrating the structured and unstructured content into comparable repositories? Can anyone compare practices from firm to firm without having participated in site audits?
I suspect the answer to these questions is no, and we are all witnessing the results of the government's failure to use its own information intelligently.
In any administration, pro- or anti-regulation, it will always be in someone's interest to disregard information that doesn't fit their philosophy or needs. The only safeguard in a democracy is the proliferation of information to many diverse interests. This enables some to regard information that others disregard, and to glean meaning that others miss.
Hank Paulson is wrong. The government doesn't need a new regulatory architecture. The government needs a new Regulatory Information Architecture.
Tim Geithner is on Capitol Hill today asking Congress to provide regulators with new powers to control the derivatives markets. He claims that derivatives blindsided the Administration and nearly destroyed the world's economy. Congress, by all accounts, seems willing to provide these new powers to both the SEC and the CFTC, which will include the power to collect positional data from key broker/dealers and enforce position limits on trading to constrict bubble formation. These are good ideas. But these powers alone won't fix the current problems in our economy or prevent future financial catastrophes. They are at best treating a symptom of the credit crisis, not the source problem.
The source problem was bad home mortgage underwriting, and those bad loans continue to produce 12% foreclosure rates that have not abated. That problem was the result of policy mistakes by Congress and the FHA in 2004-2006 that were not corrected by the Federal Reserve until June of 2008. The tail of those bad loans still haunts our mortgage market. The non-GSE mortgage market today is effectively dead. Anything without a federal insurance program isn't being underwritten, and none of the policies of the Obama Administration have changed the nasty state of foreclosure nationwide.
Rising unemployment across the country is adding to the delinquency and foreclosure rates. Tinkering with the derivatives market is a nice sideshow. But until the Obama Administration gets serious about mortgage reform, they are only addressing the symptoms of our problems, not the core.
A colleague wrote to me today with the following question:
"Would you be able to point me to information that describes what can make a data governance effort fail? Points about technology problems, organizational politics, a lack of organizational leadership, etc. come to mind. Are you able to expand this list or explain how to avoid failure in a data governance program?"
I get asked this question often and my answer is an easy elevator pitch for anyone looking to explain Data Governance:
"Two things always lead to failure:
- Lack of outcome oriented programs
- Lack of power and accountability in the data governance board.
Of course, no one has a monopoly on screw-ups, so there are innumerable additional options for modest and wholesale disaster in any organization run by human beings. The important thing is to be ready for failures and to catch them while they are small, institutionalize the learnings, adjust, and move forward."
People who say that failure is not an option are dangerous fools. Statistically, failure has the same odds as success, and if you plan for both, your program will grow in wisdom and effectiveness - which is the very best that we can all plan for from Data Governance.
64 years ago, when my house was built, the Long Island Power Company installed electric meters in my basement. Two large grey metal meters are affixed to my foundation with insulated wires connecting them to my fuse box. They have a variety of dials and arrows beneath thick glass. I can see the meters but don't really understand what the numbers mean or how to read their information. Every other month or so, a polite person from the power company rings my bell to ask if they can come into my home and walk into my basement to read my meters and input the results into a handheld device that radios the information back to the power company.
The last time the meter reader was here, I asked why the power company didn't trust me, the homeowner, to call in the numbers or input them in a form on the internet. She told me that many people don't understand the meters, and those who do often lie to the power company about what they read to under-report their electricity usage. I asked why the power company couldn't read my meter remotely, or why they couldn't simply measure how much electricity my home was using.
Like, "don't THEY know that?" Nope. THEY don't, and the reason they don't says as much about how the electricity grid works as it does about the way all complex modern industrial systems work - and we in the Data Governance world can learn a lot from this.
The electricity grid was created as a downstream electrical production network. Upstream power plants create electricity and send it downstream to factories and homes to be consumed. The power company did not build in mechanisms to measure how much electricity is being transmitted over the wires, how much is being consumed, and exactly by whom. Your monthly electrical bill is not based on your actual electricity usage. It's based on estimates derived from your historical usage information. That is, the meters read your past, and the power company forecasts your current and future consumption based on that historical information.
The grid itself is run at 70% of capacity to allow up to a 20% margin of error. If the lines carry over 90% of their rated capacity in aggregate, some individual lines could be running at 100% and therefore could overload and explode. And if some lines overload, capacity reroutes and burns up other lines, transformers, and substations. So the whole system is calibrated based on historical analytics. The power producers have no real-time understanding of how the electricity is used, in what quantity, when, and where. And even the end users don't really understand where the electricity came from, how it was produced, or how the system actually functions.
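The capacity arithmetic above can be sketched in a few lines. The 70% operating target and 90% overload threshold come from the text; the rated capacity and the individual line loads are invented purely for illustration:

```python
# Sketch of the capacity-margin logic described above.
# The 70% target and 90% threshold are from the text; the rated
# capacity and line-load figures are hypothetical illustrations.

RATED_CAPACITY = 100.0        # arbitrary units per line (assumed)
TARGET_UTILIZATION = 0.70     # the grid is run at 70% of capacity
OVERLOAD_THRESHOLD = 0.90     # above 90% aggregate, some lines may hit 100%

def aggregate_utilization(line_loads):
    """Aggregate utilization across all lines, as a fraction of rated capacity."""
    return sum(line_loads) / (len(line_loads) * RATED_CAPACITY)

def grid_status(line_loads):
    """Classify the grid state using the margins described in the text."""
    u = aggregate_utilization(line_loads)
    if u > OVERLOAD_THRESHOLD:
        # In aggregate over 90%: individual lines may be at 100% and fail.
        return "overload risk"
    if u > TARGET_UTILIZATION:
        return "above target"
    return "normal"

loads = [65.0, 72.0, 68.0, 75.0]   # hypothetical per-line loads
print(grid_status(loads))           # average utilization is exactly 0.70
```

The point of the sketch is that the classification depends only on aggregate, historical-style figures; nothing in it knows what any consumer is actually doing in real time, which is exactly the blindness the text describes.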
In the Industrial Age, we human beings created many complex systems that function without many of the system participants understanding how the system works, and this is fine if we are all happy running our systems at 70% efficiency.
In enterprises today, we run our Data production and consumption systems with similar levels of complexity, performance, and ignorance. Most business users have no idea where the data came from, how it was produced, transmitted, and consumed. Conversely, most, if not all, Data Governance professionals have no idea how business people collect and use information to generate value. And this grid was created without any meters to read data volume, velocity, veracity, and utility.
Councils, Stewards, Policies, and Standards will improve human communication about the importance of data in an enterprise, but they won't change human behavior over time without new Data Governance "Smart Meters" that measure and report how data was created, who refined it, how it was transmitted, aggregated, repurposed, criticized, commented, stuffed in envelopes, posted in trades, hedged in inventory, reserved against premium, debated in legislation, trademarked, copyrighted, patented, packaged, and a million other uses and abuses. Until we can demonstrate a clean line between creation and use, Data Governance will be two steps forward and two steps back over and over again, generation after generation.
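As a thought experiment, a Data Governance "smart meter" could start as nothing more than an append-only log of who touched which data asset, when, and how, from creation through refinement to use. Every name in this sketch (the classes, the event types, the asset and actor labels) is a hypothetical illustration, not any real product's API:

```python
# Minimal sketch of a data "smart meter": an append-only log of
# creation, refinement, and consumption events per data asset.
# All class, field, and label names here are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MeterEvent:
    asset: str     # which data asset was touched, e.g. "customer_master"
    actor: str     # who or what touched it
    action: str    # "created", "refined", "aggregated", "consumed", ...
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class DataMeter:
    def __init__(self):
        self.events: list[MeterEvent] = []

    def record(self, asset: str, actor: str, action: str) -> None:
        """Append one metering event; the log is never rewritten."""
        self.events.append(MeterEvent(asset, actor, action))

    def lineage(self, asset: str) -> list[tuple[str, str]]:
        """Trace the line from creation to use for one asset."""
        return [(e.actor, e.action) for e in self.events if e.asset == asset]

meter = DataMeter()
meter.record("customer_master", "crm_feed", "created")
meter.record("customer_master", "steward_jane", "refined")
meter.record("customer_master", "marketing_bi", "consumed")
print(meter.lineage("customer_master"))
# -> [('crm_feed', 'created'), ('steward_jane', 'refined'), ('marketing_bi', 'consumed')]
```

Even a toy like this demonstrates the "clean line between creation and use" the paragraph calls for: once every touch is metered, the lineage question becomes a query instead of an archaeology project.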
We need meters and readers and a new Information Age infrastructure that tells us, intelligently, what we are doing, and why and when we are doing it. It should connect maintenance to operations, front office to back, middle office to board, outside to in. We don't know enough today to tell regulators what we know, and until we do, we won't be able to close the gap between our forecasted capacity and our current and optimal states.
The Information Governance Community is running a landmark survey on the Cost of Data Quality, and the #1 answer to all of the questions is "I don't know." Data Governance professionals don't know how poor Data Quality affects Business Outcomes because they don't measure that. After Lehman Brothers disintegrated in 2008, and the global financial meltdown spun out of control, the number one question from the public was "Didn't they know this would happen?"
No. No one knew then. No one knows now. And no one will ever know until we build more intelligent systems that connect Information Production to Consumption and measure the gaps every step of the way. This last recession has demonstrated that we are reaching the limits of our unintelligent Industrial Age networks and systems, and it's time for a major upgrade.
The Subprime Credit Crisis is not even half over, but one thing we should all have learned by now is that banks that paid attention to Data Governance lost the least. Whether they were able to improve decision-making with better quality loan origination data or calculate risk with enhanced cross-functional tools, some banks had better operational programs and show better returns.
In the wake of Subprime, fund managers should be asking companies they invest in about their Data Governance practices. Questions like:
1. Do they have a DG Organizational Structure that creates enterprise policies and reports results to the Board of Directors?
2. Is there a Stewardship function that assesses data quality on an ongoing basis and works to improve operational decision-making with high quality data?
3. Are Data, Security, and Risk functions working together to maximize internal intelligence about operational, credit, and market risk controls?
4. Can the organization calculate risk and forecast potential losses?
If a company today can't answer these questions intelligently, they are not governing their information assets and the market should represent those management failures in share prices.
Data is the raw material of the Global Information Economy. Companies that govern its uses well will demonstrate better bottom-line performance. Companies that don't will carry an investment risk. It's that simple.