Adler on Data Governance
Since the 18th Century, Freedom of Expression has become enshrined in constitutions around the world as a Basic Human Right. It defines Democracy in its defense and Dictatorships in its assault. People like to control and don't like to be controlled, and the tension between controlling and being controlled requires this Human Right to be defended and re-defined every year. Sometimes, like during the McCarthy Era in the United States, the tide turns against Freedom. Other times, like in the Middle East today, the Freedom to speak changes the course of history.
But there is another Freedom not yet defended as a universal Human Right, and it should be: the Freedom of Information - the right to be informed, to learn. This right is implied by the Freedoms of Press and Speech, but it is not articulated explicitly as a constitutional right. Around the world, many nations have Freedom of Information Acts that require national and local governments to make information available to the public. Those acts were created when widespread access to information was rare. Libraries and archives were places where large amounts of information could be physically retrieved and governmental disclosure was paper-based. Universities and Governments were the largest aggregations of information, and they were the places you visited to get information.
But today, with the Internet, human beings have potential access to information without physical limits and it is that potential that must be enshrined in law as a basic human right. Every human being on the planet should have the right to access information freely and without threat of harm. Like Free Speech, that right should be defended even when the content of information accessed is heinous and injurious to some. Any society or nation without the Freedom of Information as a basic human right is a place that can be controlled and manipulated.
According to Human Rights Watch, there are 40 nations around the world that restrict access to the Internet or Social Networks. Many of these nations also block satellite TV and other forms of communication. But even in Western Democracies, Information Access is controlled by cost, technology barriers, labor protections, and secrecy laws. Even the most advanced nations have huge regions without access to the Internet. And some nations now seek to tax content flowing over the Internet as a means to restrict trade and favor local providers.
This is not a question of commercial competition. This is a question of human progress. Where there are people unable to access information freely there are opportunities for oppression and abuse. Democracy and Freedom will not thrive or survive without the Freedom of Information. To be ill-informed and speak freely is a condition of intellectual slavery.
I believe that we must work to assert the Freedom of Information as a basic Human Right. It must be a 21st Century Goal to connect every human being on the planet to high quality trusted information. There should be no technical, political, cultural, or economic barriers to Information.
It should be as easy as air and as cheap as water, taken for granted and governed by statute in every nation around the world.
This morning, EU Regulators announced that they propose to create a Risk Board to monitor financial market performance and systemic risk indicators among the 27 member nations in the European Union. I've advocated a Council approach to risk-based decision-making since the beginning of this year and I think the EU proposal is a good idea in concept. Unfortunately, in Europe it seems decision-making takes a large number of people, because the European proposal would have 63 people participating on the Risk Board. A deliberative body with 63 people is not a "Board" - it is a legislature. To complicate matters, "only" 32 members of this board would have voting rights. Unfortunately, the only power they can vote on is a warning to member states that some part of their market performance contains systemic risk. How they plan to determine that threat and get everyone to agree on what it means in any reasonable amount of time is not clear. My guess is that this is a proposal to set up an intra-governmental think-tank that will study issues, write economic reports that no one reads, and only threaten to issue warnings because a vote on a warning will never happen.
Note to Obama Administration: If you want to create a Systemic Risk Regulatory Structure that is guaranteed to fail due to political indecision and lack of authority, copy the EU model.
Last night, I was one of two panelists at a Global Association of Risk Professionals (GARP) symposium on Systemic Risk at Fordham Business School in New York. We were to be a moderator with three panelists, but one canceled at the last minute, presumably to stay home and watch the Yankees lose to the Phillies last night. The room was on the 12th floor in a mid-60's squat tower accessible from two elevators among a bank of six in the stone cold open and office-like lobby. Twelve is the top floor in the building, with a Rockefeller penthouse atmosphere. Black marble floors, mahogany paneling, subdued sixties swank.
The symposium room was longer than wide, seated classroom-style for one hundred in three neat blocks. We panelists were paired at a white-clothed table with microphones we didn't need. The moderator introduced us both; the NYU Business School professor and the IBM Data Governance guy. The audience looked half-asleep, and the first question rolled out on the table, "What is Systemic Risk?" Our gracious moderator had prepared a raft of intelligent questions for us that evening, but we would only digest two in the brief hour we had.
What is Systemic Risk? The professor told us it was the result of exogenous market conditions that created upper atmospheric bubbles in complex derivative instruments capable of devastating global economies. It could be measured in the up and down-swing of aggregate equity performance and controlled through the central banks he currently advises. He saw Systemic Risk as a macro-economic phenomenon, the product of weak government regulation, greed on Wall Street, outrageous compensation packages, and unnecessary complexity in financial markets.
Before the event, I wasn't quite sure what I was going to talk about. It was a hectic Monday full of ten conference calls on twenty different topics. I left late, had traffic on the Grand Central, got lost at Lincoln Center looking for parking, and there was no coffee when I arrived. I'm not an evening person un-caffeinated, and perhaps not the best morning person in the same condition. But droll media babble passed off as tenured professorial wisdom will rouse me on the sleepiest of days.
Systemic Risk is the probability of loss to a system. It is not actually a thing that can be calculated. It is a series of things that result in a loss event with causality and impact. Systemic Risk is not only about macro-economic catastrophe, because to say so is to say that we are not involved in Systemic Risk except as victims. And that ain't true. Insofar as all of us, The People, are members of communities, parties, religions, nations, and environments we are part of a System. We are inter-related, inter-dependent, capable of causality, errors and omissions, losses and claims. Each incremental failure can cascade and result in systemic exposure.
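The cascade idea can be sketched as a toy calculation. This is purely illustrative - the function name, the amplification rule, and every number below are my own assumptions, not an actuarial model from this essay:

```python
def systemic_loss_probability(base_p: float, amplification: float, n: int) -> float:
    """Toy cascade model: the system is a chain of n components, and each
    successive component is more fragile than the last because upstream
    stress amplifies its base failure probability. Returns the probability
    that at least one component fails, i.e. a systemic loss event."""
    survival = 1.0
    for i in range(n):
        p_i = min(1.0, base_p * amplification ** i)  # stage i failure probability
        survival *= (1.0 - p_i)                      # probability all stages survive
    return 1.0 - survival

# Ten stages with independent 1% failures, versus the same base risk
# amplified 1.5x per stage: the cascade dominates.
independent = systemic_loss_probability(0.01, 1.0, 10)
cascading = systemic_loss_probability(0.01, 1.5, 10)
```

Even with an identical base risk, letting each incremental failure raise the odds of the next multiplies the probability of a system-wide loss - which is the sense in which "each incremental failure can cascade."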
The Credit Crisis is the result of a series of public policy mistakes from 1999 to 2006 that encouraged bad business practices at many different stages of the mortgage underwriting and securitization process. These were incremental failures that contributed to loss events that destroyed parts of the economic systems upon which markets rely. The lesson to humanity from this experience is that We The People are all members of SYSTEMS large and small that can fail as a result of incremental policy mistakes. Actuarial Science has for too long focused on the probabilities of contained loss events.
My body is a SYSTEM and Cancer is a systemic risk to me. It causes a chain of events which can result in organ failure and death. Your company is a system, and bankruptcy is a systemic loss event. If bees die, plants won't be pollinated, and that chain of causality is a systemic risk to our ecoSYSTEM. The BBC Reports (http://news.bbc.co.uk/2/hi/science/nature/8338880.stm) that record numbers of plants, mammals, and amphibians are under threat of extinction. This is a systemic risk. When entire species of frogs in remote places like Tanzania become extinct in the wild, humans take note - this incremental failure is closer to your role in the food chain than you may think.
Every System has risk. Every person in every system has a role.
If we accept the gossip-press gospel that the Credit Crisis is purely the result of greed on Wall Street, and can only be fixed by wise regulators in Washington, shame on all of us for missing the opportunity to internalize the economic externalities. It is not an academic exercise to study the risk in every system large and small. Systemic Risk is a real-world imperative for all of us.
The Winter Solstice is the time for Data Governance Predictions. And here are mine for 2011:
1. Systemic Risk Councils will proliferate. The Dodd-Frank Bill established a Systemic Risk Council in the Federal Government to aggregate financial data from across the economy to detect patterns of exposure that can impact macro-economic policy. All regulated financial entities should follow the leader and do this themselves. Some, like JPMC and Goldman Sachs, already do this. Everyone who is not doing it should get on the wagon and replicate.
The Federal Government will take eons to gather all the data and make sense of it. And even if they do it, there will be political considerations with regard to how the data is used and disclosed. And forget about counter-cyclical policy-making. So if you want your firm to escape financial ruin in the next Sub-prime, Sovereign Debt, Greek, Irish, Portuguese, or Spanish Debt Crisis, go and get a Risk Council and start sifting the data yourselves. Processors and storage are cheap, data is widely available, what you need is the organizational structure, decision-making system, and a sound Data Governance program. Get it going now, because with all the debt the world has accumulated there will be many more crises to predict.
2. Health care will join the Information Revolution - Today, many doctors use the Internet to look up symptoms, anatomy, and, of course, pharmaceutical remedies. Yet the industry has few information resources that document the comparative performance of doctors and hospitals in how they treat patients and the results. In 2011, thanks to US health care reform, this will start to change and I foresee a nationwide movement to aggregate vast amounts of health care data to analyze and report on what works, what hurts, and start building plans to make care more efficient and more effective so that people live longer. Data Governance will play a huge role in this effort, which will start next year and consume the next decade.
3. National Incident Detection - Like it or not, the days of the Internet Wild West are numbered. While the new Republican Leadership in the House is opposed to the Net Neutrality Bill, it seems certain that some form of national security oversight over Internet incidents and threats is going to happen. The government has been trying to corral business into sharing incident information since 9/11 and I predict they will succeed at some point because nation-sponsored cyber-warfare cannot be resisted by private enterprise alone. In some as yet to be determined form, new information sharing regimes will need to be designed that aggregate threat information from businesses across the nation to develop early warning systems and protect national Internet assets.
4. Self-Governing Commons - Human beings can, in fact, govern the use of common resources more efficiently than hierarchical or proprietary solutions. The Information Governance Community is a demonstration of this fact, and in 2011, similar demonstrations will proliferate around the world and Social Networking itself will mature into online meeting places where people do more than talk - they will govern themselves to produce common work products. An aggregation of people without a deliverable is a media channel. Those same people collaborating on common ideas to produce work are self-ruling corporations and this phenomenon will change how people are organized around the world. Any idea or project can be accomplished by self-organizing groups of people with common interests, a governance model, and an incentive structure designed to produce an outcome to effect change.
Five years ago, we formed a Data Governance Council to change organizational behavior and effect change. Achieving Semantic Consistency, Data Quality, Single Views of the Truth, Trusted Information, and Security & Privacy are all IT goals necessary to achieving any one of the above Predictions. Information is changing the world and with information we can change ourselves. However, without Governance, all we have is Data Management and none of what I described above is possible.
My aunt Helen had an opinion on everything. She was an information junkie long before the Internet, consuming at least three newspapers a day and watching untold hours of news television. If she didn't know about an issue directly, she had enough reference points to issue an authoritative opinion. I spent many weekends in her ancient Cheshire farmhouse with the musket holes in the foundation to protect against Indian raids and the secret spot behind the fireplace where slaves hid in the 1850s on the Underground Railroad on their way north to Canada. Dusty newspapers from the 1960's clogged the front staircase that was never used. Every National Geographic since 1940 sat piled in closets and behind sofas. Photos and postcards sat in boxes everywhere. Nothing got thrown away. Even the dust had dust. Her home was a database, and her brain was the ultimate computational instrument, an informational repository without parallel in our family.
Helen's knowledge of the world seemed to extend way beyond the bounds of her 1730 home. When I was young, I sat in awe of her voluminous and expansive mind never daring to question or challenge any of her positions. But as I grew into adolescence I began wondering if some of her statements weren't maybe a little made up, or at least extrapolations of things she knew into things she thought she knew or could know with just a little imagination. But woe to you if you challenged her without some backup, because she sure did know a lot and her mind was so sharp you could be reduced to blabbering in a microsecond if you hadn't done your homework and researched the topic.
But when I got to about 20, attending college - the place you went to get important information before Google put it on our smartphones in the subway - I started to learn that lots of what Helen said wasn't quite the way she said it. It wasn't that it was completely wrong, it's just that it wasn't really always black and white the way she presented it. There were lots of different ways you could see and interpret the information. And you could construct a perfectly valid and well thought out argument that tied her up in intellectual knots. And back at the farm that summer we had some great arguments. Fact is, Helen was often at least partially right and wrong about a lot of things. Not philosophically wrong, because that's a matter of belief.
Factually in error, but never in doubt.
Her conviction was the secret of her intellectual strength. We've all known people like Helen, and many of you who know me are probably already murmuring "ahh, that's where he got that..." But I didn't bring up this point to wax about my family heritage or personality. I brought it up because this characteristic is one we find every day in our organizations, in the newspapers, on the web, in our governments. People develop points of view and stick to them, and getting people to see beyond their point of view is really a challenge. It isn't that the information is wrong, it's that people interpret it the way they see the world.
Information itself is a human creation. The computer didn't put it there. It isn't immutable, dirty until cleaned, chaste, pure, imperfect until perfected. It is a reflection of us, and since we created it, it's sometimes wrong or the truth is at best a mixed result.
But what's to blame for that? Your Metadata? Your Business Glossary? Data Architecture? Security & Privacy? Audit? Your Organization?
YES! All of the above. Everyone who creates and uses information is involved in its interpretation and implementation. You don't have to be a data architect to influence the way information is used in an organization. Any iPhone or Android user has a role in information management today. Bloggers, vloggers, and photographers shape and shade their creations to effect a mood, sell a product, influence an outcome. Everyone with a data connection is a source and a target and we all must accept responsibility for how we govern the use of OUR information.
Those consultants who tell you how to "govern the data" with all those tools are not helping anyone but themselves. Tools like Business Glossaries, Metadata workbenches, Master Data Management, Data Quality Profiling, and Audit help us understand when our information is out-dated, inaccurate, partially true, or just plain balderdash. We use those tools to illuminate the dark corners where opinions and habits force difficult debates to unlock the truth because we know that Information is the only tool we have to change behavior.
Want to succeed with Information Governance? Get Aware. Know what's happening and share it. Use your Information Governance tools to build operational awareness.
People will change their opinions when confronted with a solid argument, and that's what you want - Change from Information.
Fact is, I learned a lot from my aunt Helen and I still hear her voice strong as ever. Sometimes wrong, never in doubt.
IBM has been at the forefront of the Information Governance movement since the formation of the IBM Data Governance Council in early 2005. For the past six years we've worked closely with industry-leading companies from around the world to tackle the biggest challenges associated with governance.
Around the world, our clients are at varying stages of recognizing the necessity of Information Governance and implementing guidelines, standards, and policies. If you or others at your company have started conversations on this topic, then this event is for you!
We would like to invite you, and two of your colleagues who are information stakeholders in your company, to participate in a workshop that will help you build an effective Information Governance program:
- Define your needs
- Benchmark your organizational maturity
- Define your organizational structures, methodologies, and tools
- Develop new insights and build a system for Information Governance
In this hands-on workshop, participants will be taken through four of the Information Governance capabilities, and asked to rank their organizations according to the maturity levels defined in the Information Governance Maturity Model. All rankings are confidential and you can take home what you start and complete it later with your colleagues at your convenience.
Who should attend:
- CIOs and senior IT Executives
- Business Analysts and subject matter experts
- Executives involved in compliance and data protection
- Data or Information Stewards, Directors of Data Governance, and Data Architects
- Consultants and IBM Business Partners
The goal of this workshop is to educate and improve. Participants will meet other practitioners and gain valuable insights through comparative discussions of common challenges. New insights will be shared with the global Information Governance Community, inspiring new ideas and topics.
I hope to see you there.
IBM Information Governance Workshop
Manila is an ancient Spanish colonial city with American influences and a culture all its own on the rim of Asia. It takes several visits to appreciate that despite appearances and a host of American shops, businesses, and call centers, Manila is not a larger Honolulu, and the Philippine people are not just nicer Hawaiians. The culture, like the heat, is soft and pervasive and gently unique. The foreign influences, like the rain during the early June rainy season, hide behind clouds.
Two weeks ago I made my third trip to Manila, and hosted a Data Governance Council Maturity Model workshop in a modern hotel conference room for 25 customers spread across ten round tables. In my eight-hour presentation, I integrated the Maturity Model into the Six Steps to Smart Governance using both OpenOffice and the IBM Application Roadmap Tool (ART). Customers used laptops with the ART tool running to score their respective levels of maturity and I explained how the Maturity Model provides benchmarks to assess current and desired states of Maturity from which the Six Steps can be used to govern the use of data in a more scientific and repeatable way.
I've given these two presentations often, mostly in shorter conference presentations, but at least 12 times a year if not more. I constantly update my presentation with current examples and anecdotes to keep the material fresh but also to keep myself fresh and avoid the self-boredom of redundancy. But to each new audience, the material is fresh and I'm always amazed at how the Maturity Model transforms conversations from abstract theory to relevant practice.
I present five to seven charts then go to the ART tool and we run through three to six sub-categories of the model. Organizational Structures/Summary, Data Quality/Processes, Stewardship/Accountability, Risk Management/Accountability. During these phases I read the content for each level of Maturity and simulate current and desired states by moving the slider bars over. Most of the audience hears my words and ignores my gestures. They are engulfed in a personal assessment of their own Data Governance maturity. Huddled over the laptops, they discuss their perceptions of the model levels, argue about what the terms mean, relate the observed behaviors of 50 companies in North America and Europe to their own habits.
It is fascinating to watch! They don't want to move forward to new categories, as each level brings forward painful memories of immature practices, problems long festering needing change, and the re-awakening that they too are immature and can change with an external assessment.
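The self-assessment the audience works through - scoring current and desired maturity per category, then looking at the gap - can be sketched in a few lines. The category names, scores, and function below are hypothetical illustrations, not the actual model content or the ART tool's implementation:

```python
def maturity_gaps(current: dict, desired: dict) -> dict:
    """Given current and desired maturity scores (1-5) per category,
    return the per-category gap - the raw material of a roadmap."""
    gaps = {}
    for category, now in current.items():
        want = desired.get(category, now)  # unscored categories default to no gap
        if not (1 <= now <= 5 and 1 <= want <= 5):
            raise ValueError(f"maturity scores must be 1-5: {category}")
        gaps[category] = want - now
    return gaps

# Hypothetical self-assessment for two of the model's categories.
gaps = maturity_gaps(
    current={"Data Quality": 2, "Stewardship": 1},
    desired={"Data Quality": 4, "Stewardship": 3},
)
```

The point of the exercise is less the arithmetic than the argument it provokes: two colleagues rarely score the same category the same way, and reconciling those scores is where the governance conversation starts.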
Four years after its creation by a group of 50 visionary Data Governance Council members, the Maturity Model still inspires and provides fresh evidence of its value and relevance. It excites audiences all across the world, and as a benchmarking tool there is no comparison. Every time I do this I wonder to myself how this material can excite as it does. But it is the common awareness of ad-hoc, episodic, IT adventures, crises, and budget constrained fixes over decades that motivates people to realize that their situations are not unique and that only systemic solutions will work.
After all these years, Data Governance is a real global market and the real work to make it a success just now begins.
Thank you Manila.
In 2004, when I hosted the first Data Governance Forum at Mohonk Mountain House, I had three teams of IBMers developing the narrative discussions for three tracks on a common use case. The tracks were called "Infrastructure," "Policy," and "Content." The use case was "Data Supply Chain." The Forum had two days of meetings stretched across three, starting in the afternoon of the first, going until lunch of the last. On the only full day, we hosted the three breakout meetings, and each team worked to integrate their track discussions around the use case. The use case came from some business process definitions the software group had developed for business component models, something to do with insurance claims processing. As it turns out, we had only one or two insurance companies at the event, and we spent more time focusing on the track headings and business process model than on the idea of a Data Supply Chain. A conference is always the product of the people and the ideas in a room, regardless of what one puts on the agenda. And at this first event, when most of us only had the most vague understanding of what "Data Governance" was or could be, business processes were familiar and Data Supply Chains were distant.
Three weeks ago, I hosted another Data Governance Forum at Mohonk Mountain House. It was again two days of content stretched across three, and again a very diverse group of people came together to produce discussions that were engaging, powerful, and divergent from what was planned on the agenda. In three breakouts on "Data," "Risks," and "Governance," the panelists and audience exchanged ideas and I ran back and forth between the breakout rooms to listen, learn, and occasionally drive the conversations. What I heard among talks about Data as an Asset, Risk Taxonomies, Governance models, and Security & Privacy, was the loud echo of Data Supply Chains reverberating off the walls. It was like an archetype of the first meeting, the temporary suspension of historical time, as if in all these years of Data Governance we had lost the original truth, like a spring that disappeared underground, rediscovered at the source.
Every company does Data Governance today, for ill or good, with intent or dystopia. Every company also has at least one, but often many more, supply chains. These are real supply chains that may only stretch across one or two towns or six continents. Supply chains link producers, distributors, and consumers. They enable outsourcing and resourcing. And they have been a fixture of modern business since business became modern in the mid-1970's. And with disciplines like Six Sigma, large multi-national supply chains enable massive economies of scale with quality control that previously were only available to the largest organizations with fixed multi-year labor contracts.
Today every organization also has large distributed Data Supply Chains. Some parts may be automated, others batch, and still others quite labor intensive. The variety and function are often the ugly mess the CIO would not like the world to see. And with rare exception, they are not "governed" with anywhere near the same quality control and rigor as are real supply chains. When an oil company puts down a new oil terminal, well defined engineering processes are used to map out every step of production from well head to refinery. If Data Supply Chains are intended to capture the same kinds of flows with information, the methods used are mostly ad-hoc, one-off, dependent upon the project leader, never to be repeated again. And the result today is that companies have tens, hundreds, and even thousands of ad-hoc supply chains designed individually, some existing in their original state for decades. The disconnects create massive inefficiencies, quality control problems, and functional friction.
What every company should be doing is inventorying their existing Data Supply Chains and begin re-engineering. There should be one Data Supply Chain engineering standard. And each new real-world supply chain should include a well defined process to create a logical and efficient Data Supply Chain that monitors itself. This is not a small undertaking. But we can't create a Smarter Planet full of sensors and instruments to monitor the changes in our real world if we do not also monitor and instrument, standardize and re-purpose, the changes in our own enterprise.
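One way to picture a Data Supply Chain "that monitors itself" is a pipeline where every stage records its own quality check instead of passing bad data downstream silently. The sketch below is my own invention - the stage names, checks, and structure are hypothetical illustrations, not an engineering standard from this essay:

```python
def run_supply_chain(record, stages):
    """Toy self-monitoring data supply chain: each stage transforms the
    record and runs a quality check, and every check result is written
    to an audit trail. Processing stops at the first failed check so
    defective data does not flow to downstream consumers."""
    audit = []
    for name, transform, check in stages:
        record = transform(record)
        ok = check(record)
        audit.append((name, ok))  # the chain instruments itself
        if not ok:
            break
    return record, audit

# Hypothetical two-stage chain: normalize a customer name, then validate it.
stages = [
    ("normalize", lambda r: r.strip().title(), lambda r: True),
    ("validate", lambda r: r, lambda r: len(r) > 0),
]
result, audit = run_supply_chain("  jane doe ", stages)
```

The design point is the audit trail: a chain that reports on itself at every stage is one that can be benchmarked, compared, and re-engineered against a single standard, rather than rediscovered project by project.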
Every time I speak to an IT audience, I ask "What is Data Governance?" Of course the audience has come to hear me tell them the answer if they do not already know. But I'm more interested in what my audience thinks. Invariably, the answer has words like "Policy Enforcement," "Control," and "Compliance" in it. And to me what this reflects is a desire among IT professionals to expunge chaos and confusion from their world and create order, stability, and simplicity. Perhaps this is very human, but I think our desire to transform complexity to simplicity focuses far too much energy and attention on the world "To Be," or perhaps even on the world "Never-To-Be."
I think we need to spend more time focusing on the world "As Is," the one with dirty, grimy, confusing, and complex Data Supply Chains that are not yet instrumented, monitored, or in any way Smart. It is this world that needs the bright white light of assessment, discussion, policy, implementation, audit, and dynamic steering. This dark and dishonorable world "As Is" is the past most of our Data Governance programs struggle to change in the present with business plan funding for the future. It needs new methods that monitor the information flowing through its electronic veins, real-time auditing of the tools that are used to change it, and brand new business intelligence solutions that analyze past performance, compare them to current conditions, and predict blockages and failures.
In 2009, at the Mohonk Mountain House, back in the place where it all began, surrounded by 52 Data Governance Thought Leaders, I saw again the source that we mistook all these years - Data Governance is a quality control discipline for the Data Supply Chain.
Fix the world that is.
Does the European Union "promise to be true in good times and in bad, in sickness and in health?" Will the Union survive the current Debt Crisis and become more integrated or will it break apart under the pressure and allow insolvent states to exit the common currency?
Can the United States maintain its high standard of living and reduce its debt burden at the same time?
You may read these questions in the press every day and never believe they have everything to do with Data Governance, but they very much do. Governments make tactical decisions every day to increase debt amounts by small fractions thinking that their incremental spending is nothing in comparison to what others have done in the past - failing to see the correlations between current consumption and long term systemic instability.
With 7 billion people on the planet Earth, our societies have become so complex it is impossible with past methods of governance to foresee how policies impact even the smallest ecosystems. So we rely on blunt cause and effect relationships to over-simplify our options and fit our ideas into media soundbites. And the result is non-correlated policies that are anything but smart or predictive.
We seek to change this. We know that without new tools and techniques to see beyond the next effect, every cause will yield policies that fail. We are the IBM Data Governance Council and we see that Data is the raw material of the Information Age and that effective Governance relies on conceptual thinking, integrated approaches, correlated analysis, and a relentless search for truth.
We call this Predictive Governance and this meeting will explore what this means, how it works, and how we as a Community can create predictive models that:
1. See the Relationships between Data Quality and Security & Privacy and Data Architecture and ILM and Metadata and Audit and Reporting and Stewardship and Policy and Organizational Awareness and Business Outcomes - the Forest and the Trees in our Information Ecosystems.
2. Model and Simulate how new integrated policies, people and technologies are available to Govern in these complex Ecosystems.
3. Understand and articulate these relationships to laymen who only see the problems at hand and have no patience for larger integrated discussions.
Please join us for this important two-day event. Participation is open only to members of the IBM Data Governance Council. Organizations wishing to join the Council may sign up for this event and execute a Council Agreement in New York at the meeting.
I am a relative newcomer to System Dynamics. I first learned about systems thinking from Helmut Wilke, a German professor who wrote a book called Smart Governance about systems of governance and their influence on society. I met Professor Wilke in Cologne in 2007 and was so impressed with his ideas that I used his book in a course I was teaching with Christa Menke-Suedbeck at the Bucerius Law School in Hamburg, Germany.
A few years later, a colleague introduced me to some work IBM did with the City of Portland to build a very large SD simulation enabling urban planners to understand how even the smallest policy changes had ripple effects across many municipal departments, neighborhoods, families, and individuals. We created that simulation using Vensim and Forio, and I was immediately captivated by the potential to model and simulate the impact of policy on complex environments.
For over 15 years, I've been an inventor and market builder at IBM. In 1996, I invented Internet Insurance, persuading AIG, Reliance National, Chubb, Codan, and other insurers to invest in developing internet exposure coverage products and underwriting capabilities so that businesses could depend on insurance coverage as they expanded commercial operations online. In 2001, I led a team of IBMers to create the Enterprise Privacy Architecture, a patented methodology for embedding privacy policies and obligations into business processes. In 2004, I founded IBM's Data Governance Council and led an international group of 60 companies to create the Data Governance Maturity Model, a vast piece of commonly developed IP that benchmarks Data Governance behaviors across 11 categories and 5 levels of maturity. In 2009, I hosted a series of roundtable forums with large banks, the SEC, and the Federal Reserve as we explored the causes and effects of the Credit Crisis and what new standards in risk calculation and expression could be developed to mitigate future crises. And in 2010, I created the Information Governance Community to publish the Maturity Model under an open source license and invite a global community to work with IBM, the Data Governance Council, and many new leaders in developing a larger market for Information Governance and a new leadership role called the Chief Data Officer.
I love building markets through international collaboration, and this is why I have urged and lobbied iseeSystems, Ventana, Forio, AnyLogic, IBM, and the SD Society to embrace an open standards process at OASIS. SD is a complex discipline that is hard to learn and hard to use. It has grown in episodes over the past 50 years, but it has never really broken out of its strong academic foundations. At first, I thought I could help it grow through the Information Governance Community. In 2011, I held a series of informational webinars on SD, the City of Portland project, and some work Steve Peterson had done on urban violence in South Boston. Michael Bean from Forio.com gave us generous amounts of his time to educate our community in how SD works, how models are built, and how simulations can be used to test strategic ideas and transform organizations. Some of our community members built Data Governance models in Vensim and tested them online in Forio.
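For readers unfamiliar with how such models are built: at its core, an SD model is a set of stocks changed over time by flows, integrated numerically. The sketch below is a hypothetical, toy example (the "awareness" model and all its numbers are invented for illustration, not taken from any Council model); it mimics the Euler integration that tools like Vensim apply to a single goal-seeking stock.

```python
# Toy stock-and-flow model integrated with Euler's method, the default
# numerical scheme in SD tools such as Vensim. The model itself
# (an "organizational awareness" stock fed by training effort) is
# hypothetical, invented purely to show the mechanics.

def simulate(start=0.0, stop=24.0, dt=0.25):
    """Return (time, stock) trajectories for a one-stock goal-seeking model."""
    awareness = 10.0        # stock: organizational awareness (arbitrary units)
    target = 100.0          # aux: desired awareness level
    training_rate = 0.1     # aux: fraction of the gap closed per month
    times, levels = [], []
    t = start
    while t <= stop:
        times.append(t)
        levels.append(awareness)
        inflow = training_rate * (target - awareness)  # flow: goal-seeking
        awareness += inflow * dt                       # Euler integration step
        t += dt
    return times, levels

times, levels = simulate()
print(f"awareness at month {times[-1]:.0f}: {levels[-1]:.1f}")
```

The stock rises quickly at first and then levels off as it approaches the target - the classic goal-seeking behavior that even this three-variable model reproduces, and the kind of dynamic intuition the webinars tried to convey.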
But widespread adoption eluded us. You can have great webinars with great content and discussions, but that doesn't mean everyone understands what you are talking about. I saw many of my members thinking about systems, but not in a dynamic, SD way. They took the words we used to mean different things and found the math content totally confusing. After six months of work, I had to admit my efforts at Community education were not succeeding.
Undeterred, I started talking about the need for SD Open Standards. In the IT world, Open Standards are a way to spread adoption among vendors because they lower proprietary barriers to entry in new markets. They enable better software solutions, which end-users appreciate. And the process of Open Standards consideration and specification approval helps build market demand. As early as 2011, I saw clearly that SD lacked a robust IT vendor community. Five or six small vendors providing software modeling tools constituted a niche market that was not growing.
In 2012, I met iseeSystems at the System Dynamics Conference in St. Gallen. My participation in the conference was very last minute. St. Gallen isn't close to anything in Switzerland, it was summer, and I didn't want to travel. But boy am I glad I did. For three days, I saw incredibly thought-provoking, transformational work in every industry, all using a common SD methodology. I speak at many conferences throughout the world, and you rarely see so many interesting presentations across so many diverse industries expressed in a common way.
I was blown away by the quality of the content but, sadly, equally depressed by the complete lack of business participation. The conference was run by academics for academics. I was the only representative from a large IT vendor. There were no banks, insurance companies, oil and gas companies, utilities, governments, or even Big 4 consultants attending. The SD Society had held a conference in 2011 in Washington, DC, so I asked the organizers how many attendees had come from the federal government. The answer was hardly any. Why the heck not, I asked. The answer: no one had thought to prioritize their participation as a target audience. The target audience was local universities.
If the purpose of the SD Society is to serve the university marketplace with educational offerings and knowledge transfer, mission accomplished. If the purpose is to grow the industry and attract business audiences, current approaches are inadequate.
This is where OASIS comes in. Following St. Gallen, I went to work persuading my colleagues in IBM that an Open SD standard based on iseeSystems' XMILE could help grow business demand for SD simulations. The open standards process would attract new ideas to SD and open the SD Society to new ideas as well. But it took a lot of persuading. I had to sell a vision internally that SD concepts could be used with Big Data analytics to illustrate policy options on complex ecosystems. I had to tell my colleagues that an open standard would allow IBM to embed SD vocabulary in other modeling tools such as WebSphere Business Process Modeler, Rational Method Composer, and ILOG. And I had to demonstrate that our investment would be modest, the risk small, and the potential payoff reasonable. It took me a year to find the sponsorship I needed to persuade our Standards Committee to approve IBM's sponsorship of the OASIS TC.
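For readers who have not seen it, XMILE is an XML vocabulary for describing stocks, flows, and auxiliaries so that models can move between tools. The fragment below is a minimal, hypothetical sketch of that shape (the model and its names are invented for illustration); the TC's published specification is the normative reference for the actual schema.

```xml
<!-- Hypothetical minimal XMILE model: one stock, one flow, one auxiliary.
     Illustrative only; see the OASIS XMILE specification for the
     normative element names and schema. -->
<xmile version="1.0" xmlns="http://docs.oasis-open.org/ns/xmile/v1.0">
  <header>
    <name>awareness_example</name>
    <vendor>example</vendor>
    <product version="1.0">example</product>
  </header>
  <sim_specs>
    <start>0</start>
    <stop>24</stop>
    <dt>0.25</dt>
  </sim_specs>
  <model>
    <variables>
      <stock name="Awareness">
        <eqn>10</eqn>
        <inflow>training</inflow>
      </stock>
      <flow name="training">
        <eqn>training_rate * (100 - Awareness)</eqn>
      </flow>
      <aux name="training_rate">
        <eqn>0.1</eqn>
      </aux>
    </variables>
  </model>
</xmile>
```

Because the equations and structure live in plain XML rather than a proprietary binary format, any vendor's tool - or a business process modeler - can read, write, or embed them, which is the point of standardizing at OASIS.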
And that brings us to where we are today. We have a TC. We have a vision for XMILE. These are table stakes. A TC is a sales effort, and we must now expand our market of members to be global, business oriented, diverse, and inclusive. Over the next 24 months, we have to expand TC membership to 70. I'd like to see representation from North America, South America, Asia, Africa, and Europe. I see my job on this Technical Committee as helping to expand customer demand for SD solutions and build a far larger market than exists today.
We are not just building a technical standard. We are building a market and I will continue to engage my peers to expand the use of XMILE worldwide as we work to develop an Open Standard for System Dynamics at OASIS.