On the way back from Singapore, I bunked with the CFO of a large consumer goods company. After talking about family, kids, and ambitions after the rat race, we got to talking about energy policy and changes in global manufacturing.
There was a time when companies like his built plants to last 50 years. Nowadays, plants are built for 10 years at most, and companies move manufacturing around the world based on a variety of factors:
- Tax abatements
- Cheap Labor
- Proximity to Local Markets
- Energy Costs
The last of these, energy costs, is changing consumer behavior across the world, and it is also having an impact on corporations. Labor rates have been rising in China and India for some time, and high energy prices are negating the labor advantages in those countries. Manufacturers are looking to move plants closer to source markets to minimize transportation costs.
That means that manufacturers will be moving plants and jobs closer to the places where they sell but also closer to places with cheap energy, like Iceland with limitless free geothermal, or Arizona with lots of sunlight.
In Europe there are towns that have become energy self-sufficient and sell the excess to the power grid. Companies will look for locations like that to relocate and cities, counties, and towns should start investing in renewable energy quickly if they want to be attractive to companies wishing to relocate jobs from abroad.
This is the upside to expensive energy, and it should have every community in America and across the world scrambling to invest in solar, wind, water, and geothermal: not only because it is good for the environment and can save homeowners money, but also because it can help bring back many manufacturing jobs lost to cheap labor, which is now not so cheap and comes with expensive transportation to source markets.
Of course, other countries with sensible and farsighted energy policies will be the big winners. Iceland, for example, despite its depressed currency has nearly limitless geothermal reserves that will attract many manufacturers.
But America can catch up, and cities should now be discussing policy changes to become renewable energy exporters.
Adler on Data Governance
From archive: May 2008
DataGovernor
The Financial Times reported today:
"Three Nordic central banks unveiled an unprecedented €1.5bn emergency funding package on Friday to support Iceland’s troubled currency and stabilise its banking system as the tiny north Atlantic nation tries to fend off the effects of the global credit crisis.
The plan allows Iceland’s central bank to acquire up to €500m ($775m, £400m) each from the central banks of Sweden, Denmark and Norway in the case of an emergency, the first time the region’s central banks have joined forces to help a troubled neighbour."
This story illustrates the downstream impact of polluted data in the global economy. But of course, for the rest of us not living in Iceland, the global credit crunch has impacted our lives in other indirect ways. Since September 2007, when the US Federal Reserve started cutting interest rates in a drastic program that shaved 3.25% off the Discount Rate in 7 months, the price of oil (valued in depreciated dollars) has increased nearly 60%, from $80 to $127 per barrel. Food costs have skyrocketed, and countries around the world are challenged to find credit for government bonds. Inflation, thanks to Subprime, is a growing threat to the world economy and to the lives of poor people living at the edge of subsistence.
But how is this related to Toxic Content and Data Governance, you ask?
Well, of course the public Subprime narrative states that Banks invested in fancy hybrid home loans extended to subprime borrowers and created inherent risk in the market that was compounded through exotic derivatives that no one understood. This is partially true, and many banks have since admitted that they had poor internal risk governance.
But there is another part of the story that doesn't attract as much publicity. In 2005, at the peak of the Housing Bubble in the US, Alan Greenspan went before Congress to declare that the US housing market was "frothing." At about the same time, US Regulators decided to relax underwriting guidelines on new mortgage applications for a key segment of the marketplace - self-employed individuals.
Self-employed individuals face a moral hazard when they apply for a mortgage. This hazard is well known in the residential mortgage marketplace. It occurs when a self-employed individual has to demonstrate their income to obtain a loan. People who are employed by big companies get direct deposit pay checks and have income tax statements which closely match their real income. Self-employed individuals don't get regular pay checks and have tax statements that, shall we say, may frequently differ from real income.
This is especially true for the segment of the population that is paid in cash. Producing documentation of "real" income for these people is a challenge that typically caused the loan underwriting process to take longer for self-employed individuals than for employed ones. And in 2005, as housing prices peaked and interest rates rose .25% each month, mortgage volume started to decline, and for some reason US regulators chose to remove income documentation standards for the self-employed. From that time forward, they only had to make an income declaration.
Case in point. I have a friend who is a mortgage broker. He had a customer who owned a Pizza Parlor and wanted to buy a house. This customer had a good credit score and was a prime buyer. His Loan-to-Value Ratio was good. As a self-employed individual he was paid in cash, and he declared his income to be $10K a month. But when my friend input the numbers into the super-fast online loan application, it turned out that his debt ratio was too high. He had some car loans and credit card debt that put the ratio above 41%, and the loan could not get through. So my friend simply changed his declared income to $12K per month and the loan got approved.
In 2007, what I described above was a compliant business process for a self-employed mortgage loan application. Income only had to be declared, not verified. In fact, by this time most banks in the marketplace had automated underwriting applications that turned out a rate quote in 40 seconds for conforming rate mortgages. But what was obviously dangerous about this process is that the Pizza Parlor owner made an income declaration without documentation. $10K might have been his best income in his best month of the year. $12K per month might have been his fantasy income. Maybe his real income was closer to $8,500 a month.
But now he owns a home with an adjustable rate mortgage that he can barely afford at the current rate he's paying and certainly can't afford when the rate adjusts up.
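The broker's arithmetic can be sketched in a few lines. This is a hypothetical illustration, not an actual underwriting system: only the 41% cutoff and the two declared incomes come from the story, while the $4,500 monthly debt figure and the function names are invented.

```python
# Hypothetical sketch of the debt-ratio check in the story above.
# Only the 41% cutoff and the declared incomes come from the anecdote;
# the $4,500/month debt figure and these names are invented.

MAX_RATIO = 0.41  # conforming-loan debt-ratio cutoff cited in the story

def debt_ratio(monthly_debt_payments: float, declared_monthly_income: float) -> float:
    """Total monthly obligations divided by declared gross monthly income."""
    return monthly_debt_payments / declared_monthly_income

def loan_passes(monthly_debt: float, declared_income: float) -> bool:
    """True if the application clears the automated debt-ratio check."""
    return debt_ratio(monthly_debt, declared_income) <= MAX_RATIO

# Assume ~$4,500/month in obligations (mortgage payment + car loans + cards).
print(loan_passes(4500, 10_000))  # declared $10K/month -> ratio 0.45, rejected
print(loan_passes(4500, 12_000))  # "revised" $12K/month -> ratio 0.375, approved
```

With no documentation requirement, nothing anchors the declared income to reality, so the same debts pass or fail purely on what the broker types in.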
This is a story that was repeated thousands of times in 2005-7, which is one reason why delinquency and foreclosure rates on those vintages of prime AND subprime loans are at 12-16%.
The fateful regulatory decision in 2005 to relax documentation standards in loan underwriting allowed vast amounts of Toxic Loan Content (poisonously polluted data) to enter the banking system through automated underwriting systems that got their business rules from the regulators. That created systemic risk that was entirely opaque to the MBS issuers, Rating Agencies, CDO issuers, and the marketplace. And unless credit risk is transparent to investors, the market can't price risk correctly and default is an inevitable outcome.
By 2005, banks were already aware of rising risk from documented subprime loans and were raising interest rates to compensate for it. They just weren't aware of the undocumented risks, which left their reserves deficient to cover their exposures.
But it didn't have to end this way. If regulators in 2005 had just left underwriting documentation regulations in place, or even strengthened them, the Housing market would have seen a soft landing and the credit crisis would not have happened.
I see quite a few important lessons here for the future:
1. Regulations are not Holy Scripture. Automating Compliance in IT can just as easily automate exposure as it can value.
2. We must learn to measure data quality and validate before we trust it. Data can pollute our businesses, our societies, and our lives and we must invest in methods and technologies to certify its quality on a continual basis to enhance and protect the value of our businesses.
3. The marketplace needs new tools to measure and price business risk. Regulators should not measure risk in businesses and force process changes. This is a reactive and inefficient method.
4. Transparency creates its own rules. Businesses should be required to report and capitalize (self-insure) their risks regularly to the marketplace so that self-regulating market economics can arbitrate between good business stewardship and folly. That arbitration will be reflected in stock prices, which is far more efficient than regulatory sanctions.
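Lesson 2 above, measuring and validating data before trusting it, can be sketched as a simple record validator. The field names and rules below are invented for illustration; the undocumented-income check mirrors the verification standard regulators dropped in 2005.

```python
# Hypothetical sketch of validating data before trusting it.
# Field names and rules are invented; they mirror the loan story above.

def validate_application(app: dict) -> list[str]:
    """Return a list of data-quality issues; an empty list means the record can be trusted."""
    issues = []
    if app.get("declared_monthly_income", 0) <= 0:
        issues.append("missing or non-positive income")
    if not app.get("income_documents"):  # the verification step regulators dropped in 2005
        issues.append("income declared but not documented")
    if app.get("debt_ratio", 1.0) > 0.41:
        issues.append("debt ratio above 41% cutoff")
    return issues

# A declaration-only application passes the ratio check but fails validation.
app = {"declared_monthly_income": 12_000, "income_documents": [], "debt_ratio": 0.375}
print(validate_application(app))  # -> ['income declared but not documented']
```

The point of such checks is continual certification: a record that clears the business rule (the ratio) can still be untrustworthy as data.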
We will need many new Data Governance Solutions to help banking institutions across the world adjust to the increased scrutiny the post-Subprime world will bring. But most of all, we will need international forums, like the Data Governance Council, to discuss these issues and bring different perspectives forward, because this crisis was eminently avoidable. And it is only through communication that we can develop more mature practices to prevent it from happening again.
DataGovernor
Last week, I hosted a Data Governance Council Meeting in Kuala Lumpur. 34 participants from around the world attended and we had some excellent discussions about a wide variety of Data Governance challenges. While political structures in ASEAN companies are more rigid than in other parts of the world, the basic issues we discussed were the same challenges I've heard other companies talk about for the past four years.
Globalization means that change has a new velocity. Ideas travel the world from one geography to another at a much faster rate today, and that makes regional differences in the way things work less a factor in international business.
Case in point. During the Council meeting, one member from ASEAN talked about the challenges they had getting bank branches to address Data Quality at source. Branches are measured on Sales, and IT could not get branches to care about the poor quality of data input as a result of new account sales. Other participants chimed in and talked about various technical solutions for cleaning data after it was input. They stressed the need to make upper management aware of the problem. The member was not appeased. He had tried all those solutions and none had worked.
Next we called a member from the US and asked him if he also had this problem and what he would do to solve it. He replied that his bank had indeed confronted that challenge and they solved it by convening their Data Governance Board and discussing the issue with the HR Compensation Committee.
Their solution was to change the way branches were compensated by rewarding cross-selling to existing accounts as much as selling to new accounts. This simple change in the compensation plan made branches more acutely aware of the impact of their own bad data quality practices on their own ability to cross-sell into existing accounts. That in turn led branches to improve their data quality at source, and the entire institution saw a net gain in Value Creation as a result.
For me, this story perfectly illustrated that Data challenges need common Governance institutions to discuss complex technology issues and discover solutions that may hinge on business solutions and benefits. It also demonstrated the value of an international forum like the IBM Data Governance Council, where companies from across the world can share common problems and relate ingenious solutions - thereby hastening the velocity of change.
DataGovernor
In the Data Governance Community there continues to be confusion about how to Govern Data. Let me be clear: You Can't.
Data is dumb. It has no life, no self-interest. Its needs do not conflict with others'. It forms no self-organizing factions, and its vices require no political appeasement.
People can be governed, and the goal of Data Governance is to affect organizational behavior and build accountability over, with, and against Data.
Today, most Data Governance initiatives begin with a Board, a political institution with cross-organizational representation ("factions," as James Madison called them). This board should evaluate complex issues with normalized assessment processes, providing a common forum to air issues, explore challenges, and render policy decisions and revisions...
"And time yet for a hundred indecisions, And for a hundred visions and revisions, Before the taking of a toast and tea." - The Love Song of J. Alfred Prufrock, T.S. Elliott
This process, imperfect as it is, requires some governing maturity to dynamically steer organizational behavior, and it should really be seen as an initial step on the road to enlightened Data Governance. A key inhibitor in this process is the lack of organizational data reporting - data about what is going on, how policies are being implemented, organizational roadblocks, stewardship challenges, etc.
But to Govern People well, we need better Data. Better Data not only to inform Data Governors and Stewards, but also to inform People to make better Data Governance decisions on their own.
Case in point: Energy Conservation. Might seem a little off topic, but I think the analogy is apropos.
Denmark today is the most energy efficient country in the world. Despite the fact that it is a net oil exporting nation, thanks to rich oil deposits in the Danish sector of the North Sea, Denmark gets a substantial share of its energy needs from renewable sources such as Wind, Geothermal, Solar, etc.
Cars and petrol are heavily taxed, but people have choices. Despite 220% car taxes and 50% petrol taxes, people can choose, if they can afford it, to drive big Jeep Grand Cherokees with V8 motors or tiny Fiat diesels.
Taxes on cars, registration fees, insurance, and even petrol are weighted based on carbon emissions, and the emission information is published along with the tax rates.
This is a Governance Policy that leverages Information to inform decision-making. Obviously the Danish Government is working to change people's behavior by associating higher taxes with higher carbon emissions, but people still have a choice. They can pay more to drive more, and the State takes in that extra tax and uses it to subsidize renewable energy sources.
The point here is that we can attempt to Govern People using Data by gathering organizational information and leveraging it for informed decision-making by the Board, or we can pass it on to the People, as "Tax" or "Subsidy", and let human self-interest create an internal market for decision-making about value and risk.
"Bad decisions" can be taxed at a higher rate than "Good decisions," and the excess remittances can be used to fund "clean data" initiatives.
More on this topic in my next blog...