In the past 18 months, I've written extensively on the need for risk measurement standards, but this IBM Blog interface doesn't make it easy to find all the articles. To make that easier, I've collected the salient points into a chronology that illustrates the pieces and how they fit together.

February 7, 2008: Risk Data for Macroeconomic Governance
http://www-128.ibm.com/developerworks/blogs/page/adler?entry=risk_data_for_macroeconomic_governance
On September 18, 2007, the US Federal Reserve cut the Federal Funds rate by half a percent in response to the looming sub-prime loan scandal. The markets had lost confidence, and banks were holding debt they could not sell. Write-offs ensued, and the market forecast looked questionable at best.
At the time, this rate cut was seen as a dramatic response to worsening market conditions and proof that the Fed would act aggressively to protect the economy from the housing bubble. In the next two months, the Fed intervened again to cut rates .25% in October and .25% again in December. Each rate cut was seen as a prudent response to market conditions.
In January 2008, just a few weeks after the last rate cut, the Fed had to intervene again with a very sudden 1.25% cumulative rate cut to stem an Asian-driven equity market sell-off following more sub-prime write-offs and loss disclosures. In just five months, the Federal Reserve had to intervene five times with a combined interest rate cut of 2.25%, following 17 consecutive quarter-point rate increases.
This was an incredible see-saw of macro-economic policy - gradual rate increases were followed immediately by sudden rate cuts.
In hindsight, the half-point cut in September 2007 was not very dramatic in comparison to the 1.75% in cuts that followed in the next four months. No one then could have foreseen the volatility in the markets that was to come. Or could they?
Why is it that the US Federal Reserve rate policy was reactive to market volatility? Why didn't their monetary policy, which had run up rates from 1% in June 2003 to 5.25% in
June 2006, anticipate the looming housing bubble and bank losses that would surely ensue? Hadn't Alan Greenspan warned of this outcome in 2005? Didn't we all know the housing joyride would end at some point?
Today, we can see banking and financial market data that shows the risk trends in our rear-view mirror. Unfortunately, no one has a mirror that forecasts the future - but they could, if capitalized risk data were collected on a systemic basis by banks and shared with the Federal Reserve. The Federal Reserve does an excellent job of studying catastrophic risks and running sophisticated macroeconomic loss models on everything from terrorist attacks to coastal hurricanes. The Fed uses this catastrophic loss data to capitalize insurance-like loss reserves for the US economy - i.e., it prints more money when very bad things happen.
The insurance reserves were tapped after 9/11 and Hurricane Katrina, when the Fed injected huge amounts of liquidity into the economy to stabilize markets and restore confidence. Of course, the timing of catastrophic events can't be forecast, but the monetary response can be estimated based on a variety of risk factors. The Fed constantly analyzes and wargames these risk factors, and the success of its liquidity and monetary responses to 9/11 and Katrina attests to the diligence of its planning and the value of risk-based forecasting models.
What does this have to do with the sub-prime loan meltdown, you ask?
Well, if the Fed had non-catastrophic risk-data forecasting models, it could possibly pre-empt loss events with macroeconomic policy tools that even out some of the worst aspects of the business cycle. Unfortunately, that kind of non-catastrophic risk data has to come from banks, who until recently were totally incapable of providing it, let alone using it themselves for their own risk-based policy-making.
That's changing. In the last two years, banks around the world have been working to assess and collateralize market, credit, and operational risks as part of the Basel II compliance process. That data isn't normalized across banks, and there are wide disparities in how risks are assessed, calculated, and capitalized from bank to bank and country to country. But the raw data, and the beginnings of the know-how, are there for the first time in history. And that data and know-how can be leveraged to provide new macroeconomic tools for Central Bank policymakers around the world.

What's needed are standards in risk assessment, classification, calculation, and the reporting of capitalized risk data from US banks to the Federal Reserve. This may take some years yet to accomplish, but the time is right to begin discussing these issues. As US banks reach Basel II compliance, they will be in a position to leverage risk data for their own self-insurance against non-catastrophic losses, and if they are willing to share their capitalized risk data, they can help the Federal Reserve reduce market volatility and improve macroeconomic performance for everyone.
Here's a case where regulatory compliance really can improve business performance.
April 11, 2008: Subprime is a Data Governance Challenge
http://www-128.ibm.com/developerworks/blogs/page/adler?entry=subprime_is_a_data_governance
The IMF put out the Global Financial Stability Report last week and it contains a very accurate and sobering description of the systemic failures involved in the Subprime Financial Crisis. It has an institutional focus, and makes some solid observations and recommendations.
The entire report is worth a read, but the Executive Summary contains most of the key points if you just want the meat of the matter:
http://www.imf.org/External/Pubs/FT/GFSR/2008/01/index.htm
I will summarize the findings and recommendations that have Data Governance implications:
"The events of the past six months have demonstrated the fragility of the global financial system and raised fundamental questions about the effectiveness of the response by private and public sector institutions. While events are still unfolding, the April 2008 Global Financial Stability Report (GFSR) assesses the vulnerabilities that the system is facing and offers tentative conclusions and policy lessons.
Some key themes that emerge from this analysis include:
• There was a collective failure to appreciate the extent of leverage taken on by a wide range of institutions—banks, monoline insurers, government-sponsored entities, hedge funds—and the associated risks of a disorderly unwinding.
• Private sector risk management, disclosure, financial sector supervision, and regulation all lagged behind the rapid innovation and shifts in business models, leaving scope for excessive risk-taking, weak underwriting, maturity mismatches, and asset price inflation."
What follows are a number of short- and medium-term recommendations relevant to the current episode. Several other groups and fora—such as the Financial Stability Forum, the Joint Forum, the Basel Committee on Banking Supervision—are concurrently developing their own detailed standards and guidance, much of which is likely to address practical issues at a deeper level than the recommendations proposed below.
In the short term...
The immediate challenge is to reduce the duration and severity of the crisis. Actions that focus on reducing uncertainty and strengthening confidence in mature market financial systems should be the first priority. Some steps can be accomplished by the private sector without the need for formal regulation. Others, where the public-good nature of the problem precludes a purely private solution, will require official sector involvement.
Areas in which the private sector could usefully contribute are:
• Disclosure. Providing timely and consistent reporting of exposures and valuation methods to the public, particularly for structured credit products and other illiquid assets, will help alleviate uncertainties about regulated financial institutions’ positions.
• Overall risk management. Institutions could usefully disclose broad strategies that aim to correct the risk management failings that may have contributed to losses and liquidity difficulties. Governance structures and the integration of the management of different types of risk across the institution need to be improved. Counterparty risk management has also resurfaced as an issue to address. A re-examination of the progress made over the last decade and gaps that are still present (perhaps inadequate information or risk management structures) will need to be closed.
• Consistency of treatment. Along with auditors, supervisors can encourage transparency and ensure the consistency of approach for difficult-to-value securities so that accounting and valuation discrepancies across global financial institutions are minimized. Supervisors should be able to evaluate the robustness of the models used by regulated entities to value securities. Some latitude in the strict application of fair value accounting during stressful events may need to be more formally recognized.
• More intense supervision. Supervisors will need to better assess capital adequacy related to risks that may not be covered in Pillar 1 of the Basel II framework. More attention could be paid to ensuring that banks have an appropriate risk management system (including for market and liquidity risks) and a strong internal governance structure. When supervisors are not satisfied that risk is being appropriately managed or that adequate contingency plans are in place, they should be able to insist on greater capital and liquidity buffers.
In the medium term...
More fundamental changes are needed over the medium term. Policymakers should avoid a “rush to regulate,” especially in ways that unduly stifle innovation or that could exacerbate the effects of the current credit squeeze. Moreover, the Basel II capital accord, if implemented rigorously, already provides scope for improvements in the banking area. Nonetheless, there are areas that need further scrutiny, especially as regards structured products and treatment of off-balance-sheet entities, and thus further adjustments to frameworks are needed.
The private sector could usefully move in the following directions:
• Standardization of some components of structured finance products. This could help increase market participants’ understanding of risks, facilitate the development of a secondary market with more liquidity, and help the comparability of valuation. Standardization could also facilitate the development of a clearinghouse that would mutualize counterparty risks associated with these types of over-the-counter products.
• Transparency at origination and subsequently. Investors will be better able to assess the risk of securitized products if they receive more timely, comprehensible, and adequate information about the underlying assets and the sensitivity of valuation to various assumptions.
• Reform of rating systems. A differentiated rating scale for structured credit products was recommended in the April 2006 GFSR. Also, additional information on the vulnerability of structured credit products to downgrades would need to accompany the new scale for it to be meaningful. This step may require a reassessment of the regulatory and supervisory treatment of rated securities.
• Transparency and disclosure. Originators should disclose to their investors relevant aggregate information on key risks in off-balance-sheet entities on a timely and regular basis. These should include the reliance by institutions on credit risk mitigation instruments such as insurance, and the degree to which the risks reside with the sponsor, particularly in cases of distress. More generally, convergence of disclosure practices (e.g., timing and content) internationally should be considered by standard setters and regulators.
• Tighten oversight of mortgage originators. In the United States, broadening 2006 and 2007 bank guidance notes on good lending practices to cover nonbank mortgage originators should be considered. The efficiency of coordination across banking regulators would also be enhanced if the fragmentation across the various regulatory bodies were addressed. Consideration could be given to devising mechanisms that would leave originators with a financial stake in the loans they originate."
New standards and banking practices will clearly be needed moving forward. But we already have most of the regulations we need to mitigate most risks identified in the report. Indeed, one of the great ironies of the crisis is how little Banks used their own fraud and risk management systems to catch underwriting errors and omissions in Loan Origination applications, House Assessments, risk capitalization, etc.
I suspect that the IMF's warning on regulation will not be heeded in Washington, though I do hope regulators will listen to the seasoned advice of some Data Governance veterans, because this is a crisis with so many Data Governance challenges.

May 17, 2008: Nordic banks step in to back Iceland
The Financial Times reported today:
"Three Nordic central banks unveiled an unprecedented €1.5bn emergency funding package on Friday to support Iceland’s troubled currency and stabilise its banking system as the tiny north Atlantic nation tries to fend off the effects of the global credit crisis.
The plan allows Iceland’s central bank to acquire up to €500m ($775m, £400m) each from the central banks of Sweden, Denmark and Norway in the case of an emergency, the first time the region’s central banks have joined forces to help a troubled neighbour."
http://www.ft.com/cms/s/0/56a76dd4-2327-11dd-b214-000077b07658.html
This story illustrates the downstream impact of polluted data in the global economy. But of course, for the rest of us not living in Iceland the global credit crunch has impacted our lives in other indirect ways.
Since September 2007, when the US Federal Reserve started a drastic program of rate cuts that shaved 3.25% off the Federal Funds rate in seven months, the price of oil (valued in depreciated dollars) has increased nearly 60%, from $80 to $127 per barrel. Food costs have skyrocketed, and countries around the world are challenged to find credit for government bonds. Inflation, thanks to Subprime, is a growing threat to the world economy and to the lives of poor people living at the edge of subsistence.
But how is this related to Toxic Content and Data Governance, you ask?
Well, of course the public Subprime narrative states that Banks invested in fancy hybrid home loans extended to subprime borrowers and created inherent risk in the market that was compounded through exotic derivatives that no one understood. This is partially true, and many banks have since admitted that they had poor internal risk governance.
But there is another part of the story that doesn't attract as much publicity. In 2005, at the peak of the Housing Bubble in the US, Alan Greenspan went before Congress to declare that the US housing market was "frothing." At about the same time, US Regulators decided to relax underwriting guidelines on new mortgage applications for a key segment of the marketplace - self-employed individuals.
Self-employed individuals face a moral hazard when they apply for a mortgage.
This hazard is well known in the residential mortgage marketplace. It occurs when a self-employed individual has to demonstrate their income to obtain a loan. People who are employed by big companies get direct deposit pay checks and have income tax statements which closely match their real income. Self-employed individuals don't get regular pay checks and have tax statements that, shall we say, may frequently differ from real income.
This is especially true for the segment of the population that is paid in cash. Producing documentation of "real" income for these people is a challenge, one that typically caused the loan underwriting process to take longer for self-employed individuals than for employed ones.
And in 2005, as housing prices peaked and interest rates rose .25% at each successive Fed meeting, mortgage volume started to decline, and for some reason US regulators chose to remove income documentation standards for the self-employed. From that time forward, they only had to make an income declaration.
Case in point. I have a friend who is a mortgage broker. He had a customer who owns a Pizza Parlor and wanted to buy a house. This customer had a good credit score and was a prime buyer. His Loan-to-Value Ratio was good. As a self-employed individual he was paid in cash, and he declared his income to be $10K a month.
But when my friend input the numbers into the super-fast online loan application, it turned out that his debt ratio was too high. He had some car loans and credit card debt that put the ratio above 41%, and the loan could not get through. So my friend simply changed his declared income to $12K per month and the loan got approved.
In 2007, what I described above was a compliant business process for a self-employed mortgage loan application. Income only had to be declared, not verified.
In fact, by this time most banks had automated underwriting applications that turned out a rate quote in 40 seconds for conforming-rate mortgages. But what was obviously dangerous about this process is that the Pizza Parlor owner made an income declaration without documentation. $10K might have been his best income in his best month of the year. $12K per month might have been his fantasy income. Maybe his real income was closer to $8,500 a month.
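To make the arithmetic concrete, here is a minimal sketch of the debt-ratio check in the story above. The 41% cutoff comes from the anecdote; the monthly payment figures are invented for illustration.

```python
# Hypothetical sketch of the debt-to-income (DTI) check described above.
# The 41% threshold comes from the story; the payment amounts are assumed.

def dti_ratio(monthly_debt_payments, declared_monthly_income):
    """Back-end DTI: total monthly obligations / gross monthly income."""
    return monthly_debt_payments / declared_monthly_income

DTI_LIMIT = 0.41  # the cutoff cited in the story

# assumed mortgage payment + car loans + credit cards = $4,300/month
debts = 2600 + 900 + 800

print(f"At $10K/month declared: {dti_ratio(debts, 10_000):.1%}")  # 43.0% -> declined
print(f"At $12K/month declared: {dti_ratio(debts, 12_000):.1%}")  # 35.8% -> approved
```

With nothing but a declared number on one side of the division, the "ratio" is only as trustworthy as the declaration - which is the whole point of the story.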
But now he owns a home with an adjustable rate mortgage that he can barely afford at the current rate he's paying and certainly can't afford when the rate adjusts up.
This is a story that was repeated thousands of times in 2005-7, which is one reason why delinquency and foreclosure rates on those vintages of prime AND subprime loans are at 12-16%.
The fateful regulatory decision in 2005 to relax documentation standards in loan underwriting allowed vast amounts of Toxic Loan Content (poisonously polluted data) to enter the banking system through automated underwriting systems that got their business rules from the regulators. That created systemic risk that was entirely opaque to the MBS issuers, Rating Agencies, CDO issuers, and the marketplace. And unless credit risk is transparent to investors, the market can't price risk correctly and default is an inevitable outcome.
By 2005, banks were already aware of rising risk from documented subprime loans and were raising interest rates to collateralize their risk. They just weren't aware of the undocumented risks, which left their reserves deficient to cover their exposures.
But it didn't have to end this way. If regulators in 2005 had just left underwriting documentation regulations in place, or even strengthened them, the Housing market would have seen a soft-landing and the credit crisis would not have happened.
I see quite a few important lessons here for the future:
1. Regulations are not Holy Scripture. Automating Compliance in IT can just as easily automate exposure as it can value.
2. We must learn to measure data quality and validate before we trust it. Data can pollute our businesses, our societies, and our lives and we must invest in methods and technologies to certify its quality on a continual basis to enhance and protect the value of our businesses.
3. The marketplace needs new tools to measure and price business risk. Regulators should not be the ones measuring risk in businesses and forcing process changes; that approach is reactive and inefficient.
4. Transparency creates its own rules. Businesses should be required to report and capitalize (self-insure) their risks regularly to the marketplace so that self-regulating market economics can arbitrate between good business stewardship and folly. That arbitration will be reflected in stock prices, which is far more efficient than regulatory sanctions.
We will need many new Data Governance Solutions to help banking institutions across the world adjust to the increased scrutiny the post-Subprime world will bring. But most of all, we will need international forums, like the Data Governance Council, to discuss these issues and bring different perspectives forward, because this crisis was eminently avoidable. And it is only through communication that we can develop more mature practices to prevent a repeat in the future.

July 14, 2008: The US Federal Reserve Needs Data Governance
http://www-128.ibm.com/developerworks/blogs/page/adler?entry=the_us_federal_reserve_needs
The US Federal Reserve announced new mortgage lending standards today that are designed to address so-called deceptive business practices among lenders.
Those measures include:
• Bar lenders from making loans without proof of a borrower's income.
• Require lenders to make sure risky borrowers set aside money to pay for taxes and insurance.
• Restrict lenders from penalizing risky borrowers who pay loans off early. Such "prepayment" penalties are banned if the payment can change during the initial four years of the mortgage. In other cases, a penalty cannot be imposed in the first two years of the mortgage.
• Prohibit lenders from making a loan without considering a borrower's ability to repay a home loan from sources other than the home's value.
The borrower need not have to prove that the lender engaged in a “pattern or practice” for this to be deemed a violation. That marks a change — sought by consumer advocates — from the Fed’s initial proposal and should make it easier for borrowers to lodge a complaint.
“Rates of mortgage delinquencies and foreclosures have been increasing rapidly lately, imposing large costs on borrowers, their communities and the national economy,” Mr. Ben Bernanke, the Federal Reserve Chairman, said.
“Although the high rate of delinquency has a number of causes, it seems clear that unfair or deceptive acts and practices by lenders resulted in the extension of many loans, particularly high-cost loans, that were inappropriate for or misled the borrower,” he added.
Excellent. Markets around the world can feel confident again that the US Federal Reserve has rooted out the major mortgage lending problems confronting the US Economy and has the entire situation under control.
It is beyond shocking that deceptive lending practices like this even exist in the most "efficient mortgage market in the world" (according to a 2006 IMF Mortgage Market Survey). What's more shocking is that the Fed knew about these practices, had data attesting to their impact on rising rates of mortgage fraud going back to 2005, and did nothing about it until today.
And how do I know that, you ask?
Well, the Fed's own economists put out an insightful summary of what went wrong in the current credit crisis, and you can read it here:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1020396
The first draft of this report was published in October 2007. And in perfect hindsight, the economists concluded,
"Were problems in the subprime mortgage market apparent before the actual crisis showed signs in 2007? Our answer is yes, at least by the end of 2005. Using the data available only at the end of 2005, we show that the monotonic degradation of the subprime market was already apparent. Loan quality had been worsening for five consecutive years at that point. Rapid appreciation in housing prices masked the deterioration in the subprime mortgage market and thus the true riskiness of subprime mortgage loans. When housing prices stopped climbing, the risk in the market became apparent."
Now, the US Federal Reserve would not be the first organization in history to have better hindsight than foresight, but shouldn't we expect them at least to move faster with policy controls when the global credit market is facing the second worst crisis in history? If it takes the Federal Reserve two years to study market data and write a telling report more than three months after the crisis has hit, and eight months more to digest it and issue lending guidelines to restrict fraud in the mortgage marketplace, how long exactly will it take them to react when Hank Paulson consolidates all financial regulation in their hands?
To me this story offers some important lessons that I do hope Congress recognizes:
1. Regulatory Consolidation is not a panacea. Consolidated bureaucracies do not historically produce operational efficiency. Witness the Department of Homeland Security and the performance of FEMA during Hurricane Katrina.
2. Data is useless without people empowered to act. The Fed had ample data to control mortgage lending fraud and prevent the worst aspects of the current credit crisis and it either chose not to act or its internal governance is so poor that there was no mechanism in place to forecast non-monetary economic risks and make micro-policy adjustments.
3. More important than regulatory consolidation, Congress should review the operational procedures and data governance practices at the Fed itself. A GAO Audit of Fed operational procedures and internal, below the Board, decision-making would be a great start!
4. The Subprime Credit Crisis was preventable! The Fed had the data and they had the economic skills to use it. They proved today that they even had the regulatory mandate to effect the necessary changes in lending guidelines.
Congress, the US public, and the world at large have a right to know what took them so long to use their own data before we entrust that organization with even more regulatory responsibility.
From where I am sitting, the Fed really needs Data Governance.

September 18, 2008: Enterprise Risk Management is a Myth
The stunning market downturn this week has revealed the breathtaking lack of Enterprise Risk Management in Banks, Brokers, and Washington.
Let me be clear; Risk Taking is not the same as Risk Management.
We've had 7 years of extraordinary risk taking in our mortgage markets, financial markets, tax policies, and war policies.
No one, in Banking or Government, can say with any confidence what their risk profile is, calculate probability of loss, or forecast future exposures.
This makes all claims of Enterprise Risk Management one of the most stellar myths in modern business. Anyone telling you they are doing it is either lying or a rare breed indeed.
I do hope that someone is recording the history of losses at the institutions failing and surviving every day. Because the historical study of that record of loss - and new regulations mandating real Enterprise Risk Management and loss capitalization - is the only thing that will prevent such losses from happening again.
November 20, 2008: Three Things Basel Forgot
Today, The Basel Committee on Banking Supervision announced a new strategy to address shortcomings in its own global regulatory structure. The proposal creates new capital requirements, leverage ratios, and risk measurements designed to more carefully regulate banking practices across the globe.
The proposal includes the following elements:
• strengthening the risk capture of the Basel II framework (in particular for trading book and off-balance-sheet exposures);
• enhancing the quality of Tier 1 capital;
• building additional shock absorbers into the capital framework that can be drawn upon during periods of stress and dampen procyclicality;
• evaluating the need to supplement risk-based measures with simple gross measures of exposure in both prudential and risk management frameworks to help contain leverage in the banking system;
• strengthening supervisory frameworks to assess funding liquidity at cross-border banks;
• leveraging Basel II to strengthen risk management and governance practices at banks;
• strengthening counterparty credit risk capital, risk management and disclosure at banks; and
• promoting globally coordinated supervisory follow-up exercises to ensure implementation of supervisory and industry sound principles.
Strengthening liquidity and solvency requirements seems like a regretful afterthought during a time of historically low liquidity and high insolvency, but better late than never. One does wish that the Basel Committee had applied these measures as forethoughts rather than afterthoughts, but that's human nature.

There are three elements missing that I hope to see emerge in 2009:
1. A Global Loss History DB of anonymous credit, market, and operational incidents, events, and losses from every Basel-conforming institution. Individual institutions do not have enough loss history of their own past exposures and "claims" to trend and forecast. Industry and geographic loss information is needed to better inform decision-making at banking institutions. Third-party loss data is available to every insurance company for all lines of business; only the banking community could conceive of risk measurement programs without third-party institutional validation.
The Operational Risk Exchange (ORX) has been aggregating operational risk loss data among the 41 banks who participate in that consortium for three years. The model is valid, but the sample size is too small, even for ORX. I hope the Basel Committee sees ORX as a valid archetype that should be replicated worldwide, with each Central Bank collecting anonymized loss data from each member institution and sharing it worldwide, so that all financial institutions can compare their own loss trends to global trends and forecast future exposures more accurately.
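As a sketch of what such pooling could enable - the sample data, field layout, and bank identifiers below are invented for illustration, not the ORX format - once anonymized events are aggregated, any member institution can compare its own loss trend against the pooled industry trend:

```python
# Illustrative sketch of pooling anonymized loss events so one bank can
# compare its loss trend against the industry. All data is invented.
from collections import defaultdict

# (anonymous_bank_id, year, loss_in_millions)
events = [
    ("bank_a", 2006, 12.0), ("bank_a", 2007, 30.0),
    ("bank_b", 2006, 8.0),  ("bank_b", 2007, 22.0),
    ("bank_c", 2006, 15.0), ("bank_c", 2007, 41.0),
]

def yearly_totals(events, bank=None):
    """Sum losses per year, for one bank or (bank=None) the whole pool."""
    totals = defaultdict(float)
    for b, year, loss in events:
        if bank is None or b == bank:
            totals[year] += loss
    return dict(totals)

industry = yearly_totals(events)             # pooled view for the central bank
mine = yearly_totals(events, bank="bank_a")  # one member's private view

# Is my loss growth outpacing the pool's?
my_growth = mine[2007] / mine[2006]
pool_growth = industry[2007] / industry[2006]
print(f"my growth {my_growth:.2f}x vs pool {pool_growth:.2f}x")
```

The point of the pooled view is exactly the comparison in the last lines: without industry totals, a single bank has no baseline against which to judge whether its own trend is normal or alarming.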
2. An XBRL Taxonomy for Risk Reporting. Banks can't report loss events without a global taxonomy that lets everyone agree on what to call things and what things mean when they are reported. Even within banks, the word Risk has many different meanings to different people. For business people, Risk is an omnipresent feature of life, an attribute used to calculate potential returns or losses in investments. Many business careers are made by taking risks. For an IT person, Risk is something to be avoided at all costs, the result of flaws in architecture that lead to vulnerabilities and loss. Many IT careers are lost by taking risks.
Business and IT can sit at the same table and have exhaustive conversations about Risk, each thinking they understand the other, and walk away having fundamentally different idiomatic understandings of what was discussed. That misunderstanding is often a source of new risk.
XBRL (Extensible Business Reporting Language) is an XML language for describing business terms, and the relationship of terms, in a report. It enables semantic clarity of terminology, and that clarity is absolutely essential for the accurate recording and reporting of credit, market, and operational incidents, loss events, and losses.
A Risk Taxonomy is like an alphabet - the letters alone convey no meaning, but they are the foundational elements that allow humans to understand each other. We desperately need a new alphabet to describe Risk - incidents, events, losses, claims, exposures, forecasts, reserves - so that firms everywhere can aggregate loss information, analyze it with standard actuarial methods, compare past exposures to present conditions and opportunities, and forecast potential outcomes to illuminate options.
A year ago, I wrote on this page about the need for new macro-economic tools to enable Central Banks to measure aggregate risk taking in the financial world. An XBRL Taxonomy of Risk is a fundamental building block to enable interoperability and standard practices in the measuring and reporting of risk. Those standards in turn will enable Central Banks to manage vast databases of loss history and trend analysis that will inform policymakers and member banks to make better decisions that produce better returns. We will still need new information management software and governance models to make sure the right information gets to the right people at the right time, but none of that is possible without a standard alphabet and vocabulary to describe what's being recorded and read.
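To illustrate the idea in miniature (and without the full XBRL machinery of schemas and linkbases), here is what tagging a single loss event against a shared vocabulary might look like. Every element name here - riskEvent, riskCategory, lossAmount, and so on - is hypothetical; defining the real taxonomy is precisely the work of the initiative.

```python
# Toy sketch of tagging one loss event with a shared, XBRL-style
# vocabulary. All element and attribute names are hypothetical.
import xml.etree.ElementTree as ET

event = ET.Element("riskEvent", id="evt-2008-0042")
ET.SubElement(event, "riskCategory").text = "operational"
ET.SubElement(event, "lossAmount", currency="USD").text = "250000"
ET.SubElement(event, "eventDate").text = "2008-09-15"
ET.SubElement(event, "businessLine").text = "retail-banking"

xml_str = ET.tostring(event, encoding="unicode")
print(xml_str)
```

Because every reporting institution would use the same tags with the same agreed meanings, a central bank receiving thousands of such filings could parse, aggregate, and trend them mechanically - the interoperability that free-text risk reports can never provide.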
Recently, I announced an IBM Data Governance Council initiative to develop an XBRL Taxonomy for Risk. We are inviting all interested parties - banks, broker/dealers, hedge funds, consortia, think tanks, and regulators - to participate in this initiative. We will be working closely with XBRL International and XBRL.US to share ideas in an open and transparent process to bring forward a standards proposal quickly. If you are interested in participating, please drop me a line.
3. Let's bring back Glass-Steagall. Gee, what a great idea. No leverage ratios needed, because investment banks can't leverage with bank deposits at all. Banks, brokerages, hedge funds, and insurance companies all need to have their activities segregated. It isn't enough to insist on new solvency, liquidity, and risk measures. We need to separate temptation from action. And when all three of these things are done - new solvency requirements to shore up assets on the balance sheet, risk taxonomies and loss history data to forecast future exposures, and Glass-Steagall V2 - we'll have risk tied up in a knot... until it's not.