How do you zero in on the right information to make the best decisions with all the technology, data, and new analytics techniques available to you today? How do you find opportunities or identify problems before anyone else?
We have entered the “Age of Required Knowledge.” With all the data available to us—internally and externally—employees and executives are expected to know. There is a “cost of not knowing.” Getting caught not knowing could lead to sensational news headlines accompanied by loss of shareholder value, loss of customers, and even industry fines.
So, how do you find opportunities or identify problems before anyone else? How do you avoid getting caught not knowing something you should? And how do you open the opportunity to make better data-driven decisions and find sources of competitive advantage? We studied market trends, interviewed hundreds of customers, and reviewed thousands of projects, and the common theme across all of it is: “smarts.”
When you look at business results, it will naturally lead to questions about why certain things are happening. It’s how you answer those questions that determines your level of competitive advantage. Companies that apply “smarts” to those questions are relying on cognitive services, machine learning, optimization, and pattern-based planning to drive sounder decisions and identify trends before they even know which questions to ask.
But it’s easy to be misled. Here are some concrete, public examples of organizations that “missed the boat” because their analytics deceived them. They illustrate four trends our research uncovered that were leading to disastrous results for prospects and clients.
1. Missing data
Your analytics can easily deceive you if you’re missing data. To make informed decisions, you need to synthesize all the relevant information into your decisions. Companies have made vast improvements in terms of incorporating internal data, but there is a risk of getting blindsided by information that can only be found in external data. Think weather, Dun and Bradstreet, economic indicators, or social media.
In April of this year, a major US airline had an unfortunate incident in which a passenger was physically removed from a plane. The company could see the metrics related to social media activity, but by the way they reacted, it became clear they were missing social sentiment information. The initial incident was rough enough, but the CEO made it worse with a cold, victim-blaming statement. The airline lost a billion dollars in market cap in under 8 hours—not because of the incident itself, but in reaction to the CEO’s statement blaming the passenger. If only they had been more attuned to the sentiment of their longtime customers, they could have reacted faster and headed off the stock slide. The cost of not knowing.
2. Incorrect data
Incorrect data can create deceptive analytics. Excel remains the BI tool of choice for many business users and analysts. So, there is no shortage of stories where a transposed number, missing decimal, or issue with a minus sign wreaked havoc. In fact, a Forbes article suggests that “Excel might be the most dangerous software on the planet.”
Today, we see manual checks built into processes that involve manual entry. However, the bigger problem lies in places where companies have outgrown legacy systems and use complex Excel models to perform calculations and transform numbers as part of a workflow. The very nature of Excel is that the calculation lives in each cell, and there is no mechanism to ensure accuracy. A complex workbook can contain thousands of calculations. A MarketWatch article titled “88% of Spreadsheets have errors” cautions that “Spreadsheets, even after careful development, contain errors in 1% or more of all formula cells.”
This is what happened at Fannie Mae, a large US-based financial services company that had outgrown its accounting system. They inserted a complex Excel model into a process that exported all open positions into Excel, priced them at current market rates, and fed the values back into the workflow. Unbeknownst to them at the time, an error in one of the calculations led to Fannie Mae overstating revenue by more than $1 billion. When they announced the correction, their stock dropped $2.25 per share. The cost of not knowing.
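One practical defense against this class of error is to recompute exported values with an independent reference calculation and flag any disagreement before the numbers re-enter the workflow. The sketch below is purely illustrative: the position fields, figures, and tolerance are invented for the example, not taken from the actual incident.

```python
# Hypothetical sanity check for spreadsheet-driven workflows:
# recompute each exported value independently and flag disagreements.
# All names and numbers here are made up for illustration.

def reprice(position, market_rate):
    """Independent reference calculation for one open position."""
    return position["quantity"] * market_rate

def audit(exported_rows, market_rate, tolerance=0.01):
    """Compare each exported value against the reference calculation."""
    discrepancies = []
    for row in exported_rows:
        expected = reprice(row, market_rate)
        if abs(row["exported_value"] - expected) > tolerance:
            discrepancies.append((row["id"], row["exported_value"], expected))
    return discrepancies

rows = [
    {"id": "A1", "quantity": 100, "exported_value": 102500.0},  # stale formula
    {"id": "A2", "quantity": 100, "exported_value": 102000.0},  # correct
]
print(audit(rows, market_rate=1020.0))  # only A1 is flagged
```

Even a check this simple turns a silent formula error into a loud, reviewable exception, which is the property the manual Excel workflow lacked.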
3. Misrepresented data
Analytics adoption began as IT-led, coordinated projects. But, as desktop and cloud applications made analytics more accessible to individuals, their use spread across the enterprise. Michael Goul is an Associate Dean for Research and Professor of Information Systems at Arizona State University’s W. P. Carey School of Business. He has spent the last few decades studying artificial intelligence and business analytics. He agrees that data science has the potential to revolutionize commerce. But, he also thinks too many companies are rushing headlong into the field without putting proper governance systems in place. In some cases, this has led to disaster. The next two examples of how analytics can deceive are related to governance.
Our third example, misrepresented data, demonstrates how a lack of analytics governance provides an opportunity for fraudulent behavior. Take this example from a rapidly growing media company. They expected their customer satisfaction, measured by NPS score, would likely dip during this period of rapid growth, and they were watching closely for an indication that they needed to ramp up their investment in customer service. However, as they grew, their NPS remained curiously constant. After 9 months, the CEO was so concerned that he brought in a consulting firm to see if they could explain the unchanging customer service levels during the expansion. It turned out that the person who created the dashboard for the executive team was using an ungoverned desktop BI solution and getting a bonus based on the NPS score.
When the score started to slip, the individual did not want to lose his bonus. So he opened the spreadsheet that fed the dashboard and nudged the number up manually. What started as a small adjustment grew to a 40% variance over the 9-month period. How many customers got frustrated with the declining level of service in that period and switched providers? Despite their attention to this metric, they got blindsided by not knowing.
4. Misleading data
The fourth way that analytics can deceive you is from the opposite side of misrepresented data. Executives rely on dashboards as single-screen “snapshots” of performance. But dashboards are not the magic view they might seem. Although they can convey important measures, dashboards cannot always provide the nuance and context necessary for effective decisions. The data can be 100% correct, but the visualizations can be very misleading.
Here’s an example from a Harvard Business Review article by Joel Shapiro, Executive Director of the Data Analytics program at Northwestern’s Kellogg School of Management. A large package delivery company wanted to reduce vehicle accidents by offering drivers the option to upgrade their GPS to a system that would help them avoid high-risk traffic areas. After monitoring driver behavior, a front-line manager checked the dashboard and found, to her surprise, that the accident rate was actually higher with the upgrade.
At first glance, it appears that drivers who upgraded their GPS were in more accidents. It would almost be a “no-brainer” for someone to suggest they go back to what they had. In reality, the upgrade was actually quite effective. The manager would have seen this had she compared accident rates for “safe” drivers versus “accident prone” drivers.
For both groups, the upgrade made them safer. So why did the accident rate increase for the entire fleet of drivers while decreasing for each group? Because in this case almost all of the accident-prone drivers chose to use the upgraded device and almost all of the safe drivers kept the old device. Preexisting driver behavior was confused with the effectiveness of the upgrade.
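This reversal, where the aggregate trend points the opposite way from every subgroup’s trend, is the classic Simpson’s paradox. The sketch below uses made-up driver and accident counts (not figures from the article) to show how the reversal arises when accident-prone drivers self-select into the upgrade.

```python
# Illustrative (made-up) numbers demonstrating Simpson's paradox:
# the upgrade lowers the accident rate within each group, yet the
# pooled rate for upgraded drivers is higher, because accident-prone
# drivers overwhelmingly chose the upgraded device.

groups = {
    # (drivers, accidents) with the old vs. upgraded GPS
    "safe":           {"old": (900, 18), "new": (100, 1)},
    "accident_prone": {"old": (100, 30), "new": (900, 180)},
}

def rate(drivers, accidents):
    return accidents / drivers

for name, g in groups.items():
    old_r, new_r = rate(*g["old"]), rate(*g["new"])
    assert new_r < old_r  # the upgrade helps within each group
    print(f"{name}: old {old_r:.1%} -> upgraded {new_r:.1%}")

# Pooled across the whole fleet, the comparison reverses:
old_total = rate(900 + 100, 18 + 30)   # 48 accidents / 1000 drivers
new_total = rate(100 + 900, 1 + 180)   # 181 accidents / 1000 drivers
print(f"fleet: old {old_total:.1%} -> upgraded {new_total:.1%}")
```

The safe group improves (2.0% to 1.0%) and the accident-prone group improves (30.0% to 20.0%), yet the fleet-wide rate for the upgraded device (18.1%) dwarfs the old device’s (4.8%), because group membership, not the device, dominates the pooled comparison.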
The really interesting thing in this case? The visualizations, and the data, were accurate. They just did not show the whole picture, because they weren’t looking at all the factors leading to accidents. As Shapiro puts it, “Perhaps the greatest danger in using dashboards for decision making is in misattributing causality when comparing elements on the dashboard.”
Analytics deception: What can you do?
These scenarios are real and they are happening every day. Might they be happening in your organization? Talk to me.
You can also check out IBM’s point of view about deceptive analytics in the video replay of Marc Altshuller’s and Joel Shapiro’s keynote from Analytics University in New Orleans.