This video is contributed by Jeff Jonas, IBM Fellow and Chief Scientist, IBM Entity Analytics
Data Modelers are routinely challenged as they attempt to integrate diverse enterprise-wide data. This is especially the case when this data contains natural variability. For example, one record says "Bob" and the other says "Robert." Sometimes this data contains unintentional errors, such as spelling errors or transposition errors. And some data contains intentional errors, professionally fabricated lies, for example a criminal fabricating a fake identity.
The Entity Analytics feature of IBM SPSS Modeler allows Data Modelers to overcome some of the toughest data preparation challenges with unprecedented ease. Using this feature you can generate high-quality analytic models, and as a result organizations gain better business outcomes, recognizing both risk and opportunity. In this video, Jeff shows us how!
IBM today announced the newest version of IBM SPSS Statistics software, its integrated family of products that addresses the entire analytical process, from planning to data collection to analysis, reporting and deployment.
The new enhancements in IBM SPSS Statistics v21 ensure that the most advanced analytics techniques are available to a broader group of business users, statisticians, analysts and researchers. Making it easier to access and manage big data, set up and perform analyses, and share results across the organization, IBM SPSS Statistics now includes:
- Simulation Modeling: Using Monte Carlo simulation techniques, users can now build better models and assess risk when inputs are uncertain.
- Advanced Techniques for Big Data: Quickly understand large and complex datasets using advanced statistical procedures that provide high accuracy and drive quality decision making.
- Improved Integration: Deploy analytics faster with seamless access to common data types and external programming languages, including Java and IBM Cognos business intelligence.
Monte Carlo Simulation
The new simulation modeling feature is designed to account for uncertainty in data inputs, such as determining how weather conditions affect energy consumption, how costs of materials (e.g., steel prices) affect profitability of a construction project, or to better understand risks around investment planning.
By using Monte Carlo simulation, the unknown inputs and historical distributions are used to create confidence intervals and visualizations (see graphic) to help make the best decision.
For example, an energy and utilities organization can run simulations on potential weather temperatures, compared against historical weather temperatures, to determine how much energy it would likely need to generate for an 85-degree day on August 31. This process can be repeated many times (typically thousands or tens of thousands of times), resulting in a distribution of outcomes so users can make the best decision.
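The mechanics of that example can be sketched in a few lines of plain Python. This is a generic Monte Carlo illustration, not SPSS code; the historical temperatures and the demand formula are invented for the sketch:

```python
import random
import statistics

def simulate_energy_demand(historical_temps, trials=10000, seed=42):
    """Monte Carlo sketch: sample plausible temperatures from a normal
    distribution fitted to history, convert each one to an energy-demand
    figure with a toy linear model, and summarize the outcomes."""
    rng = random.Random(seed)
    mu = statistics.mean(historical_temps)
    sigma = statistics.stdev(historical_temps)

    demands = []
    for _ in range(trials):
        temp = rng.gauss(mu, sigma)             # one simulated Aug 31 temperature
        demand = 500 + 12 * max(temp - 65, 0)   # toy cooling-load relation (MWh)
        demands.append(demand)

    demands.sort()
    return {
        "mean": statistics.mean(demands),
        "p5": demands[int(0.05 * trials)],      # 90% interval endpoints
        "p95": demands[int(0.95 * trials)],
    }

# Historical Aug 31 highs (degrees F) -- illustrative numbers only.
history = [82, 85, 79, 88, 84, 81, 90, 86, 83, 87]
summary = simulate_energy_demand(history)
```

The repeated sampling is what turns one uncertain input (tomorrow's temperature) into a distribution of outcomes with an interval around it, which is the decision-making payoff the article describes.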
Unlike other software packages, IBM SPSS Statistics doesn't force users to start from scratch, but allows them to leverage existing predictive models and existing data as the starting points for simulation.
IBM SPSS Statistics now makes working with big data easier and more scalable, and ensures optimal performance when working with multiple predictors. With the new data file comparison tool, users can compare datasets or data files to identify any discrepancies and ensure that the data values and records are compatible.
Users can now compare files for better quality control. For example, they can find discrepancies between datasets that contain responses by the same respondents to a survey but were entered by two different people.
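The idea behind such a comparison can be illustrated with a toy sketch. This is plain Python, not the SPSS Statistics tool itself; the key and field names are made up:

```python
def compare_datasets(file_a, file_b, key="respondent_id"):
    """Toy discrepancy report between two sets of survey records,
    matched on a key column."""
    a_by_key = {row[key]: row for row in file_a}
    b_by_key = {row[key]: row for row in file_b}

    report = {"missing_in_a": [], "missing_in_b": [], "value_mismatches": []}
    for k, row_a in a_by_key.items():
        row_b = b_by_key.get(k)
        if row_b is None:
            report["missing_in_b"].append(k)
            continue
        for field in row_a:
            if field != key and row_a[field] != row_b.get(field):
                # record (key, field, value in A, value in B)
                report["value_mismatches"].append((k, field, row_a[field], row_b.get(field)))
    report["missing_in_a"] = [k for k in b_by_key if k not in a_by_key]
    return report

# The same survey, entered by two different people:
entry_1 = [{"respondent_id": 1, "q1": "yes"}, {"respondent_id": 2, "q1": "no"}]
entry_2 = [{"respondent_id": 1, "q1": "yes"}, {"respondent_id": 2, "q1": "yes"}]
diffs = compare_datasets(entry_1, entry_2)
```

Here the report would flag respondent 2's answer to q1 as a mismatch between the two data-entry passes, which is exactly the kind of discrepancy the tool is meant to surface.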
Also, IBM SPSS Statistics now allows operations like sorts and aggregations to be pushed back to the database, where they can be performed faster. Temporary files created by analytical procedures can be distributed across multiple disks, and large files can be compressed to save disk space when sorting, improving performance and speeding up analysis.
For example, users can run multiple analytical jobs at the same time while continuing to work on their desktops at other tasks. Users can also continue to run server jobs while disconnected from the server without sacrificing the quality of their analysis or output, then reconnect to access their completed jobs.
With IBM SPSS Statistics, users can now use a Java plug-in to call IBM SPSS Statistics functionality from a Java application and have IBM SPSS Statistics output appear in the Java application.
Finally, IBM SPSS Statistics now provides the ability to easily import IBM Cognos business intelligence data for analysis. Users can now read custom data with or without filters, and import predefined reports from IBM Cognos directly into IBM SPSS Statistics.
Guest post from Jing Shyr, Chief Statistician & Distinguished Engineer, IBM Business Analytics
It's the age-old question: why did the chicken cross the road?
With one chicken, the answer is easy to compute.
But, what if there were millions of chickens crossing the road? And each chicken had a mobile device and was tweeting out its opinions, desires, likes/dislikes, photos, and detailed descriptions of what it had for breakfast that morning. Oh, and what if that road was being monitored by millions of sensors?
With current statistical techniques, it's no longer easy to quickly understand why each chicken decided to cross that road and, more importantly, predict when they might cross again.
The business analytics and statistics industry faces tough data analysis challenges in the coming years, including a skills shortage, the consumability of analytics, mobility and big data.
Having been around the analytics industry for many years, I find it refreshing to see businesses taking statistics and data mining results and injecting them directly into the business (and directly into the business process itself). The Catch-22 is that while more and more organizations are realizing the benefits of analytics, finding professionals who know how to capture and analyze the tsunami of data created daily still requires training and a unique skill set.
A recent McKinsey Global Institute report indicates that over the next seven years the need for highly skilled business intelligence workers in the U.S. alone will dramatically exceed the available workforce, by as much as 60 percent.
I often imagine a business analyst presenting results to an executive the same way I present to my students. When teaching a lesson on modeling, I often ask, "Do you see what I see?" Everyone stares with blank looks on their faces and says, "No! What do you see?"
Herein lies part of the problem. To help counteract the skills shortage, we have to make the software easier to use and consumable rather than strictly scientific. Communicating results is just as important as the results themselves. I strongly believe that statistical software needs to go through a revolution of its own and become as intuitive as a smartphone.
And speaking of smartphones...
Most statistical software produces an incredible number of very large tables and charts, making them extremely difficult to comprehend in a mobile environment. I torture my eyes every time I try to read a report on my BlackBerry.
Consumability means anywhere, anytime and through any device. It's time we hold statistical software to a higher standard.
Let me get back to the chickens for a moment.
The volume, velocity and variety of data today are overwhelming traditional statistical software. Not to be cliché, but Big Data is giving the statistics industry big problems.
Previously, if we wanted to analyze any data, we followed the same logical flow: decide what we want to predict or classify, then build a model by bringing in all the predictors (independent variables). The number of predictors was often well below 100.
Today, however, we are dealing with thousands of different variables, making traditional statistical analysis a serious hurdle. Machine capacity can no longer keep up, and many algorithms have been outpaced by the volume of data.
The challenge calls for a new process of data reduction before modeling, and for new computational algorithms that can handle millions of records and fields quickly in a distributed environment without passing the data back and forth multiple times.
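As a rough illustration of what "data reduction before modeling" can mean in its simplest form, here is a toy filter that keeps only the predictors most correlated with the target. Real tools use far more sophisticated methods, and all the data here is invented:

```python
import statistics

def reduce_features(records, target, keep=2):
    """Crude pre-modeling data reduction: rank candidate predictors by the
    absolute Pearson correlation of each field with the target, keep the
    top few, and drop the rest before any model is built."""
    def corr(xs, ys):
        mx, my = statistics.mean(xs), statistics.mean(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den if den else 0.0  # constant fields get correlation 0

    fields = [f for f in records[0] if f != target]
    ys = [r[target] for r in records]
    ranked = sorted(fields,
                    key=lambda f: abs(corr([r[f] for r in records], ys)),
                    reverse=True)
    return ranked[:keep]

data = [
    {"x1": 1, "x2": 5, "x3": 9, "y": 2},
    {"x1": 2, "x2": 1, "x3": 9, "y": 4},
    {"x1": 3, "x2": 8, "x3": 9, "y": 6},
    {"x1": 4, "x2": 2, "x3": 9, "y": 8},
]
kept = reduce_features(data, target="y")
```

With thousands of fields, a screening pass like this (or a much smarter one) is what makes the subsequent modeling step computationally feasible.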
Most importantly, we don't need to be chicken when it comes to Big Data.
Creating new statistical techniques for Big Data will get us all to the other side of the road, and you'll never have to ask why.
Guest post from Burke Powers, Managing Predictive Analytics Consultant, IBM Business Analytics
Today, every company of appreciable size has some social media presence. Most companies I speak with are either just monitoring social media or are engaged in "spray-and-pray" tactics that are only loosely tied to corporate goals.
To realize the value in social media it is important to integrate social media into broader customer analytics programs and business decision making.
Too often, companies ask, "What are customers saying about us?"
An objective like this is too vague to direct an analysis and identify actions. What we really need to be able to ask is, "Product XYZ will launch in two weeks. We have done A, B, and C campaigns to create awareness and to position the product.
"What kind of buzz (as measured by D, E, and F KPIs) has this created around each of our message points?
"Are there other topics that we did not anticipate?
"Can we set up real-time reporting of the topics so that we can monitor the customer reaction to the product once they begin using it?
"Can we monitor any emerging, unanticipated topics after the launch?"
The objective should focus on an area of the business where you are confident additional insight can lead to quick improvements. The best opportunity might be related to a product, the service level of a critical customer touch point, competitor actions, a specific brand attribute, or a customer behavior.
The sheer volume of social data requires some planning. There are a limited number of data aggregators (major aggregators include BoardReader, Gnip, & DataSift) and each comes with its own benefits and trade-offs.
To choose the aggregator that best fits your needs, weigh how important data history is, the cost of hosting the data, and the importance of access to all social media data (the full fire hose) versus sampling.
Second, decide whether to integrate additional data sources. Using the same filtering and reporting for social media and for survey verbatims makes them more comparable for analysis and reporting. Also consider whether to include internal social network data from Yammer or Lotus Connections.
3) Plan and Execute the Analytics
By its nature, social media data is going to be different from what most business analysts are used to analyzing. It is unsolicited and unstructured and tends to be rich in attitudinal and usage information. It is frequently strongly positive or strongly negative.
But, it provides tremendous value because it has rich customer narratives of every product feature and customer touch-point that no other data source can offer. It brings traditionally dry analysis to life for business decision makers.
Most existing social media analytics tools offer only a limited ability to search and trend terms as well as view some sort of sentiment. Some allow filtering by the source metadata as well. These are necessary elements of any serious analysis, but stop short of offering the tools needed to take the data to an actionable level.
To be truly useful across many parts of the business, the free-text data needs to be understood in context and translated into an accessible format for reporting and analysis. This capability is one of the strongest differentiators for IBM Cognos Consumer Insight.
4) Motivate Actions
Once the analysis is ready, it is time to deliver the information to the decision maker at the right time, in the appropriate context to make a decision, and in a persuasive manner.
Finally, be sure to include a rich narrative quote that illustrates the argument and provides an additional persuasive hook, augmenting the analysis and building buy-in from the "gut" of business leaders.
For example, let's say your company recently launched the "Wonder Widget." You are preparing the first report on how the product has been received by customers. Include a positive customer quote to support the data and drive the point home.
Ideally, the quote says exactly what your analysis leads to: "I love your new 'Wonder Widget,' it is already making a difference. Except for one thing: the XYZ dial has got to be moved closer to the display so that I don't have to look away. Fix this and I can easily justify ordering more units."
There are many social metrics that could be used, from the number of followers or tweets generated, to the ratio of issues resolved, to issues raised via social channels.
Additionally, you could track the results via click-throughs using IBM Coremetrics or email campaign response using IBM Unica.
You also might choose to experiment through customer support channels and monitor perceptions via both social media and surveys.
Finally, the metrics and actions need to be tied back to financial metrics either as revenue-generating or cost-reducing. This may require knowing the cost of resolving an issue via a social channel versus contact center or perhaps the cost of a response via one promotional channel versus another.
Identify a New Objective and Repeat
Now that we've gone through the process from beginning to end, it can be repeated with a new objective. A disciplined approach using these best practices will generate rapid returns on virtually any social media analytics endeavor.
For more information:
Read the whitepaper on techniques for gaining valuable customer insight with social media analytics
Guest post from Anuj Marfatia, Senior Market Manager, IBM Predictive Analytics Solutions
Usually when traveling for work or vacation, and right after takeoff, I undoubtedly begin to panic, much like the mother in Home Alone. I constantly worry that I left the garage door open, the iron on, my kids behind, food out for the dog, or, most importantly, whether I put my vintage 1963 Issue #4 Avengers comic book back in its protective cover (don't judge).
Beyond this unnecessary distress, protecting my armrest from the chatty passenger beside me, and browsing through SkyMall, I sometimes read the passenger safety document. Have you ever looked through one of these? It's unintentional comedy.
In the past few years, almost all airlines have included an "exercises" section in the pamphlet.
As an economy class passenger, I have to laugh at such pictorials, as most of these exercises (see image) are almost impossible given that my knees are already touching the seat in front of me. Now, if only I were a contortionist…
In all seriousness, do you know why these exercises are important?
Studies have shown that many emergencies and future health issues are correlated to inactivity while flying, and one in every 20,000 passengers has an in-flight emergency (source). One serious, yet preventable, issue is venous thromboembolism (VTE) that occurs when a blood clot in a leg vein (deep vein thrombosis or DVT) travels through the body to the lung.
Based on a BBC News report, some 75 percent of air-travel cases of VTE have been linked to lack of movement while in the air. I can sleep well knowing that economy passengers, like myself, are no more likely to develop clots than the more fortunate passengers in business or first class.
While I try to do some sit-ups, lunges, and pull-ups on the plane (kidding of course), it would be great if I knew how likely I was to get VTE or DVT or how much I would have to exercise to minimize my risk of attaining VTE or DVT.
Wouldn't it be cool if, while purchasing a ticket or at check-in, you were informed of the health risks on a certain flight? That would be a red-eye-opener.
While such a thought may seem like something from a science fiction movie, or something 100 years from now… think again. Predictive disease management actually exists today!
There is a lot of information about a patient that can be used (HIPAA-compliant, of course) to determine the likelihood of disease occurrence or treatment effectiveness.
Based on the study above and numerous other studies, drugs that doctors prescribe are still relatively ineffective.
Doctors do use their best knowledge and experience in most cases, but many times they are not utilizing ALL of the information that is available to them when making a decision about the patient. (This is also why some of the IBM Watson applications in healthcare are so interesting to watch.)
This is where predictive analytics comes into play. Predictive analytics software pulls in information from all the disparate data sources, such as health information systems, Excel, and even Facebook and Twitter (for those cases where you told your friends that last night's Ethiopian food left you 'indisposed').
The software enables healthcare organizations to transition to a new model and find more effective ways to treat patients and develop new treatment protocols. For example, a predictive outcome could be that Jane Doe has a 95 percent probability of positively reacting to a certain treatment, essentially increasing the quality of care and containing costs.
This is why I'm happy to hear that researchers at Hospital Santa Barbara, a research and treatment center in Spain, analyzed patient records and other research data to establish a new, reliable diagnostic model for DVT, enabling earlier diagnosis and treatment in high-risk patients. (Learn more about how Santa Barbara Hospital used IBM SPSS predictive analytics.)
While Spain may have its own economic issues, I'd like to thank its researchers for helping to begin the journey toward a DVT-free flight, so I can fly the friendly skies without worrying about my health.
Last Sunday was Father's Day. This is a paradoxical "holiday" in the U.S., as it is a day to honor fathers with gifts and food, but they are still required to work in the yard, fix stuff, yell at kids and run errands.
I received thoughtful, useful and handmade gifts from my three wonderful kids. They included a converter that lets me play my iPhone through the cassette tape deck in my car (needless to say, I'm not driving a 2012 model); a homemade comic strip card about mutant aliens; and a personalized gum wallet made of duct tape (see picture below).
The real challenge was what to get my father for Father's Day. In fact, I face this conundrum on every gift-giving occasion with my father.
As those of you with fathers can attest, the typical dad has everything he will ever need in his entire life by the age of 31, plus or minus two years. And I mean everything: tools, gadgets, sweaters and golf paraphernalia.
This personal challenge is what prompted me to use the recently released IBM Analytical Decision Management to provide a recommended action related to my gift selection. My strategic objective was to have my father accept and enjoy my gift.
Because we have been talking a lot about Customer Analytics, Next Best Action and IBM Signature Solutions at this year's IBM Business Analytics Analyst Summit (search #ibmbas12 on Twitter to follow the commentary), you can understand why I could easily configure my IBM Analytical Decision Management solution. (Hint: Replace 'father' with 'customer' and 'gift' with 'offer.')
The following were the steps to my recommended decision:
- Using years of historical fatherly gift-giving data (e.g., ties, golf shirts, jive coupons with the promise of a "car wash"), I restricted the analysis of my data so that the recommended action(s) would be based only on gifts given in the summer months (e.g., nothing with long sleeves).
- I also opted to exclude "no action" from the recommended action list, which is often a viable decision for retention offers but not for gift giving to my father, especially if I hope to stay in the Peckman will. Just kidding. Sorta.
- I defined the list of potential recommended outcomes linked to my objective: give a product, a service, or a combination of the two. Then I built new business rules and predictive models that were not in place the last time I used IBM Analytical Decision Management. For example, new rules:
If (golf_hndcp[current] > golf_hndcp[lastyear]) & (golf_complaints > 3) then add risk points;
If balance_giftcard > 0 then add risk points;
If (favorite_child[current_month] = me) then subtract risk points;
… and so on.
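For fun, the rules above could be sketched as ordinary code. The field names follow the post's pseudocode, but the point values are invented, since the post never specifies them:

```python
def score_gift_risk(profile):
    """Tongue-in-cheek translation of the post's business rules.
    Point magnitudes are arbitrary illustration choices."""
    risk = 0
    # Golf handicap got worse AND he's complained about golf more than 3 times
    if (profile["golf_hndcp_current"] > profile["golf_hndcp_lastyear"]
            and profile["golf_complaints"] > 3):
        risk += 10
    # A gift card from a previous occasion still has an unused balance
    if profile["balance_giftcard"] > 0:
        risk += 5
    # Currently the favorite child: subtract risk points
    if profile["favorite_child_current_month"] == "me":
        risk -= 3
    return risk

dad = {
    "golf_hndcp_current": 18, "golf_hndcp_lastyear": 15,
    "golf_complaints": 4, "balance_giftcard": 25.00,
    "favorite_child_current_month": "me",
}
risk = score_gift_risk(dad)
```

In a real decision-management deployment these rule outputs would be combined with predictive model scores before an action is recommended; here the rules stand alone.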
- Similarly, I created new predictive models:
Before deploying the gift-giving decision management solution for use in the field by end users (like me, my wife and my children), I ran all the proper "what if" scenarios and used the new constraint-based optimization functionality in an attempt to maximize enjoyment and minimize the effort to carry and use the gift, subject to cost constraints. (To see the other new features in IBM Analytical Decision Management, read the data sheet.)
For example, a new Audi has a predictive acceptance of 100 percent (1.00) but falls outside the cost limits for the gift, while $5.00 tickets to Ballet in the Park (performed by an up-and-coming troupe of back-ups to the back-up dancers) fall within cost constraints but have a predictive acceptance of less than 2 percent (0.01667).
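That trade-off (highest predicted acceptance, subject to a cost cap) can be sketched in a few lines. The Audi and ballet figures come from the post; the third gift, its cost and its acceptance score are invented so the example has a feasible winner:

```python
def best_gift(candidates, budget):
    """Pick the affordable gift with the highest predicted acceptance,
    a toy version of constraint-based optimization: filter by the
    constraint first, then maximize the objective."""
    feasible = [g for g in candidates if g["cost"] <= budget]
    if not feasible:
        return None
    return max(feasible, key=lambda g: g["p_accept"])

gifts = [
    {"name": "New Audi", "cost": 45000.00, "p_accept": 1.00},
    {"name": "Ballet in the Park tickets", "cost": 5.00, "p_accept": 0.01667},
    {"name": "Olive Garden gift certificate", "cost": 50.00, "p_accept": 0.80},
]
choice = best_gift(gifts, budget=100.00)
```

With a $100 budget, the Audi is filtered out by the constraint and the gift certificate beats the ballet tickets on predicted acceptance, matching the recommended outcome below.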
By completing all of these steps, "IBM Decision Management for Gift Giving" (the next Signature Solution?) is ready to generate a recommended action in answer to my wife's question, "What should we get your dad for Father's Day?"
My recommended outcome >>> Gift certificate to the Olive Garden.
The next step is to put my updated application up into the cloud (read more about Analytical Decision Management SaaS) so my extended social network can run the SaaS version for batch gift recommendations.
And, in case you have any wild ideas, I have a patent pending on the personalized gum wallet made of duct tape.
There was an article in The New Yorker last week entitled, "Why Smart People Are Stupid."
Its premise stated, "When people face an uncertain situation, they don't carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren't a faster way of doing the math; they're a way of skipping the math altogether."
Given all the work organizations do to collect and align data, there really is no reason why foolish decisions should be made any longer, especially when there's a huge price tag associated with bad decisions.
And when you think about how many decisions an organization makes on a daily basis (thousands? millions?), being foolish is no longer an option, especially when you calculate the difference in cost between one foolish decision and a million foolish decisions.
And, most of these transactional or tactical decisions need to be made in an instant, such as a customer service agent deciding to give a customer a discount to combat churn; an insurance claims system determining whether a potentially fraudulent activity should be escalated for investigation; or, a logistics manager deciding if a truck is safe to put on the road for the next delivery.
To end this foolishness, IBM has introduced Analytical Decision Management to help organizations automate and optimize decision making in real time to ensure the best outcomes occur every time.
Essentially, Analytical Decision Management takes the complexity out of big data by quickly analyzing and embedding analytics directly into business systems (in a call center, on a website, on the manufacturing floor) to empower employees and systems on the front lines with the ideal action.
It also allows business users to run multiple "what if" simulations, compare the outcomes of different approaches and test the best business outcomes before the analytics are deployed into the operational system. Even analytics follow the old adage, "Measure twice, cut once."
IBM Analytical Decision Management
According to IDC, the Decision Management software market is expected to exceed $10 billion by 2014. To meet this growing demand, IBM Analytical Decision Management is the first in a series of IBM Smarter Analytics innovations that will change how organizations weave analytics into the fabric of their business, fueling all systems, decisions and actions to consistently deliver optimized outcomes while adapting to changing conditions.
The newly released Analytical Decision Management combines and integrates predictive analytics, business rules, scoring and, now, optimization techniques into an organization's systems to:
- Maximize every customer interaction to grow revenues and increase loyalty
- Detect and prevent threats and fraud in real time to reduce risk
- Proactively manage resources by predicting equipment failure, staffing downtime and service disruptions to contain cost
For example, Santam Insurance is using Analytical Decision Management to transform its claims processing by enhancing fraud detection capabilities and enabling faster payouts for legitimate claims. In fact, in the first four months of use, Santam saved $2.4 million on fraudulent claims. (Read the full case study.)
Santam can now automatically assess if there is any fraud risk associated with incoming claims and allow frontline claims representatives to distribute claims to the appropriate processing channel for immediate settlement or further investigation, which in turn, optimizes operational efficiency.
Because not all customers and claims are created equal, Analytical Decision Management adapts its recommended actions in real time to accommodate changing conditions as new data is collected and outcomes are recorded.
Analytical Decision Management is also equipped to automatically prepare, cleanse and transform data for the best possible analytics through the new Entity Analytics capabilities.
There can be challenges when diverse enterprise-wide data is integrated, especially when this data contains natural variability (e.g., Bob versus Robert), unintentional errors (e.g., a transposed month and day in a date of birth), and at times professionally fabricated lies (e.g., a fake identity).
The Entity Analytics feature allows data scientists to overcome some of the toughest data preparation challenges and create the most complete view of an individual record. Users can generate higher-quality analytic models and, as a result, organizations will enjoy better business outcomes whether the goal is detecting and preempting risk or better responding to a customer's needs.
Guest post from Haytham Yassine, Software Engineer, IBM Social Media Analytics
I'm back with the redesign of the call center complaint process. Click here for part 1.
Before I share, here are some key areas such a process should focus on regardless of implementation:
- Customer: Today's customers find value in sharing experiences and advice among themselves via social media. Companies should accommodate our preference for these channels and come to us, as opposed to us going to them.
- Customer value: Customer value and loyalty are attained by resolving requests in a short encounter with high quality and minimal effort.
- Inputs and outputs: Inputs to the process should simply be the complaint or question, a relevant profile summary of the customer and any CRM data to assist the agent in providing assistance. The output should be quality service along with reference points for future engagements.
- Performance measures: Key measures are customer effort, customer satisfaction, quality of engagement, number and ratio of successful engagements, capacity of the system, channel flexibility and, obviously, cost.
You will see from the diagram below how most of the issues mentioned earlier can be resolved via a social media solution.
So what are the key improvements to take away from this redesign?
- Reduction of customer effort to a single activity
- Perception of shorter service encounters by pushing most aspects of the process into the pre-encounter phase
- Elimination of duplication by utilizing the customer's social media profile as input, as well as CRM data when available
- Educated (and empowered) agents providing more sophisticated responses by utilizing analytics and suggestions offline
- Proactive quality control integrated into the process workflow by incorporating a review activity
- A multiple-workstation approach, still employed, where customer requests are distributed across agents
Here�s an end-to-end scenario:
If I have a complaint or question about your product, I'd share my thoughts through a social media channel; let's say Twitter for simplicity's sake, but it could be via a blog (similar to the one I'm writing now), board, forum, etc.
Using a social media analytics solution, such as IBM Cognos Consumer Insight, a scheduled hourly query would pick up the post (and many others) and run it through its analytics dictionary and the XYZ-defined model.
Based on geography, demography and other user attributes, the analyzed post is pushed to the designated agent�s backlog.
The agent accesses the backlog from within a reliable social media management dashboard such as HootSuite. The workflow can define the priority in which complaints should be answered, be it the influencer score of the customer, time of request or a combination of both.
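One hedged sketch of how such a prioritization might work is below. The weights, the 0-100 scales and the field names are all arbitrary choices for illustration; a real workflow tool would expose these as configuration:

```python
def prioritize(backlog, weight_influence=0.7, weight_age=0.3):
    """Order a complaint backlog by a weighted blend of the customer's
    influencer score (0-100) and how long the request has been waiting
    (minutes, capped at 100 so old posts can't dominate forever)."""
    def priority(post):
        waited = min(post["minutes_waiting"], 100)
        return weight_influence * post["influencer_score"] + weight_age * waited
    return sorted(backlog, key=priority, reverse=True)

backlog = [
    {"id": "a", "influencer_score": 20, "minutes_waiting": 90},
    {"id": "b", "influencer_score": 80, "minutes_waiting": 10},
]
ordered = prioritize(backlog)
```

Tuning the two weights is exactly the choice the text describes: prioritize by influencer score, by time of request, or by a combination of both.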
The agent sees my post dissected to portray opinion, product mentions and other analytics:
The agent then assesses whether this post is worthy of a response. Maybe it should be addressed by the developer of the EFG application or better yet, maybe it has already been answered by other users in the same social network.
User specific analytics (preferences, prior engagements, etc.) would be brought up to assist the agent in providing the appropriate response. If my profile can be mapped to the company�s CRM, internal data would be loaded as well. The agent would then formulate the response, get it reviewed by their social media manager and then share it.
So how does this implementation fare compared to the current one?
I can't claim to have done a formal assessment, so I'll leave it to your company to implement a pilot project and test it out. However, I've already argued that quality, effort, capacity and flexibility are far superior in the proposed design.
Please ensure you measure successful engagements in absolute and relative terms across the two processes. A reliable social media analytics solution would measure the impact of your engagement efforts over time.
There are also numerous considerations to keep in mind prior to migrating to this process design, most notably your customers' demographics and their presence on social media.
I do realize that it won't be easy to get over your call center's sunk costs. Don't worry; I'm advising a gradual transition. Pilot this system in parallel.
The cost of a social media analytics solution is small change compared to the millions and millions you've already spent on that call center.
Please let me know if you have any feedback or comments. I would love your input.
Guest post from Kurt Peckman, Program Director, IBM Predictive Analytics
Last Friday I took a different train into my office here in Chicago.
This particular station has a diner located right next door, within steps of where I would be catching my train. It serves only breakfast and lunch, and it immediately hit me that I had stumbled upon a diner with an optimized location and manufacturing schedule.
Speaking of which, I had optimized my wait time for my train. No gross surplus of minutes to waste on the platform; no deficit of time causing a heart attack-inducing sprint from my car to the train. I immediately headed to the diner.
The waitress, whom I'd never met before today, immediately greeted me with, "Hi, honey… $1 egg sandwich today?"
I didn't fall for the "honey" play. I'm old enough to know that any waitress worth her salt will call me honey, sugar, handsome and the like in an attempt to up-sell me from coffee to coffee-plus. And given my experience in up-selling myself (discussed in my last blog), I was naturally on guard.
However, I was very, very intrigued by the price of the $1 egg sandwich.
I said, "No, thanks," which was tough to do. I love egg sandwiches, and one dollar is a heck of a deal for a diner-based product. (Notice the use of the word "deal" and not "price," which implies "value" to me.) I am trying really hard not to eat so many egg sandwiches, so I declined. But the critical fact in this story is that I paid $1.75 for a cup of coffee.
Secretly, what I really wanted to do was take the entire day off work to interview "Flo" the waitress (my customer service rep), the chef and other patrons about the implications of the $1 egg sandwich. I especially wanted to ask the owner (who I think was sitting in the corner reading a paper) how the execution of the egg sandwich ties into his overall business strategy.
How was that price determined? Is it an optimized price? Can a diner really make a profit on a $1 egg sandwich? If so, does it include the cost of all goods: materials, labor, overhead (e.g., utilities, wear and tear on the grill, depreciation on the spatula, etc.)?
Or was the pricing objective pull marketing for the diner? The deal didn't prompt me to go into the diner, and I'm not even sure there was a sign out front stating its terms. But there was signage inside that I noticed only after she pitched the deal. Now my mind was spinning.
Is Friday the best day for the egg sandwich promotion? Is this an optimized campaign: right offer, price, channel, day and time? I didn't even get a chance to ask whether every Friday is $1 egg sandwich day. If so, I might be inclined to tell my colleague Bob (who regularly commutes to/from this station) about the end-of-week deal at this diner.
Given my love of egg sandwiches, I might even be tempted to take to social media to sing the praises of this diner.
Other questions scrambled my mind: do they pre-make the $1 egg sandwiches? They must. There is no way the diner can meet the short-term, burst demands dictated by the average time one waits for a train.
And what is the optimized inventory of egg sandwiches that minimizes spoilage and maximizes freshness, demand, labor…? The $1 egg sandwich production quickly becomes an n-dimensional optimization problem.
And by "optimization" I mean the mathematical definition: maximizing (or minimizing) some outcome or value within a set of predetermined constraints. A classic example is an investment portfolio: we are all trying to maximize the value of our portfolio subject to the constraints of contributions, time, risk, market direction, etc. But I digress… back to the eggs.
Maybe the $1 egg sandwich starts at $2 earlier in the day and, by the time I arrived, the decision was made to drop the price due to surplus inventory. Wouldn't it be something to find out that a mom-and-pop diner was using sophisticated optimization algorithms to price egg sandwiches in a way that maximizes profit and minimizes spoilage?
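To make that concrete, here is a toy two-product version of the diner's problem. Every number is invented for illustration (profits per sandwich, eggs on hand, labor minutes); the point is only the shape of a constrained optimization, solved here by brute force over a small grid.

```python
# Toy constrained optimization with made-up numbers: maximize profit from
# two sandwich variants subject to limited eggs and labor minutes.
PROFIT = (0.50, 0.80)   # hypothetical profit per cheap / premium sandwich
EGGS = (1, 2)           # eggs used per sandwich
MINUTES = (2, 3)        # labor minutes per sandwich
EGGS_ON_HAND, MINUTES_AVAILABLE = 100, 180

# Enumerate every feasible production plan and keep the most profitable one.
best = max(
    ((x, y, PROFIT[0] * x + PROFIT[1] * y)
     for x in range(EGGS_ON_HAND + 1)
     for y in range(EGGS_ON_HAND // 2 + 1)
     if EGGS[0] * x + EGGS[1] * y <= EGGS_ON_HAND
     and MINUTES[0] * x + MINUTES[1] * y <= MINUTES_AVAILABLE),
    key=lambda plan: plan[2],
)
print(best)  # → (60, 20, 46.0): cheap count, premium count, profit
```

With real menus the grid explodes, which is why this quickly becomes a job for a proper linear-programming solver rather than enumeration.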
At this point three things become apparent:
1. Tying strategy to execution is as critical to the mom & pop diner as it is to Global 100 companies;
2. The best decision management solutions must include an optimization component; and,
3. I have an unhealthy obsession with egg sandwiches.
This semester I have a new challenge. In my Operations Management class (MBA5280B), I was tasked with an assignment to analyze a process and improve it.
I'm not a big fan of call centers and I firmly believe businesses should stop imposing their traditional models of service and start utilizing market-driven media of conversation for their business processes.
Just last month, I read a rhetorical blog post by Brian Solis from Altimeter Group. It really intrigued me, particularly because the Extreme Blue (IBM's internship program for students pursuing software development and MBA degrees) project we planned out for the summer heavily addresses this space.
Solis' blog helps highlight, in an exaggerated fashion, the frustrating traditional process of reporting a product or service complaint. I highly recommend reading this post, as it provides a great introduction to the process reinvention I'm putting forth.
What's more, the format I'm adopting in this blog takes the shape of a response to the original "Dear customer" tagline.
Although I would love to explore numerous processes that could be improved by using social media analytics, I will limit this article to the following: analyzing the call center process for reporting a product complaint and improving it by transposing it onto a "smarter" social media engagement workflow.
Here we go…
Dear customer relations manager at (fictional company) XYZ,
I am writing to express my dissatisfaction, not with your products and services, but with the process you employ for people like me to voice their concerns about these very products and services. You emailed me recently asking that I go through the standard call center process for reporting a complaint or asking a how-to, and here I am instead going through the "very standard" social media channels.
Why, you ask?
It's because I'm one of the millions of consumers out there who have grown fond of using social media to gather buying-decision information and to vent experiences and reviews in return. It's a vicious cycle, you know. Oh, and by the way, I hope you have a reliable social media analytics solution in place to pick up this blog; I won't be picking up the phone and calling your hotline.
Your process is not only inefficient and painful, but it's also missing countless opportunities in the social media space. I'll start off by analyzing the current process. Below is a simplified chart that highlights the various activities involved.
The process can be categorized as a service shop, where high customer involvement meets moderate-length service encounters and immediate delivery is expected at the end of the process.
I've highlighted in red the most problematic activities in the process chart and, as you can see, most of the complexity lies in the customer space. You'll notice I've excluded transfer activities from the chart to make things simple; in reality, these do exist and they add to the encounter's length and the customer's frustration.
Here is a list, by no means exhaustive, of the issues I see with the current process:
• Increased customer effort in various activities, complicating the interaction
• Customers have to actively wait in queues for service, extending the service encounter
• Duplication and inefficiency: the same inputs are requested and processed at multiple phases
• Multiple transfers may be required
• Potential peak-capacity concerns
• Underutilization during low periods
• Agents pressured to provide quality output on the spot, based on unknown inputs
• Lack of proactive quality control on process output
• Lost opportunities when customers hang up
Of course, I'm not here just to complain, as one would with a traditional call center. I do have a solution.
Please stay tuned for part 2 of my article when I describe the suggested process redesign.
Guest post from Kurt Peckman, Program Director, IBM Predictive Analytics
About a month ago I moved.
I closed after lunch on a Friday afternoon. The only reason that is relevant to this story is the timing: my cable provider called me the next day, Saturday morning around 9 a.m.
I knew it was my provider, thanks to caller ID. Granted, I'm not that old, but not too long ago you had to actually answer the phone to know who it was. In fact, I now have a phone that will announce out loud who is calling me. Ah, technology.
Being a wisenheimer, I answered the phone not with a "hello," but with, "I bet you are calling about the sale of this house."
Without missing a beat, the customer representative answered, "Yes I am, and I'd like to get you the best possible package for your new house." Note the use of the word "best."
Thus began my willingness to be retained. And at the time, I wondered to what extent predictive analytics were being used to "retain" me during the conversation.
Because "best" was enough to get my attention, I let him ask me the location of my new house. He was quick to pull it up and confirm that the deal he had in mind could actually be pitched.
"Yep, looking at your location, I can get you set up with the following package at [about half of what I was paying before!]."
Here is the critical fact in this story: the "package" he pitched included internet connectivity speeds at 2X-3X what I had before the move AND a television package that was two upgrades above what I was leaving. All for half the price I was paying before the move. Too good to be true?
Efficient retention. Impressive.
As someone who has held sales positions, works in predictive analytics, and has a technical background, I could really appreciate the efficiency of this win-win transaction. My provider retained me as a customer on a Saturday morning with a single 10-minute phone call AND my new house will have quadruple the package of the previous house for half the price.
Hold on. It gets more impressive from the telco's standpoint.
Then I had a revelation. After only two weeks in the new house enjoying my new services (key word "my," read "personalized"), I figured out that if I paid more than I am paying now, but not much more than I used to pay at the old house, I could have the top-of-the-line package: super-duper connectivity, high definition, DVR, and on and on.
That is to say, I just up-sold myself as a result of a 10-minute phone call on Saturday morning four weeks prior!
Needless to say, my telco provider must be leveraging elements of a robust Decision Management solution. In particular, I'm sure they used my high predictive score for up-sell, coupled with the business rules that governed the initial offer, such as…
…IF (provider_jump = false) and,
…IF (previous_package = XYZ) and,
…IF (number_complaints < 2) and…
…to produce an outcome that demonstrates the importance of predictive analytics and rules in guiding optimized, automated decisions.
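The combination might look something like the sketch below. The field names echo the pseudo-rules above, but the thresholds, offer names, and the idea that a model supplies an `upsell_score` are all my own invention, not the telco's actual logic.

```python
# Hypothetical rules-plus-score sketch. Business rules gate eligibility;
# a predictive score (here a made-up up-sell propensity) picks the offer.
def recommend_offer(customer: dict) -> str:
    if customer["provider_jump"]:              # customer is leaving anyway
        return "no_offer"
    if customer["previous_package"] != "XYZ":  # rule from a different segment
        return "standard_renewal"
    if customer["number_complaints"] >= 2:     # too risky to automate
        return "route_to_retention_specialist"
    # Predictive component: propensity score from a scoring model
    if customer["upsell_score"] > 0.8:
        return "pitch_premium_upgrade"
    return "pitch_standard_upgrade"

me = {"provider_jump": False, "previous_package": "XYZ",
      "number_complaints": 0, "upsell_score": 0.93}
print(recommend_offer(me))  # → pitch_premium_upgrade
```

The rules alone would never have produced that Saturday-morning call; it is the score that decides the pitch is worth a rep's ten minutes.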
Said another way, my telco provider not only retained me, but got more monthly subscription revenue out of me in a very efficient manner.
And this is just one personal example from telco. Think of how predictive analytics and rules can be (and are!) used in tandem to optimize and automate recommendations in retail (e.g., customer analytics), manufacturing (e.g., preventative maintenance), insurance (e.g., claims processing), and beyond.
Speaking of optimization, stay tuned for Part III of my Decision Management series.
And, if you missed my "Ode to Rules" in Part I, you can read it here.
Guest post from Kurt Peckman, Program Director, IBM Predictive Analytics
Rules are meant to be broken.
No one likes restrictions, being controlled, or being told what to do. In reality, however, rules are broken so that better, stronger, more appropriate rules can be created.
In other words, an established rule is often the starting point (or some critical point) of a rule's "evolution." Good rules evolve so that better actions and decisions can be made.
For instance, some rules are about governance. Traffic rules govern some of the largest, most complex systems in the world. Motorists are surprisingly (mostly) cooperative thanks to these "rules of the road."
What's even more interesting is the contrast between the apparently global rules of the road (those that apply everywhere) and the "local" rules found in parts of the world, shaped by geography, culture and necessity. For example, making a right-hand turn in the United States is very different from Australia's "hook turn."
Rules are also about policy. For example: never go in your mom's purse; never call someone after 9:00 p.m. or before 9:00 a.m. (Yes, I'm showing my age. I realize that nowadays we text each other 24/7.)
Speaking of texting, ALL CAPS, as a rule, means you are screaming at someone. Oh, and never, ever text an image that will come back to haunt you later. Don't be a Weiner. When you are on the golf course, there's a rule that you shouldn't talk about business before the 3rd or 4th hole, and you should try to finish up by the 15th or 16th.
I once had a psychology major tell me the vast majority of interpersonal behavior can be explained by two rules: birds of a feather flock together and opposites attract.
Think about the rules that apply to reviewing and selecting candidates for a job opening from hundreds of applicants, quickly building a large world-wide team for a last-minute project, or even during a round of speed dating.
These show the fine line between governance and policy and demonstrate how "rules" become important in guiding decisions. Specifically, they become a necessary component of a Decision Management solution, especially when the volume of decisions increases and the time to make them dramatically decreases.
Decision Management allows users to automatically deliver high-volume, optimized decisions at the point of impact, such as in a call center, on a website, in a store, etc.
Overall, rules help link day-to-day execution to organizational objectives. Consider sports. Every rule book for every sport has a catch-all rule that enables an official to make a "judgment call."
In basketball, a referee has discretion when determining whether someone is being malicious on a foul. There are criteria (e.g., a set of rules) to determine if a foul is flagrant: Was the player really going for the ball? Did the foul seem to have the right balance of aggression and sportsmanship? Was the foul committed during a breakaway?
Finally, let�s discuss gaming. In casinos, each game has its own set of rules. I like this as an example because there are global rules about gambling (the house always has the edge) and local rules (in the US you have to be 21 years old to gamble). The local rule in my house is that I always win.
Consider all the systems I mentioned � traffic, sports, gaming � and consider the complexity of these systems, then think about how a good set of evolving rules helps establish structure, policy and governance.
But, rules can be inflexible and limiting to good decision making. Decision Management solutions must have rules, but they also can�t rely entirely on these rules. After all, a good process might be bad if it speeds up bad decisions or outcomes.
Rules must be balanced with business analytics for optimal decisions. I'll cover that in part 2 of this discussion.
By the way, what is your favorite rule that you like to break?
Guest post from Erick Brethenoux, Director, IBM Business Analytics & Decision Management Strategy
As the saying goes, "Rugby is a rough sport for gentlemen; football (or soccer) is a gentle sport for ruffians."
When I played rugby in my younger days in France, I suffered a number of injuries, from a dislocated shoulder to being knocked out to various gashes requiring stitches on my chin and head.
It's no surprise that a study shows 1 in 4 rugby players will be injured during a season, since the objective of the game is to take a hit for your teammates and keep the ball moving down the field.
In order to find new ways to keep top players healthy, the Leicester Tigers, nine-time champion of the English rugby union�s Premiership and two-time European champion, are using IBM predictive analytics to help the team better understand and reduce players' injury rates and minimize risk.
After all, losing a key player for an extended period of time can not only hurt the team on the field, it can also result in reduced ticket sales and spectator attendance if the team does not perform up to expectations.
Leicester is looking at important indicators such as fatigue, threshold and game intensity levels in order to detect hidden patterns or anomalies. A better understanding of this information will allow coaches and trainers to prevent injuries by investing in adequate training programs tailored to each player's physical and psychological state.
For example, if a player has a statistically significant change in one or more of his fatigue parameters and the current intensity of training is likely to be high, the data may show that the likelihood of this player becoming injured is 80 percent greater. This type of real-time information will make it possible for the team to alter the player�s training to reduce the injury risks.
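That kind of rule can be sketched in a few lines. This is a minimal illustration of the logic just described, not Leicester's actual model: the z-score threshold and the 1.8 multiplier (the "80 percent greater" figure) are assumptions chosen to match the example.

```python
# Hypothetical injury-risk rule: scale a player's baseline injury
# likelihood when a fatigue metric shifts significantly AND the current
# training intensity is high. Thresholds are illustrative assumptions.
def injury_risk_multiplier(fatigue_z_score: float, training_intensity: str) -> float:
    if abs(fatigue_z_score) > 2.0 and training_intensity == "high":
        return 1.8   # "80 percent greater" likelihood of injury
    return 1.0

# A statistically significant fatigue shift during high-intensity training
print(injury_risk_multiplier(2.5, "high"))   # → 1.8
# The same shift during light training does not trigger the rule
print(injury_risk_multiplier(2.5, "light"))  # → 1.0
```

In practice such a rule would be one output of a statistical model over many fatigue parameters, feeding a real-time alert that tells trainers to dial back a specific player's load.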
Predictive analytics also allows Leicester to analyze psychological player data to reveal other key factors which may affect performance, such as stress around away games and social or environmental elements that could significantly change the way players perform during a match.
In the manufacturing industry, plant managers, maintenance engineers and quality control champions all want to know how to sustain quality standards while avoiding expensive unscheduled downtime or equipment failure, and how to control the costs of labor and inventory for maintenance, repair and overhaul operations.
Through the use of IBM predictive analytics, they can now gather information in real time from a variety of sources, including maintenance logs, performance logs, monitoring data, inspection reports, environmental data and even financial data to determine the areas of greatest risk.
For example, an IBM customer who manufactures helicopters is able to identify and predict equipment maintenance, ultimately increasing customer satisfaction by keeping the helicopters in the air instead of grounded for repairs.
It's the same way that Leicester is investing in business analytics to uncover the key predictors in the data "scrum" to deliver personalized training programs for players at risk and improve performance.
Guest post from Kurt Peckman, Program Director, IBM Predictive Analytics
Spoiler alert: If you have never seen "2001: A Space Odyssey," forgive me for spoiling the plot, but you've had 44 years to see the movie.
In 2001, a space crew was voyaging to Jupiter along with HAL 9000, the spaceship's computer that was foolproof and incapable of making poor decisions. During the journey, HAL began to malfunction, slowly went mad and, refusing to cooperate, turned on his crewmen and methodically "eliminated" them one by one.
Ultimately, HAL had to be powered down, against his own will, to keep him from making any further decisions on his own.
IBM has learned a lot from HAL in the last 44 years. For instance, one can't leave all the decisions to the HAL 9000 (or any other operational system, business process, platform or person), but one can leverage their combined strengths to enable sound decision making.
And what are those strengths?
Taking information from everywhere (and I do mean everywhere: transactional data, social media, call center notes, video, sensors, etc.) with the end goal of providing recommendations for action, such as identifying claims fraud, reducing churn or reducing costs via preventative maintenance.
More specifically, organizations can now employ these systems in the Cloud, using all of their proprietary "local" data along with cloud-resident data to tie strategy to execution by means of decision management systems.
By combining predictive models, rules, scoring and optimization techniques to generate recommended actions, decision management systems allow users to automatically deliver high-volume, optimized decisions at the point of impact, such as in a call center, on a website, in a store, etc.
As an additional option for customers, IBM recently launched IBM SPSS Decision Management Software as a Service, one such system that helps organizations make these decisions in the cloud without the administrative overhead and expense of on-site software.
Decision management is an ideal solution for organizations in a range of industries, especially those with high volumes of interactions � such as in retail, banking and financial services, and insurance, as well as government agencies and academic organizations.
For example, some decisions and recommendations will be heavily dependent on rules (e.g., "do not make offer A to customer B…"), while others will be based on predictive analytics (e.g., "…unless the propensity to churn is greater than 90 percent..."). Some decisions and recommendations will be based on internal data (e.g., past purchase patterns and RFM analysis), and others on external sources (e.g., credit scores and Twitter feeds).
The main point, however, is that all are tied to generating a specific outcome, whether a tactical or strategic decision. And, even before deploying these recommendations into an operational system, multiple simulations and "what if" scenarios can be run to compare the best outcomes.
Let�s get the recommendation right first so the same bad decisions aren�t made over and over again.
If HAL taught us anything, it's that the outcome is king. It's time to start deploying analytics into operational systems before customers start being methodically eliminated… one by one.
As the old adage goes, "Fool me once, shame on you; fool me twice, shame on me."
If you�ve never heard this expression before, it means that the first time a negative action occurs, the accountability is on the one that did it. When it happens a second, third, or fourth time, the accountability is on the individual (or group) that allows it to keep happening.
For instance, in Tennessee in 2009, Amanda Sue Kelley, 19, was arrested seven times on charges that ranged from drug possession to domestic assault and theft. In her last offense, police say, she wrenched open the door of a parked car, pointed a gun at a woman changing her 13-month-old daughter's diaper in the back seat, and demanded cash. (Source: The Tennessean)
Such is the case with the existing criminal rehabilitation system. Those being fooled are the taxpayers who allow their money to be spent on an inefficient and ineffective system. A bold statement, maybe, but let me explain…
Repeat criminal offenders cost the system more money than one-time criminals, and it was reported that more than 40 percent of offenders in the U.S. return to state prison within three years of their release. In the UK, one study of 14 prisons, most of which hold short-term inmates, found reconviction rates of more than 70 percent, and according to another source, these criminals are committing up to 2,000 murders, rapes and other serious offences every year, just months after completing a punishment for a previous crime.
The term for this is recidivism, which refers to the rate at which criminals violate their parole or are arrested for new crimes. In fact, the U.S. Justice Department estimates that a 10 percent decrease in recidivism can generate a collective savings of $635 million. More importantly, one less repeat criminal on the street is, at minimum, one less crime committed in our neighborhoods.
What if there were a way to anticipate which individuals are likely to become repeat offenders after they commit their first crime? Or what if high-risk individuals could receive the appropriate attention, or be placed in the rehabilitation program best suited to change their unlawful path?
Government agencies worldwide that are responsible for public safety are already using Business Analytics software to analyze data on criminals, providing insight into complex relationships and patterns such as past offense history, home life environment and gang affiliations, to better understand and predict which inmates have a higher likelihood of reoffending.
For example, the U.K. Ministry of Justice needed a way to analyze vast amounts of crime and offender data to understand which proactive measures would be likely to prevent recidivism. The ministry turned to IBM SPSS predictive analytics to analyze millions of prisoner files. The analysis is helping them develop treatment targets for prisoners throughout their sentences to reduce the probability that they will commit crimes upon release.
The Ministry of Justice now has more accurate crime prediction rates with violent crime recidivism prediction improving from 68 to 74 percent, and general offenses recidivism prediction improving from 76 to 80 percent.
Yes, it is a complex situation. Yes, politics comes into play. And no, this is nothing like the "precogs" in the fictional Tom Cruise movie, Minority Report.
However, predictive analytics is proving to be highly effective by helping organizations like the Ministry of Justice take measures to ensure that inmates receive the best services tailored to meet their rehabilitative needs, while being proactive to stop future crime and better protect citizens.
Another old adage says, "One definition of insanity is doing the same thing and expecting different results." Let's try a smarter solution to achieve better results.