## Issues Are Not Where One Thinks They Are

Where are the issues when one tries to use optimization to improve business? They may not lie where one thinks. My former colleague Laurent Perron (now at Google) split the average time spent on optimization projects as follows in his CP 2011 invited talk:
One could argue about the exact split, but the broad picture is true as far as I can tell from my experience. I would summarize it as follows:
- Finding how and where to apply optimization is a complex task.
- The mathematical optimization part is a tiny piece of the overall solution.
The first item is complex because it requires a mix of business skills, technical skills, and social skills. You need to be able to map business problems to optimization problems amenable to solution using the tools you know. And you must be able to explain this to people who aren't interested in how you solve the problem. This is where skilled practitioners make a difference. It would be worth discussing this topic further, but I will focus on the second item for today's post.
My take on the second item is that if you only use problem-solving tools, then you lose time on the rest. In order to be efficient when developing an end-to-end solution, you also need tools that deal with data, and tools that help business people look at results, understand them, and act upon them. Let's look at what such tools could be.

### Getting clean data
Before getting clean data you must get data. This may sound obvious, but it is often underestimated!
Getting data may be very complex when it does not exist before your project. In such a case you need to define how to store it and how to collect it. How to do this will depend heavily on your project, on the legacy IT environment you work in, and on the kind of data you need. The good news here is that, with the big data hype, there are plenty of tools and companies ready to help collect data of all kinds.

Once you have raw data, the interesting part begins (well, interesting to me at least): you need to look at it in order to evaluate its quality, and improve that quality if it isn't good. Data needs to be of good quality because optimization is a garbage in, garbage out technology: if you have wrong data, then you get useless answers. For instance, if you have an inexact description of manufacturing orders, then you get useless production schedules. If you have inexact inventory levels, then you get useless replenishment schedules. If you have duplicate entries in airline crew lists, then you assign the same person to two different planes at the same time. And so on.

So, the next thing to do when you get data is to evaluate its quality by looking at it. The good news here is that we can use tools developed for predictive analytics. I will refer the reader to an excellent presentation by Irv Lustig for an overview of pivot tables on steroids.
Last, when you have data, and when you know what quality issues it might have, you can clean it. Our preferred way to do this is to use business rules. Business rules are a way to automate policies using human-readable IF-THEN statements. For instance, you can use rules to flag duplicates, or to check integrity constraints. Let me give an example for the sake of clarity. I was working on a crew assignment problem for a large airline. They had complex labor rules, especially on the need to have rest periods interleaved with actual work. It turned out that a significant fraction of employees had preassigned tasks (mostly training) that violated these constraints. It was decided to write rules that would flag these employees and remove them from the pool of employees we were dealing with.
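The airline example above can be sketched in a few lines. This is a hypothetical illustration of the business-rule idea, not the airline's actual system: the employee names, the task format, and the 10-hour rest threshold are all invented for the example.

```python
# Hypothetical business rule: IF the gap between two consecutive preassigned
# tasks is below the minimum rest period THEN flag the employee and remove
# them from the assignment pool. The 10-hour threshold is an assumption.

MIN_REST_HOURS = 10  # assumed policy, not a real airline's figure

def violates_rest_rule(tasks):
    """Tasks are (start_hour, end_hour) tuples on a common clock."""
    tasks = sorted(tasks)
    return any(nxt_start - prev_end < MIN_REST_HOURS
               for (_, prev_end), (nxt_start, _) in zip(tasks, tasks[1:]))

def filter_pool(employees):
    """Split employees into a usable pool and a flagged list."""
    pool = {name: tasks for name, tasks in employees.items()
            if not violates_rest_rule(tasks)}
    flagged = sorted(set(employees) - set(pool))
    return pool, flagged

employees = {
    "alice": [(0, 8), (20, 28)],   # 12 hours of rest: kept in the pool
    "bob":   [(0, 8), (12, 20)],   # only 4 hours of rest: flagged
}
pool, flagged = filter_pool(employees)
print(flagged)  # ['bob']
```

The point of keeping the rule as an explicit IF-THEN predicate, rather than burying it in data-loading code, is exactly what the next paragraph discusses: it can be shown to, and validated by, business people.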
Business rules aren't the only way to achieve this, but they have one advantage: they can be read by business people. This helps with the second time-consuming task, which is conveying results to the business. You can explain the assumptions you made about the data by showing the actual business rules you used.
### Solving the problem

Even if Laurent labels this as the easy part, there are a number of pitfalls that need to be avoided; see for instance What Is The Solution When There Is No Solution, or Making Dreams Come True (Or Not). The tools of choice here are modeling environments and good optimization solvers, for instance our own ILOG CPLEX Optimization Studio. You could also use tools from other vendors, but I'd be killed by my marketing colleagues if I cited them!

### Reporting the results / Explaining the implications
The last step is about translating the mathematical solution into something actionable by the business. It is as important as all the rest, if not more so. Indeed, what's the point of producing very good mathematical solutions if they aren't acted upon?
The issue here is that the mathematical problem solved in the previous step is always some abstraction of the real business problem. It is expressed in different terms (mathematics vs. business), and it leaves out parts of the real problem. This means we have a twofold problem to solve: translate mathematical terms into business terms, and deal with the parts that weren't taken into account.
The most effective way to convey results is to use nice graphics. This is well documented by Robert Randall in his blog post. You can also read Nice
There is more to graphics, however. What is really valuable is interactive graphics, where a business user can act on the proposed mathematical solution. A business user may want to edit the solution in order to take into account some reality that didn't make it into the mathematical model. For instance, swapping two people in a train crew schedule because of personal incompatibilities (true story). Or moving a scheduled task to another machine because operations reported a possible maintenance check. The system must then restore the feasibility of the solution if it was broken by the edit. This can be achieved by solving a much simpler problem that minimizes change compared to the solution shown to the user. We call this the solve anyway feature.
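The solve-anyway idea can be illustrated with a toy machine-scheduling repair. A real system would re-solve a minimize-change optimization model; here a greedy repair of capacity overloads stands in for it. The task names, machines, and the capacity of 2 tasks per machine are invented assumptions for the sake of the sketch.

```python
# Toy "solve anyway" sketch: after a user manually edits a solution, restore
# feasibility while changing as little as possible relative to the edit.

CAPACITY = 2  # assumed: each machine can run at most 2 tasks

def repair(assignment, machines):
    """Move as few tasks as possible so no machine exceeds CAPACITY.
    `assignment` maps task -> machine; returns a repaired copy."""
    fixed = dict(assignment)
    load = {m: 0 for m in machines}
    for m in fixed.values():
        load[m] += 1
    for task in sorted(fixed):
        machine = fixed[task]
        if load[machine] > CAPACITY:
            target = min(machines, key=load.get)  # least-loaded machine
            if load[target] < CAPACITY:
                load[machine] -= 1
                load[target] += 1
                fixed[task] = target  # minimal deviation: one move
    return fixed

# The user manually moved task t3 onto m1, overloading it (3 tasks).
edited = {"t1": "m1", "t2": "m1", "t3": "m1", "t4": "m2"}
repaired = repair(edited, ["m1", "m2", "m3"])
moves = sum(repaired[t] != edited[t] for t in edited)
print(repaired, f"({moves} task moved)")
```

In this toy run a single task is relocated, which is the flavor of "minimize change compared to the solution shown to the user": the user's edit is preserved, and only what is strictly necessary is touched.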
A related, and even more important, feature is the ability to produce several solutions with varying assumptions. It can be used for marginal analysis, answering questions such as: what would the service level become if I used one extra person? What if I could use an extra plane to serve my customers? What-if analysis, and the ability to graphically compare variants of the same problem and their solutions (scenarios), is extremely well received by business users. It enables them to fully grasp the solution space and make the right trade-offs for their business.
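A minimal sketch of what-if analysis: re-run the same model under varying assumptions and tabulate the results for side-by-side comparison. The staffing numbers below are a made-up stand-in for a real optimization run.

```python
# What-if sweep: evaluate the same (toy) staffing model for several scenarios.
# DEMAND and JOBS_PER_PERSON are invented figures for illustration only.

DEMAND = 120           # assumed daily workload, in jobs
JOBS_PER_PERSON = 25   # assumed throughput per person per day

def service_level(staff):
    """Fraction of demand served in this scenario (toy model)."""
    return min(1.0, staff * JOBS_PER_PERSON / DEMAND)

scenarios = {f"{n} staff": service_level(n) for n in (4, 5, 6)}
for name, level in scenarios.items():
    print(f"{name}: {level:.0%}")
```

Even in this trivial form, the output answers the marginal question directly: going from 4 to 5 people lifts the toy service level to 100%, while a sixth person adds nothing, which is exactly the kind of trade-off a business user wants to see at a glance.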
Another use of scenario comparison is when there are multiple objectives. For instance, an investor has two competing objectives: maximizing return while minimizing risk. Another example is replenishment planning for retail, where one wants to minimize the risk of sold-out items while minimizing stock levels. In each case, improving the value of one objective degrades the other. One way to handle this would be to combine the two objectives into a single, composite objective; solving the resulting mathematical problem yields one solution that optimizes the combined objective. A much better approach is to let the business user explore the solution space. For instance, the portfolio manager sets a maximum risk level and optimizes the return. This gives an optimal asset allocation for that risk level. Repeating this for various risk levels provides a set of asset allocations with different (return, risk) pairs. The business user (the investor in this case) can then select the one that looks best to her. Formally, this amounts to exploring the set of Pareto-optimal solutions.
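The exploration just described can be sketched as follows: cap the risk, maximize return under that cap, and repeat for several caps to trace out (return, risk) pairs. The candidate allocations and all the numbers are invented; a real tool would solve an optimization model at each step instead of scanning a short list.

```python
# Sketch of risk-capped return maximization, repeated over several risk caps.
# Candidate (annual return, risk) pairs are hypothetical allocations.

candidates = [
    (0.04, 0.05), (0.06, 0.10), (0.07, 0.18), (0.09, 0.20), (0.08, 0.30),
]

def best_for_risk_cap(cap):
    """Maximize return subject to risk <= cap (brute force over candidates)."""
    feasible = [c for c in candidates if c[1] <= cap]
    return max(feasible, default=None, key=lambda c: c[0])

frontier = []
for cap in (0.05, 0.10, 0.20, 0.30):
    best = best_for_risk_cap(cap)
    if best is not None and best not in frontier:
        frontier.append(best)
print(frontier)  # each pair found this way is Pareto-optimal
```

Note that a coarse grid of caps can skip some Pareto-optimal points (here, the (0.07, 0.18) allocation is never selected), so in practice the grid of risk levels is something the business user refines interactively.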
### Reality check
Is the above wishful thinking or not? Fortunately, it is becoming reality. Several software vendors are moving from problem-solving tools to larger-scope software platforms. For instance, we recognized several years ago the need to be able to explain results to business users effectively. We have tried to implement the required features in a tool called ILOG Optimization Decision Manager Enterprise (ODME for short), which complements CPLEX Optimization Studio with graphical business views, scenario analysis, solve anyway, and a data server. This tool is certainly perfectible, and we keep improving it. But it is effective in shortening the time spent once we know how to solve a given problem.
Let me give one example. The leading Spanish electricity grid company, Red Electrica de Espana, plans its electricity production with it. They claim to save between 50,000 and 100,000 euros a day compared to their previous way of planning production. How long did the IT project that led to this significant return last? Well, the full project lasted no longer than 6 weeks, from the start of development to the deployment of the application. This very short time to market was made possible by ODME, which provided out-of-the-box business interfaces and integration hooks. Such a short time to market would not have been possible using tools that only deal with the problem-solving part.