WebSphere Business Modeler Advanced Simulation

IBM WebSphere Business Modeler® (hereafter called Modeler) is a powerful tool for creating business process models. It lets you simulate models to understand the dynamic behavior of the business process. This article gives an overview of advanced simulation features. Basic knowledge of running simulations is required. This content is part of the IBM WebSphere Developer Technical Journal.

Marc Fasbinder, Consulting I/T Specialist, WebSphere Software Technical Sales, IBM

Marc Fasbinder is an I/T Specialist at IBM with the WebSphere Technical Sales team in Southfield, Michigan.



22 August 2007

Introduction

One of the most powerful features in WebSphere Business Modeler is the ability to simulate business processes. Creating business models lets you capture business metrics, such as the times required to perform given tasks, the resources involved, and the costs. The simulation uses these metrics to show the dynamic behavior of the business process. For example, you might want to identify where the costs are in a process, and where there are bottlenecks or resource shortages. Simulations let you discover these sorts of dynamic behaviors.


Simulation basics

In Modeler, you can create a process model with tasks, decisions, and other process elements. After laying out the elements of the process, you can add documentation, as well as business metrics. For example, you can indicate which resources a given task uses, and how much of their time it requires. You can document costs for the task along with costs for the resources. You can then perform static analysis, to understand the “non-moving” parts of the process.

To understand the dynamic behavior of the process, you need to imagine what would happen if multiple instances of the process were actually running, using the resources for an organization (both people and equipment). If ten requests came in at once, but there were only three people available to perform the work, the other seven requests would have to wait for a resource to become available. You can only understand these dynamic aspects of the process if you look at the work flowing through it. For a process with more than a handful of tasks, the mathematics would become quite complex. A tool is clearly needed to assist in this area.
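To see why a tool helps, the ten-requests/three-people scenario above can be sketched in a few lines of Python. This is a minimal illustration, not Modeler's engine; the 30-minute task duration is an assumed value:

```python
import heapq

def simulate(num_requests, num_workers, task_minutes):
    """Return how long each request waits when all requests arrive at once."""
    free_at = [0.0] * num_workers  # when each worker next becomes free
    heapq.heapify(free_at)
    waits = []
    for _ in range(num_requests):
        start = heapq.heappop(free_at)      # earliest-available worker
        waits.append(start)                 # the request waited this long
        heapq.heappush(free_at, start + task_minutes)
    return waits

waits = simulate(num_requests=10, num_workers=3, task_minutes=30)
print(waits)  # the first three start at once; the rest wait 30, 60, then 90 minutes
```

Even this tiny case is tedious to work out by hand; with branching flows, varying durations, and shared resources, a simulation engine becomes essential.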

As a process uses resources and performs tasks, there are associated costs for the resources such as their salaries and material costs. Some of the tasks might also generate revenue. At a business level it is important to understand these costs to define the payback for a process. You need to understand the time spent on a process, including the time the resources spend on their tasks, along with the times associated with waiting for work. All of this data needs to be collected for the business to understand how the process works. The process has to be understood before it can be improved upon.

To achieve these goals, Modeler includes a powerful simulation engine. To simulate a process, you only need to define a few things:

  • Which resources are associated with tasks
  • The time they need to spend doing their work
  • The estimated percentage of instances that flow through each decision branch

However, if you want to perform more detailed reporting, adding attributes such as costs and details about the organizations lets you exploit more of the predefined reports.

The simulation engine shows you how the process would actually run. Instead of taking days to complete, the simulated process can run in mere seconds. This means you can quickly view the results of a process that in real time would take much longer, for example weeks or even a month. You can also make changes to rapidly perform “what if” analysis on the process.


Importing the demo samples

I’ve included a demo with this article that contains several samples. You can download these and import them into Modeler Advanced to follow along with the article.

The Modeler Archive (MAR) file simulation-demo.mar contains the processes for the demo. To import it into your workstation:

  1. Right-click the Project Tree, and select Import…
  2. The WebSphere Business Modeler project (.mar, .zip) is selected by default. Click Next.
  3. For Source Directory, click Browse… to navigate to the directory where the .mar file is located.
  4. Select SimulationDemo-final.mar.
  5. For Target Project, click New.
  6. Enter a project name, then click Finish.
  7. Make sure Include Simulation Snapshots is checked, then click Finish.
  8. When the Overwrite dialog box appears, click Yes to all, as Figure 1 shows:
    Figure 1. Overwrite dialog

Important: Windows limits the combined path and file name to 260 characters. Because this demo uses folders within folders, and Modeler creates subfolders for the metadata being stored, you might get a message that Modeler cannot import the .mar because a file name is too long. If this happens, use a workspace that is closer to the root, a shorter name for your project, or both.


Process case studies

There are three sets of process case studies included in the sample. The first set uses small process fragments to illustrate several different process improvement patterns. You will see how to identify the problem in the process, how to correct it, and how to analyze the improvements in the process. I provide both As-Is (current state) processes with the flaws, along with To-Be (future state) processes with the improvements in place.

The second set of processes contains studies of several advanced features for understanding how simulations work. Often, it is easier to learn from a working example than to build one from scratch.

The final set of processes contains full-blown examples. These processes show a full end-to-end scenario, as opposed to the snippets in the first set of cases. These are intended to be more realistic examples of processes for a business.

Using these sample processes together, you will learn how to run a full simulation demo and understand many of the most important simulation features and functions of Modeler.

For each of the case studies, I’ve already run the simulations. You can use the existing simulation results for dynamic analysis, or you can run the simulations again yourself. Each simulation profile is already set up to run multiple instances of the process; in most cases, the setting is 50 tokens in total. Using a larger number of tokens results in a more accurate simulation. If you ran two instances of a simulation with a decision set to 50/50 for the Yes and No paths, both tokens could flow to the same output. With a larger number of tokens, the totals are far more likely to come close to the specified 50% for each output. Simulations with larger numbers of instances reduce the chance of random anomalies such as this skewing the results. I recommend turning off animation so that the simulations run faster.
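The effect of token count on accuracy is easy to verify outside of Modeler with a quick Monte Carlo sketch (plain Python, using an assumed 50/50 decision as in the example above):

```python
import random

def yes_fraction(tokens, p_yes=0.5, seed=42):
    """Send `tokens` through a 50/50 decision and report the observed Yes fraction."""
    rng = random.Random(seed)
    yes = sum(1 for _ in range(tokens) if rng.random() < p_yes)
    return yes / tokens

for tokens in (2, 50, 5000):
    print(tokens, yes_fraction(tokens))
# With 2 tokens, both can easily land on the same path; as the token
# count grows, the observed split converges toward the specified 50%.
```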

The resource pool is also set up in each simulation profile. If you create a new simulation profile for one of the examples, you will need to set this up as well.


Process improvement patterns

This section covers a set of process snippets you can use to understand several common process flaws in an As-Is process, along with ways to do things better in a To-Be model. For each pattern, I show the appropriate reports to run from dynamic analysis. The patterns are numbered to match the examples in the .mar file. You will see how to use simulation to identify problems in the As-Is processes, and how to demonstrate the improvements by comparing them to the To-Be versions using dynamic analysis.

1.1 High cost path

A process may contain multiple possible paths of execution. Sometimes, one path is far more expensive than other paths. If the percentage of cases that flow through the high-cost path can be reduced, it can result in significant cost savings.

Figure 2. As-Is process with high cost path

In the As-Is process in Figure 2, the Log Request task is automatic and places the request into a database. If validation is needed, the process flows to a manual Validate Request task. The paths then join up, and process the request. This is a simplified example, but it illustrates the pattern.

Figure 3. To-Be process with automatic validation

In the To-Be version of the process (Figure 3), a new task attempts to automatically validate the request. Some cases still require manual work. However, the percentage for that path in the process is reduced. The path with the higher cost will be run less often, resulting in process savings.

Reports to run

Dynamic Analysis - Process Cases Analysis - Process Cases Summary – From this report (Figure 4), you see that the path with manual work takes longer and costs more.

Figure 4. Process cases summary

Dynamic Analysis - Process Comparison Analysis - Process Cost Comparison – From this report (Figure 5), you see that avoiding the high cost path saves money.

Figure 5. Process cost comparison

1.2 Email to human tasks

Many businesses have processes that are based on email messages flowing from person to person. Often, a spreadsheet is attached in which users enter data and approvals before sending the email on to the next person.

There are many problems with this approach. For example, Simple Mail Transfer Protocol (SMTP) email is not a guaranteed message delivery mechanism. There is no tracking, so it is impossible to know what state a process is in. If the person you send the email to is not available, there is no way for an administrator to reassign the work to someone else. If the email is in Bob’s inbox, it will sit there until he opens his mail. If you send a new email to someone else because Bob is not available, when he returns he will still have the original email and might try to process the request, even though the work is already done.

These are just some of the reasons why an email-based process is an anti-pattern that should be avoided.

Figure 6. As-Is process with manual steps

In the As-Is process (Figure 6), there are three levels of approval, all manual steps. The work flows from person to person based on emails. When a person receives the email, they must detach the spreadsheet, edit it, and save it. They then create a new email, attach the updated spreadsheet, and send the email to the next approver in the chain.

The To-Be process looks just the same, except that workflow is applied rather than email. All of the needed information is presented to the user, eliminating the need for the spreadsheet. With less manual work, less time is needed for each task, so the process becomes more efficient.

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Cost Comparison – In this report (Figure 7), it is clear that even with very conservative numbers, there is a significant reduction in the cost of the process. You should also factor in other intangible benefits, such as being able to monitor the process state, transfer work from one user to another, or even assign work to a group. Simulations can only look at the tangible benefits to the process.

Figure 7. Process cost comparison

1.3 Manual approvals to rule-based approvals

Often, an approval is required in a business process before the request can be processed further. Many times, the As-Is process includes a manual approval step, because that is the way it has always been done. However, if an automatic business rule could serve as the approval mechanism even a percentage of the time, significant cost savings could be realized.

In the As-Is process (Figure 8), the step to perform the approval is manual.

Figure 8. As-Is process with manual approval
Figure 9. To-Be process with automatic approval

In the To-Be version of the process (Figure 9), the Approve Request step is performed using a business rule rather than a person.

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison – From this report (Figure 10) you can clearly see that by removing the manual work, the requests can be processed many times faster (and with a lower cost).

Figure 10. Process duration comparison

If the person is not adding value, but instead merely checking the request and doing things that could be automated, the time and cost savings make this process improvement a must.

1.4 Manual lookup to automated service

Many times a business process needs to access data residing in a business system. For example, you might need to verify data or add data from the request to the customer record. In some business processes, this work is done manually by a person. For example, they might log on to a “green screen” application, check the data, and then continue processing the request.

However, when a person is performing work like this, they are not adding value. They are doing work that could be done much faster (and with higher accuracy) by using business process automation.

Figure 11. As-Is process with manual lookup

In the As-Is process (Figure 11), the Lookup Customer Data step is manual. A person logs on to a legacy application to find information to add to the customer request record before it can be processed. This activity would be an ideal candidate for a service. In the To-Be version, an automatic activity performs the lookup.

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison – In this report (Figure 12), you see a clear improvement in the process time, based on automating the lookup.

Figure 12. Process duration comparison

1.5 Manual process to automatic process

In some cases, all of the steps in a process are manual. Often it is left that way because that is how the process has always been done, or the business has never felt comfortable about a computer doing the work. By automating the process, there can be significant cost and time savings, making it very worthwhile.

Figure 13. As-Is process with manual steps

In the As-Is process (Figure 13) the work is done manually. A classifier shows that while each of these steps is manual, the business analyst believes they are all potential areas for workflow to be applied. In this particular process, a paper document flows between steps. It will take some time for the paper document to be delivered, so the task duration is set to 20 minutes, while the resource duration is set to 15 minutes. This leaves time in each task for the manual delivery of the paper.

Figure 14. To-Be process with automatic steps

In the To-Be version of the process, the classifier is changed to show that the process now applies workflow. The processing time is shorter because the workflow presents users with all of the information they need to do their jobs, reducing the time for each task. The information is now delivered electronically, eliminating the delay of waiting for the paper document.

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison – In this report (Figure 15), you can see that the To-Be process is twice as fast as the manual process.

Figure 15. Process duration comparison

1.6 Remove bottleneck

Some processes contain bottlenecks. For example, if there is a shortage of resources at one particular step, even if the rest of the process is infinitely fast, the work will still back up at the bottlenecked step. Identifying and removing the bottleneck from a process can have a large impact on the time and cost for the process.

Figure 16. As-is process with bottleneck

In the As-Is process (Figure 16), after automatically validating the data, a high-speed printer prints a physical check. The Print Check task takes just one second of the printer’s time, with a five-second activity duration to account for the serial transfer of data to the printer. However, there is only one printer: in the simulation profile, the maximum simultaneous tasks for the Print Check task is set to 1, so when the simulation runs, incoming tokens queue up waiting for the printer to become available.

Figure 17. To-Be process without bottleneck

In the To-Be version of the process (Figure 17), the physical check printing bottleneck is replaced with an electronic funds transfer. The service used for the transfer is actually slower, taking 2 seconds. However, an unlimited number of these tasks can run at the same time, so even though each task takes longer, the bottleneck is removed and tokens no longer queue up.
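The tradeoff is easy to check numerically. The sketch below uses the 5-second printer activity and 2-second transfer from the example; the one-check-per-second arrival rate is an assumption for illustration:

```python
import heapq

def elapsed_times(arrivals, service_s, servers=None):
    """Elapsed (completion minus arrival) time per token; servers=None means unlimited."""
    if servers is None:
        return [service_s] * len(arrivals)  # no contention: every token just takes service_s
    free_at = [0.0] * servers  # when each server next becomes free
    heapq.heapify(free_at)
    out = []
    for t in arrivals:
        start = max(t, heapq.heappop(free_at))  # wait if all servers are busy
        heapq.heappush(free_at, start + service_s)
        out.append(start + service_s - t)
    return out

arrivals = [float(i) for i in range(10)]           # one check per second
print(elapsed_times(arrivals, 5.0, servers=1))     # queue builds: 5, 9, 13, ...
print(elapsed_times(arrivals, 2.0, servers=None))  # flat 2 seconds per check
```

The slower-but-parallel service wins because throughput is limited by capacity, not by a single task’s duration.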

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison – In this report (Figure 18), you can see that the process has much better throughput without the bottleneck.

Figure 18. Process duration comparison

1.7 Resource shortage

One type of bottleneck in a process is a chronic shortage of resources. For example, there may be ten people working in a department. If they are not properly allocated to their tasks, one group of resources may be overworked while another group waits for work. Properly balancing the workload would have a positive effect on the business process.

Figure 19. As-Is process with resource shortage

In the process in Figure 19, the Payroll Admin role is used for both tasks. For the As-Is process, the resource pool for this role in the simulation is set to two, as Figure 20 shows:

Figure 20. Resource pool settings

When the simulation runs, the work is created faster than the allocated resources can process it, so there will be a resource shortage. The To-Be version of the process is exactly the same, except that the simulation runs with unlimited resources.

Reports to run

Dynamic Analysis - Aggregated Analysis - Activity Resource Allocation – In this report on the As-Is process (Figure 21), you see the average shortage duration for the process. In the To-Be process (Figure 22), the shortage duration is 0. In general, if you see such a shortage, you know that more of that particular resource needs to be allocated for an optimal process.

Figure 21. As-Is activity resource allocation
Figure 22. To-Be activity resource allocation

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison – This report (Figure 23) shows that if you properly set the number of resources, the process would be dramatically improved.

Figure 23. Process duration comparison

1.8 Sequential to parallel

Some processes are made up of a series of tasks. Sometimes tasks performed sequentially could be done in parallel. For example, if two tasks do not depend on each other, they could be done by two different resources in parallel, making the overall process run faster.

Figure 24. As-Is process with sequential steps

The As-Is version of the process (Figure 24) has a set of three steps, each performed by a different resource upon receiving a paper document. However, these tasks do not depend on each other. For example, a new hire could be added to the badge system either before or after being added to the phone system. If an electronic document were used instead of the current paper version, these steps could run in parallel.

Figure 25. To-Be process with parallel steps

In the To-Be version of the process (Figure 25), the steps are done in parallel: they branch out using a fork, and come back together using a join. With a join, all of the incoming paths must complete before the process continues. With a merge, the process continues as soon as any one of the paths completes. You can think of a merge as an OR operation, and a join as an AND operation. In this process, all three paths must complete before the process continues, so a join is required.
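The arithmetic behind the improvement is simple: sequential steps add their durations, while parallel branches behind a join cost only the longest branch (and a merge would cost only the shortest). A sketch with assumed durations for the three new-hire steps:

```python
# Assumed task durations in minutes, for illustration only.
durations = {"badge system": 20, "phone system": 15, "payroll system": 25}

sequential = sum(durations.values())      # one after another
join_parallel = max(durations.values())   # join (AND): wait for every branch
merge_first = min(durations.values())     # merge (OR): continue after the first branch

print(sequential, join_parallel, merge_first)  # 60 25 15
```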

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison – In this report (Figure 26), you see that even with the tasks done exactly the same way, with the same cost, the process time is cut in half because the steps were done in parallel.

Figure 26. Process duration comparison

Process improvement patterns summary

You can use each of the individual process improvement patterns as examples of typical things to look for when optimizing your business processes. For example, if you were examining an As-Is process, you could run the Activity Resource Allocation report as in section 1.7 to see whether resources are properly balanced. By looking at the process and understanding how the work is done, you might determine that a sequence of steps could be done in parallel, as in section 1.8. All of these patterns can add up to significant process improvements.


Advanced features

This section examines several advanced simulation features.

2.1 Business item instances

When running a simulation, you can use percentages on decisions to control the flow. Alternatively, you can define actual data values for the business items flowing through the process, called business item instances, and have the decisions evaluate the expression logic you define instead. This technique lets you send a realistic set of data through a process and view the results.

Figure 27. Business item settings

In the example in Figure 27, the NewEmployeeData business item is used in the context of a very simple process, shown in Figure 28:

Figure 28. Process using business item

In this process, after an automatic task verifies the data, the process flows down different paths depending on whether the person being hired is a new hire or a returning employee. For the purpose of simulation, the subprocesses are not implemented; only a duration is set for each of them. To simulate with business item instances, you need to create the instances themselves. I created four instances, based on Table 1:

Table 1. Instance data

             First Name   Last Name   SSN           Rehire   Start Date
  NewHire1   John         Doe         123-45-6789   No       03/06/2007
  NewHire2   Jane         Doh         234-56-7890   Yes      04/01/2007
  NewHire3   Jim          Dough       345-67-8901   No       03/27/2007
  NewHire4   Jill         Dow         456-78-9012   No

Follow these steps to create business item instances:

  1. Right-click on the business item, and select New Business Item Instance. The editor lets you fill in the values for that particular instance, as in Figure 29:
    Figure 29. Business item instance
  2. Before simulating, you need to define the logic for the decision. In this example (Figure 30), the Yes branch runs if the Rehire field is set to Yes. By changing from basic mode to intermediate mode or higher, you can use the expression builder to define the actual logic for the business decision. Make sure to click Apply before clicking OK to save the expression.
    Figure 30. Expression editor
  3. After creating the simulation snapshot, click on the decision. In the general tab, set Method of selecting an output path to Based on an expression, as in Figure 31:
    Figure 31. Simulation settings general tab
  4. To use business item instances, set the modeling mode to intermediate or higher. Click on the process background. If you don’t see a tab labeled Business Item Creation, then you are in basic mode. Switch to intermediate or advanced mode, then click the Business Item Creation tab, as Figure 32 shows.
    Figure 32. Business item creation tab
  5. Select the business item to highlight it, then click Create Simulation Values.
  6. In this example simulation scenario, the business items are divided equally across the four different business item instances. Select Weighted List as the rule for variable creation, click Add to add each one of the four business item instances to the list, and set the probabilities to 25.0% for each, as Figure 33 shows:
    Figure 33. Simulation value creation settings
  7. Now just one task remains: passing the business item instance through the first task. To set the output of the first task to the same value that flows in, you need to edit the business item creation settings for the task. Click the Verify Data task in your simulation snapshot, and click the Business Item Creation tab, as in Figure 34:
    Figure 34. Simulation settings business item creation tab
  8. Select the output, and click the Create Simulation Values button.
  9. Select the Rehire field (Figure 35) since it will be used in the decision. Under Attribute Values, click Add so you can define the expression that sets this variable.
    Figure 35. Create simulation values
  10. The field needs to be set based on an expression, so select the radio button for Value derived from an expression, then click Edit to open the expression editor, as in Figure 36:
    Figure 36. Enter new value dialog
  11. In the expression editor, define an expression that points to the Rehire field of the input, as in Figure 37. Select Modeling Artifact as the first term. Expand until you drill down to the task. Drill down into Input, and select the Rehire field. Click Apply, then click OK.
    Figure 37. Expression builder

When the simulation runs, the data for the Rehire field is passed through the first task so that the decision can use it. Rather than using the 50/50 percentages, the simulation uses the actual logic for the decision. You can view the results in the Dynamic Analysis - Process Instances Summary report (Figure 38). As the sample size grows, the results more closely match the expected 75/25 ratio from the business item instances used in this example.

Figure 38. Process instances summary
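Outside of Modeler, the weighted list plus expression-based routing amounts to the following sketch. The instance data comes from Table 1; the Python field names are my own:

```python
import random

# The four business item instances from Table 1 (only the Rehire field matters here).
instances = [
    {"name": "NewHire1", "rehire": "No"},
    {"name": "NewHire2", "rehire": "Yes"},
    {"name": "NewHire3", "rehire": "No"},
    {"name": "NewHire4", "rehire": "No"},
]

def run_simulation(tokens, seed=1):
    """Pick an instance per token (equal 25% weights) and route on the decision expression."""
    rng = random.Random(seed)
    yes = no = 0
    for _ in range(tokens):
        item = rng.choices(instances, weights=[25, 25, 25, 25])[0]
        if item["rehire"] == "Yes":   # the decision's expression: Rehire = Yes
            yes += 1
        else:
            no += 1
    return yes, no

yes, no = run_simulation(5000)
print(yes / (yes + no))  # converges toward the expected 25% Yes ratio
```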

Business item instance simulation summary

Using business item instances in a simulation can be useful, particularly if you want to test your logic, or run a simulation based on real data. Since you can set the values for the business items, you could do things like change a field to “validated” by using a constant value, rather than an expression as used in this example.

You can import and export your business item instances using XML in Modeler. If you want to run a simulation with actual production data, you can import it from XML so that you do not have to create each of the business item instances by hand. This might be useful if you needed to use a large number of different business item instances.

2.2 Distributions

When specifying a time in a simulation, you can use a static value that does not change, such as 5 minutes. In real life, however, tasks do not take exactly 5 minutes every time they run. Sometimes a task takes 3 minutes; other times, 8 minutes and 47 seconds. You use a distribution to vary the times and give a more realistic result. Modeler provides a variety of distributions to more closely model real-world behavior.

In this example, there are two processes representing a person working in a support department. The As-Is process uses a Specific Value for its task, specifying that it takes 30 minutes to perform. The simulation creates tasks at the rate of one every thirty minutes, so the work never has to wait for a resource, and there should never be a delay. My simulation ran 500 processes, to ensure a sample size large enough to be mathematically relevant. In this case, the customer has one person in its support department, so Maximum Simultaneous Tasks is set to 1.

In many tasks the time can vary widely. In a call center, for example, most calls can be answered quickly, since most incoming questions are the same. Sometimes a more difficult question arises, which causes the resource to spend a far greater amount of time solving the problem. If the process being modeled has this type of task, using the distribution would yield more accurate simulation results.

In the To-Be version of the process, I modeled the duration for the task using an Exponential distribution with a mean of 30 minutes, as Figure 39 shows:

Figure 39. Duration tab with distributions

As expected, the As-Is process never has a backup. Every 30 minutes, a task comes in; every 30 minutes, the previous task completes. All cases take exactly 30 minutes, as Figure 40 shows. If this were your model, you would believe that the resources for the task are adequate, because work never backs up.

Figure 40. Process duration without distribution

Now examine the results of the simulation with a distribution (Figure 41). The Working Duration, the time the task was actually being worked on, is close to 30 minutes. However, the Elapsed Duration, which includes the wait in the queue before the task, is 11.5 hours! On average, the tasks took 30 minutes, but with a distribution that more closely models the real world, you can see that there is indeed a significant backup of work.

Figure 41. Process duration with distribution
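You can reproduce this result with a short single-server queue simulation. The 30-minute interarrival time and 30-minute mean service time come from the example; the seed and token count are arbitrary choices:

```python
import random

def avg_elapsed_minutes(n=500, interarrival=30.0, mean_service=30.0,
                        exponential=True, seed=7):
    """Average elapsed (wait + work) time with one resource and regular arrivals."""
    rng = random.Random(seed)
    free_at = 0.0   # when the single resource next becomes free
    total = 0.0
    for i in range(n):
        arrive = i * interarrival
        service = rng.expovariate(1.0 / mean_service) if exponential else mean_service
        start = max(arrive, free_at)   # queue if the resource is still busy
        free_at = start + service
        total += free_at - arrive      # elapsed = queue wait + working time
    return total / n

print(avg_elapsed_minutes(exponential=False))  # exactly 30.0: work never backs up
print(avg_elapsed_minutes(exponential=True))   # far larger: variability alone causes a backlog
```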

Distribution summary

This very simple example illustrates how modeling a duration with a distribution, as opposed to a static specific value, can produce a radically different simulation result. The case above is extreme, but it shows why it is important to use distributions in simulations when appropriate.

2.3 Loops

Several steps are required to create a loop that simulates properly and also exports to WS-BPEL. For a summary, see Using Loops in WebSphere Business Modeler v6 to improve simulations and export to BPEL (developerWorks, March 2007).

Figure 42 shows the As-Is process, with a backwards-flowing connection. Figure 43 shows the To-Be process, correctly modeled with a loop that can be exported to WS-BPEL.

Figure 42. Process with backwards flowing connection
Figure 43. Process with loop

If you use a backwards-flowing connection instead of a loop, the simulation runs correctly, but you cannot export the process to WS-BPEL. It is a best practice to avoid backwards-flowing connectors.


Complete processes

This section provides complete processes, rather than just the process fragments used in the previous sections.

3.1 Process development

Modeler is a key tool in the business-driven development (BDD) cycle. You might ask: if you start with Modeler rather than a traditional development process, are there really time and cost savings? Modeler's simulation capabilities can answer this question.

In this section, I model and simulate a traditional development process. The As-Is version of the process uses a drawing tool to document the process, rather than beginning with a model. Because a picture is not adequate to capture enough information about the process for I/T to use in development, it also uses a document processor. In this example case study, the drawing tool is called Whiz-IO, while the document processor is called Wurt. After the process is documented, I/T begins building the process using WebSphere Integration Developer.

Figure 44. Partial process development model
Figure 44. Partial process development model

In the To-Be version of the process (Figure 44), Modeler replaces the drawing tool and the document processor. I/T adds the technical attributes to the model and then exports it, rather than creating everything by hand. Figure 44 shows only the first part of the process; for the full process, see the download file.

In both models, you'll notice a rework loop where the business analyst and I/T go back and forth until they agree on the business contract for what to develop. In the To-Be process with Modeler, this rework loop has fewer iterations on average, because models are precise in their representation, whereas pictures can be ambiguous.
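The effect of the rework loop on duration can be estimated with a small sketch. The 50% and 80% per-pass agreement probabilities below are assumptions chosen for illustration; they are not figures from this case study.

```python
import random

def average_rework_cycles(p_agree, trials=20_000, seed=7):
    """Mean number of review cycles until business and I/T agree."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        cycles = 1
        while rng.random() > p_agree:  # disagreement sends the work back around
            cycles += 1
        total += cycles
    return total / trials

as_is = average_rework_cycles(0.5)  # assumed: ambiguous pictures, 50% agreement
to_be = average_rework_cycles(0.8)  # assumed: precise models, 80% agreement
print(f"As-Is: {as_is:.2f} cycles on average, To-Be: {to_be:.2f}")
```

The number of cycles follows a geometric distribution with mean 1/p, so these assumptions give roughly 2.0 review cycles for the As-Is process and 1.25 for the To-Be process, which is where the duration savings in the rework loop come from.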

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison. In this report (Figure 45) you can clearly see that even with conservative estimates, the time is cut by 20%.

Figure 45. Process duration comparison
Figure 45. Process duration comparison

Dynamic Analysis - Process Comparison Analysis - Process Cost Comparison. In this report (Figure 46) you can see that the resource costs are over 30% lower for the To-Be version of the process.

Figure 46. Process cost comparison
Figure 46. Process cost comparison

Process development conclusions

The simulation clearly shows that using business driven development saves time as well as money when compared to a traditional development cycle.

3.2 Loan approval

The As-Is loan approval process for this section contains many of the process flaws found in the process improvement patterns section. Rather than looking at just process fragments, this section lets you look at a full-blown process. The To-Be version has the suggested process improvements in place. You can compare the two processes to help justify the cost of process automation.

Figure 47. Loan approval process part 1
Figure 47. Loan approval process part 1

The As-Is process (Figure 47) is paper based. A paper loan application first goes to a data validator, who looks up the customer information in a legacy computer system and compares it to what is on the paper application. If needed, the validator contacts the customer and updates the system. This is a good opportunity to apply the pattern of replacing manual work with an automated service: a person should be brought in only when there is a discrepancy in the data, because the expensive human resource should be used only where it adds business value. The To-Be version of the process contains these improvements, using an electronic version of the loan application request.

Continuing in the process (Figure 47), the loan processing department performs three steps. They pull the customer history from a computer system, print it, and add it to the folder. They also determine the customer's credit score by contacting two credit bureaus. Once those steps are complete, an in-house business rules inference engine calculates the loan risk. The first two tasks do not depend on each other, which makes them ideal candidates for the serial-to-parallel pattern. All three tasks could also be automated: rather than entering the loan information into the business rules inference engine and then writing down the result, a service could perform all of this work, passing data from the process.
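The benefit of the serial-to-parallel pattern is easy to quantify: running independent tasks concurrently reduces the path time from the sum of the durations to the maximum of the durations. The minute values below are hypothetical, chosen only to illustrate the arithmetic; the article does not publish these figures.

```python
# Hypothetical task durations in minutes (illustrative assumptions only)
pull_customer_history = 20
determine_credit_score = 45
calculate_loan_risk = 10

# Serial: the dependent risk calculation runs after both tasks, one at a time
serial = pull_customer_history + determine_credit_score + calculate_loan_risk

# Parallel: the two independent tasks overlap, so only the longer one counts
parallel = max(pull_customer_history, determine_credit_score) + calculate_loan_risk

print(f"serial path: {serial} min, parallel path: {parallel} min")
```

Even with these modest assumptions, the path drops from 75 to 55 minutes, and the savings compound across every instance flowing through the process.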

Figure 48. Loan approval process part 2
Figure 48. Loan approval process part 2

After assessing the loan for risk, the next step is to pull 5% of the loan requests for auditing, as Figure 48 shows. The auditor rechecks the validity of the data, making sure no mistakes were made, and re-runs the risk assessment to confirm that it too is accurate. Knowing that they may be audited helps keep the employees of the loan processing department honest and reduces mistakes. If the tasks were automated instead of manual, inaccurate data, human mistakes, and the need for an audit would all be eliminated. These steps become unnecessary in the To-Be version of the process, as Figure 49 shows:

Figure 49. Loan approval process part 3
Figure 49. Loan approval process part 3
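In a simulation, a branch like the audit step is modeled as a probabilistic decision: each token independently takes the audit path with 5% probability. The sketch below shows how that sampling plays out over 500 instances (the same instance count used in the simulations later in this section); the seed is arbitrary.

```python
import random

rng = random.Random(2007)
instances = 500
# Each loan request is independently pulled for audit with 5% probability
audited = sum(1 for _ in range(instances) if rng.random() < 0.05)
audit_rate = audited / instances
print(f"{audited} of {instances} requests audited ({audit_rate:.1%})")
```

The realized count hovers around 25 but varies from run to run, which is why simulation results for probabilistic branches are best read as averages over many instances rather than exact figures.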

Because it is a very conservative financial institution, the bank does not accept high-risk loans. Instead, it refers them to a sub-prime lender who specializes in risky loans. The bank must first contact the customer to get permission to pass their information along to the other lender, so a person from the loan processing department calls the customer. If the customer approves, the loan processing department enters their information into a Web application for the sub-prime lender, then files away the paperwork and updates the customer history. In the To-Be version of the process, the customer still needs to be contacted, but the update to the other lender can be done electronically, saving time and money.

Figure 50. Loan approval process part 4
Figure 50. Loan approval process part 4

If the loan is determined to have a medium or low risk, the paper folder is sent on to a loan officer who makes the final approval determination. They need to search through all of the documents to find the information needed to make their decision. In the To-Be process with workflow applied, we consolidate the information from the loan record onto one screen so they can have everything they need to make their decision right at their fingertips. This will save time and reduce errors.

If the loan is approved, it is sent on to the loan completion department. If it is rejected, customer service contacts the customer and loan processing does the paperwork to close out the application, update the customer history system, and file away the paperwork for archiving. These manual tasks are automated in the To-Be version of the process.

Simulating the processes

I simulated both the As-Is and To-Be processes using 500 process instances with a token creation rate of one every five minutes. Table 2 shows the number of each resource used:

Table 2. Numbers of resources for simulation

Role                 As-Is   To-Be
Data Validators      10      0
Loan Processing      20      10
Loan Officer         5       5
Auditor              2       0
Customer Service     20      20

The To-Be process needs fewer resources because of the process automation techniques we applied.

Reports to run

Dynamic Analysis - Process Comparison Analysis - Process Duration Comparison. This report (Figure 51) shows that eliminating paper from the process, along with the other improvements, makes a dramatic difference: instead of taking days to complete, the work can now be done in hours.

Figure 51. Process duration comparison
Figure 51. Process duration comparison

Dynamic Analysis - Process Comparison Analysis - Process Cost Comparison. This report (Figure 52) shows a much lower cost for the To-Be version of the process. We reduced resource costs by over 93%! The cost reduction shows that the project would pay for itself in just a few months.

Figure 52. Process cost comparison
Figure 52. Process cost comparison
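A rough payback calculation shows why the project pays for itself so quickly. Every figure below except the 93% reduction is a hypothetical assumption; substitute your own monthly resource cost and project cost.

```python
# Assumed As-Is resource cost per month (hypothetical, not from the article)
as_is_monthly_cost = 100_000
# Cost reduction taken from the Process Cost Comparison report
reduction = 0.93
monthly_savings = as_is_monthly_cost * reduction

# Assumed one-time cost of the automation project (hypothetical)
project_cost = 250_000
payback_months = project_cost / monthly_savings
print(f"payback period: {payback_months:.1f} months")
```

With these assumptions, the project pays for itself in under three months, consistent with the report's conclusion.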

Dynamic Analysis - Aggregated Analysis - Activity Resource Allocation. The analysis of the As-Is process in Figure 53 clearly shows that work is backing up due to a shortage of resources. Every role other than loan completion has a shortage and would need additional people to make the process efficient. When you compare the cost of adding those people to the cost of automating the process, it is clear that automation is the better long-term solution.

Figure 53. As-Is process activity resource allocation
Figure 53. As-Is process activity resource allocation

By contrast, the To-Be process in Figure 54 shows only one resource that acts as a bottleneck: the loan officer. Because most of the tasks are automated, there are few constraints on the number of them that can be performed in parallel. The bank may decide that performance is adequate even with this small bottleneck, knowing that as it grows, it might need to add more loan officers to keep the process as efficient as possible.

Figure 54. To-Be process activity resource allocation
Figure 54. To-Be process activity resource allocation
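A quick utilization check explains why the loan officer remains the one bottleneck. This is standard queuing arithmetic; the 20-minute review time below is an assumed value, not one taken from the model.

```python
def utilization(arrival_interval_min, service_min, servers):
    """Offered load per server; at or above 1.0, work arrives faster
    than it can be completed and the queue grows without bound."""
    return (service_min / arrival_interval_min) / servers

# Assumed values: one request every 5 minutes, a 20-minute review, 5 loan officers
rho = utilization(5, 20, 5)
print(f"loan officer utilization: {rho:.0%}")  # 80% -- busy, but stable
```

At 80% utilization the role is heavily loaded but keeps pace; a 30-minute review under the same assumptions would push utilization past 100%, which is when work starts backing up and more loan officers become necessary.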

Dynamic Analysis - Process Cases Analysis - Cases Summary. Running this report on the As-Is process (Figure 55) makes it clear that there are several high-cost paths (cases 8-10). A goal for the To-Be process is to avoid these paths if possible, or at least to reduce the number of times they are followed.

Figure 55. As-is process cases summary
Figure 55. As-is process cases summary

The To-Be process cases summary in Figure 56 shows more possible paths, but none of them approach the cost of the ones in the As-Is process. Case 14 took longer than one day, but the details show that it ran only once out of 500 instances. With such a low percentage of use, it would not be cost effective to spend time trying to improve this path further.

Figure 56. To-Be process cases summary
Figure 56. To-Be process cases summary

Loan approval conclusions

This example applies many of the process improvement patterns from the previous sections in the context of a larger business process. Each individual improvement has a positive effect, but taken together, the results can be dramatic.


Conclusion

This article showed how to use WebSphere Business Modeler simulations to improve a business process. We studied advanced simulation features and examined full process examples to show how you could apply the different process improvement patterns to a full business process.


Download

Description     Name                   Size
Demo sample     simulation-demo.mar    4.2 MB

