Thousands of Workflow Events in TRIRIGA queued up by Data Integrator or Web Services will prevent regular user & system Workflows from executing
Fabio L Pinto 270003DRX7 Visits (7730)
If you have developed an interface that submits thousands of Workflow Events to your process server, they will likely create a huge queue to be processed, causing subsequent required and essential user & system Workflows to queue up behind them while those Events are processed. At this point your system will get stuck, with sessions waiting for required Workflows to run.
Ideally you should submit such Workflow Events in "chunks" or small batches, so that the system is not impacted by a large backlog of Workflows queued up waiting for processing to finish.
If it is too late and you have already submitted those thousands of records, they may take a considerable amount of time to process: hours or even days, depending on the quantity, the complexity, and the system resources available.
The current count of Workflow Events can be confirmed by checking the "IBM TRIRIGA Admin Console" -> "WorkFlow Events" managed object page. You can estimate how long the queued Workflow Events and the recently added ones (user & system) will take to process by checking that page regularly and noting how many records have left the queue (number of Events queued up, trend).
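The trend-based estimate described above can be sketched as a simple calculation; the function name and its linear-drain-rate assumption are illustrative, not part of the product:

```python
# Rough drain-time estimate from two queue-depth readings taken some
# minutes apart (illustrative only; assumes the drain rate stays constant).
def estimated_drain_minutes(count_before, count_after, minutes_between):
    """Estimate minutes until the Workflow Event queue empties."""
    drained = count_before - count_after
    if drained <= 0:
        return None  # queue is not shrinking; no estimate possible
    rate_per_minute = drained / minutes_between
    return count_after / rate_per_minute
```

For example, if the queue drops from 10,000 to 9,000 Events over 10 minutes, the remaining 9,000 Events would take roughly 90 more minutes at that rate.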
To manage this situation properly, review the following actions:
A01) Make sure you do have
A02) Make sure you only have one Workflow agent running with an open filter (no filter, no list of users). Having two or more Workflow agents running with no filter criteria will likely slow down processing, since they may compete for the same resources and records. See more information on our IBM TRIRIGA Wiki page "Whe
If adjusting your system to these recommendations does not help, you may try the following alternative approach for handling this situation.
***NOTE! The following procedure is NOT a supported process. This document is intended to assist clients in finding a workable solution when best practices were not followed and the system is no longer functioning properly. The steps below are presented as an option but may present a risk if not executed correctly;
AL01) First, you need to get familiar with how Workflow Events work, so please review our IBM TRIRIGA Wiki page "Wor
AL02) Make sure you have a good backup of your database in place. It is strongly recommended that you try the following steps on a lower environment first (testing, sandbox, development).
AL03) Determine the criteria for selecting the Workflow records coming from your interface, so that you can separate them from the required user & system ones. Once you have this information you will be able to proceed;
AL04) Review, adapt, and follow the instructions below:
1) Stop the Workflow agent (IBM TRIRIGA Admin Console -> Agent Manager);
2) Create a table wf_event_backup as a copy of the current wf_event table, then truncate wf_event (make sure you have a good database backup in place). For example:
drop table wf_event_backup; -- only if a backup table from a previous run already exists
create table wf_event_backup as select * from wf_event;
truncate table wf_event;
3) Start the Workflow agent again (IBM TRIRIGA Admin Console -> Agent Manager), so that the Events you re-insert below can be processed;
4) Insert into the wf_event table the Workflow Events for the users that are not involved in your interface process, selecting them from wf_event_backup; delete those rows from the wf_event_backup table once they have been inserted and processed;
Example given (note: you need to use your own criteria here; adapt and replace the where clauses below):
insert into wf_event select * from wf_event_backup where user_id <> [user-id];
... or ...
insert into wf_event select * from wf_event_backup where event_id not like '%Associate%';
5) Once the Workflows above have been processed, insert records from the wf_event_backup table into wf_event in batches of 500 at a time (for example, based on row number). From this point on you will be working with the Workflow Events coming from your interface. Delete each batch of rows from wf_event_backup once it has been inserted and processed.
Example given (for Oracle, where rownum limits the batch size; note that top(500) is SQL Server syntax, not Oracle):
insert into wf_event select * from wf_event_backup where event_id like '%Associate%' and rownum <= 500;
6) Repeat step 5 above until all of the records coming from the Interface have been processed.
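The batch-move loop in steps 4–6 can be sketched as follows. This is a minimal illustration using SQLite in place of the real TRIRIGA database; the two-column table layout and the '%Associate%' filter are assumptions carried over from the examples above, so adapt both to your actual schema and criteria:

```python
# Illustrative batch-move loop: drain interface events from wf_event_backup
# into wf_event a fixed number of rows at a time (SQLite stand-in database).
import sqlite3

BATCH_SIZE = 500

def move_batch(conn, batch_size=BATCH_SIZE):
    """Move up to `batch_size` interface events from wf_event_backup to
    wf_event and delete them from the backup table. Returns rows moved."""
    cur = conn.cursor()
    rows = cur.execute(
        "SELECT rowid, event_id, user_id FROM wf_event_backup "
        "WHERE event_id LIKE '%Associate%' LIMIT ?", (batch_size,)
    ).fetchall()
    for rowid, event_id, user_id in rows:
        cur.execute("INSERT INTO wf_event (event_id, user_id) VALUES (?, ?)",
                    (event_id, user_id))
        cur.execute("DELETE FROM wf_event_backup WHERE rowid = ?", (rowid,))
    conn.commit()
    return len(rows)
```

In the real procedure you would wait for the Workflow agent to drain wf_event between batches (step 6), rather than moving the next batch immediately.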
Fabio L Pinto 270003DRX7 Visits (9734)
When running the IBM TRIRIGA Platform Installer, you may turn on the LAX_DEBUG parameter so the installer runs in DEBUG mode:
LAX_DEBUG=true <installer command line>
a) LAX_DEBUG is the parameter per se; setting it to true enables the DEBUG output.
For Linux/Unix, use bash or sh shell for executing the installer using LAX_DEBUG.
For Windows, use a command prompt / shell and make sure you use the "Run As Administrator" right-click option when executing, so that administrator security rights are correctly set for the session.
The extra DEBUG log lines are printed to the console; ant.log is not impacted. Copy and paste the console output as text so that you can better review the installer tracing information. It can be really useful when troubleshooting IBM TRIRIGA installer runs, and it is part of the information IBM TRIRIGA Support would request for these cases.
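On Linux/Unix, the inline-assignment form means the variable applies only to the launched process. Using a trivial stand-in command in place of the real installer binary (whose name varies by release), the mechanics look like this, with tee keeping a copy of the console output for support:

```shell
# Stand-in for the installer binary, just to show the mechanics:
# the inline LAX_DEBUG assignment is visible to the launched process,
# and tee saves the console output to installer_debug.txt for review.
LAX_DEBUG=true sh -c 'echo "debug mode: $LAX_DEBUG"' | tee installer_debug.txt
```

With the real installer you would replace the sh -c stand-in with the installer command line itself, as shown earlier.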
What is the TRIRIGA Workflow started for a Schedule Event generated by a Payment Reconciliation from a Lease with an Audit Clause?
Fabio L Pinto 270003DRX7 Visits (11678)
The TRIRIGA workflow fired is "tri
This is how it works:
a) When a lease is activated, the workflow "triLeaseClause - Synchronous - Create Audit Service Included from Selected" is fired on the Lease Clause to create the Payment Audit Setup record.
b) Then the "tri
on scheduled start dates.
If you see no Workflow being started for the Schedule Events created for the Payment Reconciliation on Leases with Audit Clauses, and you are running Microsoft SQL Server, check whether your IBM TRIRIGA Platform version includes the following fix:
APAR #: IV76293
If you are still seeing issues with this process and it can be reproduced on a lower environment (testing, sandbox, QA), it is helpful to temporarily enable Workflow Instance Recording on that environment to trace the Workflows and actions fired for the lease record, and to check the flow and any warning/error messages issued during the process. Note that Workflow Instance Recording can cause slowdowns and performance issues across the whole system, so it should be used only on lower environments (never in Production!) for temporary tracing and debugging of workflows; set it back from "Always" to "Errors Only" as soon as you are done with your analysis.
For more information on using Workflow Instance Recording, kindly review our
AcdntlPoet 2700019V2G Visits (10167)
How to index DOORS 9.6 artifacts in Lifecycle Query Engine - John Carolan demonstrates the steps required to enable DOORS 9.6 as a Tracked Resource Set (TRS) provider and how to add that feed to Lifecycle Query Engine (LQE) as a new Data Source.
Fabio L Pinto 270003DRX7 Visits (11056)
a) triPeople - triRetire - Remove TRIRIGA User and Read Only Dependant Records
b) triPeople - Synchronous - Remove TRIRIGA User My Profile
These move the record to the Retired state, meaning the records are still retained in the system. The only transition able to remove them from the database is setting them to NULL.
Each People BO record occupies about 50 KB on average, so if you are performing a massive deletion, for instance deleting 100,000 records, this means about 5 GB being processed by the triRetire process at that time.
Using triRetire is the only supported process for archiving triPeople BO records.
If there is a need to perform a massive retire process in the system, Data Integrator may not be a good choice. Web Services is a better option, and the integration could be enhanced to watch the number of workflows in the queue by checking "monitor.jsp" (Monitor a single value).
The web service code would parse and check for the numeric value returned from a URL like http
If the value is over a threshold (start with 9000, for example), the integration would pause and wait until the queue has halved (4500, for example).
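The pause-until-halved idea above can be sketched as a small polling loop. This is an illustration only: the function names are hypothetical, and `fetch` stands in for whatever code retrieves the numeric monitor.jsp value as text:

```python
# Minimal sketch of the queue-throttling idea: poll a numeric queue depth
# and, once it exceeds a threshold, wait until it has dropped to half the
# threshold before resuming the integration.
import time

def queue_depth(fetch):
    """fetch() returns the monitor response body as text, e.g. '9500'."""
    return int(fetch().strip())

def wait_for_queue(fetch, threshold=9000, sleep_seconds=60, sleep=time.sleep):
    """Block while the workflow queue exceeds `threshold`, returning once
    it has drained to half the threshold. Returns the last depth seen."""
    depth = queue_depth(fetch)
    if depth <= threshold:
        return depth  # queue is healthy; no need to pause
    target = threshold // 2
    while depth > target:
        sleep(sleep_seconds)
        depth = queue_depth(fetch)
    return depth
```

The integration would call a function like this between batches of retire requests, resuming submission only when the queue has drained.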
Note that there isn't a direct way to call a workflow using Web Services. You cause it to be executed for a given record by performing the action (transition) that the workflow is tied to on that record. For instance, if you have a workflow tied to an 'Activate' action, then using the Web Service to activate the record will cause the required workflows to execute.
More information about IBM TRIRIGA WebServices can be found on our
"You do not have permission to access this page" error message when the requester tries to open a service request record sent back from the approver for clarification
Fabio L Pinto 270003DRX7 Visits (10331)
If you have a TRIRIGA user who successfully creates a service request record that goes through an approval workflow, but the approver sends it back requesting clarification, the user may receive an error message saying "You do not have permission to access this page. Please, contact your TRIRIGA administrator. Thank you.".
This may happen when the user has only the "TRIRIGA Request Central" license assigned.
If this is the case and the Security Group setup allows it, the user can successfully create the service request record that goes through the approval workflow, and that license allows the Notification action. But if the record is sent back by the approver for clarification, this requires access to the action Item Record Type (Wor
Here follows the list of the IBM TRIRIGA licenses providing access to the action Item Record Type WorkFlowActionItem:
IBM TRIRIGA Facility Management Enterprise
The solution for that error is to add to the requester user any one of the licenses above, so that the WorkFlowActionItem action can be processed properly, as designed. The rest of security relies on the Security Group setup for that user, so you can still restrict access to any BO or Form per your business needs & requirements.
AcdntlPoet 2700019V2G Visits (10574)
IBM IoT Real-Time Insights – Analytics Designed for the Internet of Things: As the Internet of Things (IoT) expands rapidly, more and more “things” are reporting their properties, location, and status in near real-time. This generates a huge volume and variety of data that is under-utilized…or often not used at all! Enterprises can leverage this data to understand the state of operations and equipment to better run their businesses. The key to achieving that efficiency is to utilize IoT data effectively to drive business decisions and results... [Read more...]
IoT Real-Time Insights integrates IFTTT and Node-RED: Real-time analytics provide insights from streaming IoT data, but the key is taking the appropriate action as a result of those insights, and IoT Real-Time Insights helps you do both. Recently, we made some significant updates to the service that dramatically improve the insights (improved analytics capabilities) and the available actions allowing you to... [Read more...]
IoT Real-Time Insights consumes the data and device information, enriches that data with asset master records and weather data, and applies rules to take action when conditions warrant, enabling you to gain awareness of equipment and operations to make better decisions, improve availability, and respond more quickly to emerging conditions.
AcdntlPoet 2700019V2G Visits (10058)
Maximo - Verifying deployed XMLs using the WebSphere Console: How to confirm deployments in the WebSphere Console related to integrations By Eugene Bonks
AcdntlPoet 2700019V2G Visits (10373)
5 Things to Know about API Management in Bluemix- There is a lot of buzz around the API Economy. The API Economy is where a company, the provider company, decides to expose their core business logic in the form of APIs that third parties, the consumers, can consume and build applications that unlock... [Read on for the 5 things...]
AcdntlPoet 2700019V2G Visits (10607)
Maximo 7.6 My Recent Applications - A new feature in Maximo 7.6 - My Recent Applications demonstrated and explained by Patrick Nolan
AcdntlPoet 2700019V2G Visits (9781)
Maximo 7.6 - Updating data in Maxdemo for use with Scheduler: Updating the data provided in Maxdemo so it can be used by Scheduler by Dick Chertow.
AcdntlPoet 2700019V2G Visits (9714)
Engineering - Putting IoT to Work for Me: Bret Greenstein Vice President, Rational Continuous Engineering Solutions IBM Software Group discusses how he put IoT to use for him to solve a simple problem: he wanted to turn off his lights from bed but was unable to reach the switch to do so... enter the Internet of Things!
AcdntlPoet 2700019V2G Visits (9602)
Link Validity — Coming in CLM version 6.0.1: As part of the Rational solution for Collaborative Lifecycle Management (CLM) version 6.0.1 we are introducing Link Validity, a new feature in DOORS Next Generation, Quality Manager, and Design Manager. In projects that have configuration management enabled, Link Validity can take advantage of the features that multi-stream development brings to the table.
AcdntlPoet 2700019V2G Visits (10473)
Robin notes about the tutorial creator: "Yianna Papadakis-Kantos authored another great tutorial for DOORS Next Generation. The tutorials she writes are always educational and helpful. You'd expect nothing less from a curriculum architect and instructional designer."
Take control of your requirements projects with Configuration Management- Author: Yianna Papadakis-Kantos
Get hands-on experience with IBM® Rational® DOORS® Next Generation and the configuration management capabilities it supports using the exercises in this tutorial and accessing DOORS Next Generation in the sandbox.
See the other top two tutorials here!
AcdntlPoet 2700019V2G Visits (6989)
For more information on this new feature add, head over to the IoT Developers blog post: Devices can now send events over HTTP to IoT Foundation
If you want to try out this support but haven’t signed up and created an organization then you can use the Quickstart service. Quickstart supports HTTP without security to allow you to quickly try things out. See the full documentation for HTTP/S support.
Device Management is now live in IoT Foundation as well! Check out this informative blog post to learn more about a major new enhancement we've just made to the Internet of Things Foundation service. We call it Device Management, and it's all about making it easy and efficient to manage lots of IoT devices.