Posted by AE91_SHINJI_KANAI
Technote 1973531 - This technote provides all you need to automate the deploy and build process of source code generated by IBM
Posted by AcdntlPoet
If you are planning upgrades to any of your IBM products in 2016, you will want to review the Support Lifecycle page first to determine if the product versions currently installed have reached (or are nearing) their end of support. There are a number of product versions that are scheduled for End of Support in 2016.
The End of support (EOS) date listed is the last date on which IBM will deliver standard support services for a given version/release of a product. This does not include any extended support contracts you may have with IBM, so please refer to your specific documentation if your agreements go beyond standard support.
For a deeper understanding and navigation tips, watch the IBME
Posted by AcdntlPoet
Setting up TRIRIGA Notifications with Office 365 - If you are using Office 365 as part of your Reserve integration, then you probably want to have notifications set up as well. Kenny To has posted a quick and easy solution in the Real Estate and Facilities Management blog here.
Learning the IBM TRIRIGA 10.3 Portfolio is a must - In a continuation of his previous blog post (Being a New TRIRIGA Engineer), Mark Raymund Dunham writes about the importance of the IBM TRIRIGA 10.3 Portfolio Self-Paced Virtual Class here.
Posted by AcdntlPoet
Welcome to the Cognitive Era - A new era of technology. A new era of business. A new era of thinking. A Cognitive Business is a business that thinks. A new era of technology is giving rise to a new era of business. Digital is not the destination but the foundation for a new era of business; we call it cognitive business, and IBM Watson is the platform. Today Watson is helping doctors re-imagine medicine, and leaders reshape industries as diverse as retail, banking and travel. And Watson is taught by industry experts, so their know-how can reach more practitioners.
Posted by AcdntlPoet
Migrating data from IBM Rational DOORS to IBM Rational DOORS Next Generation - With the release of IBM Rational DOORS 188.8.131.52 and IBM Rational DOORS Next Generation V6.0.1, the migration has been optimized: it now automatically creates a global type system and artifact types in IBM Rational DOORS Next Generation, considering the commonalities of the type system defined in modules within DOORS, as well as link relationships. In addition, you can now incrementally migrate a project, and the two applications will automatically maintain and update link relationships with every incremental migration.
Migration is not an interchange of data, where data can go from one tool to another and possibly back again. Migration is a one-way move from Rational DOORS to Rational DOORS Next Generation, with traceability back to the source.
Using a migration package, migration can include one or more modules and can be incremental, migrating a little at a time, and only what you need. The current version of the modules is migrated along with internal, external, and OSLC links, OLE objects, pictures, and so forth.
The history of the migrated data is not migrated; however, the migration creates links in Rational DOORS Next Generation back to the corresponding records in Rational DOORS.
Posted by AcdntlPoet
Migrate data from Rational DOORS to Rational DOORS Next Generation - Kim Letkeman, Senior Technical Staff Member, IBM and Martin Henderson, Development Manager, IBM, lead you through planning and implementing a DOORS migration project.
Plan and execute a migration project from IBM® Rational® DOORS® to IBM Rational DOORS Next Generation. This article provides a list of migration terms and definitions (as opposed to interchange) and describes the phases and tasks involved when moving active data into a Collaborative Lifecycle Management environment.
Posted by AcdntlPoet
Maximo 7.5 - Configuring AUTOKEY: How to set up AUTOKEY to auto number a field by Leandro Garcia
Thousands of Workflow Events in TRIRIGA queued up by Data Integrator or Web Services will prevent regular user & system Workflows from executing
Posted by Fabio L Pinto
If you have developed an interface that submits thousands of Workflow Events to be executed by your process server, they will likely create a huge queue, causing the regular, required user & system Workflows to queue up behind them while those Events are processed. At that point your system will appear stuck, with user sessions waiting for required Workflows to run.
Ideally you should be submitting such Workflow Events in "chunks" or small batches, so that the system is not impacted by a large number of Workflows queued up waiting for processing to finish.
If it is too late and you have already submitted those thousands of records, processing may take a considerable amount of time: hours or even days, depending on the quantity, the complexity, and the system resources available.
The current count of Workflow Events can be confirmed on the "Workflow Events" managed-object page in the IBM TRIRIGA Admin Console. You can get an idea of how long the queued Workflow Events (plus the recently added user & system ones) will take to process by checking that page regularly and noting how many records have left the queue (that is, the trend in the number of queued Events).
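As a rough sketch of that trend-based estimate, the helper below takes two queue counts read from the Admin Console some minutes apart and projects how long the remaining backlog will take to drain. The sample readings are hypothetical, not taken from any real system.

```python
def estimate_drain_hours(count_t0, count_t1, interval_minutes, backlog):
    """Project hours to clear `backlog` Workflow Events, given two queue
    counts taken `interval_minutes` apart on the Admin Console page."""
    processed = count_t0 - count_t1
    if processed <= 0:
        return float("inf")  # queue is flat or growing; no useful estimate
    rate_per_minute = processed / interval_minutes
    return backlog / rate_per_minute / 60

# Hypothetical readings: 120,000 then 117,000 events, 30 minutes apart,
# i.e. 100 events/minute, so 117,000 remaining events take ~19.5 hours.
hours = estimate_drain_hours(120_000, 117_000, 30, 117_000)
```

If the estimate keeps coming back as infinite (the count is not dropping), that is a sign the Workflow agent is not keeping up at all and the recovery steps below may be needed.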
For managing this situation properly, review the following actions:
A01) Make sure you do have
A02) Make sure you only have one Workflow agent running with an open filter (no filter, no list of users). Having two or more Workflow agents running with no filter criteria will likely slow down processing, since they may compete for the same resources and records. See more information on our IBM TRIRIGA Wiki page "Whe
If adjusting your system to those recommendations does not help, you may try the following alternative way for handling this situation.
***NOTE! The following procedure is NOT a supported process. This document is intended to assist clients in finding a workable solution when they have not followed best practices, so that the system can function properly again. The steps below are presented as an option but may present a risk if not executed correctly;
AL01) First, you need to get familiar with how Workflow Events work, so please review our IBM TRIRIGA Wiki page "Wor
AL02) Make sure you have a good backup of your database in place. It is strongly recommended that you try the following steps on a lower environment first (testing, sandbox, development).
AL03) You need to determine the criteria for selecting the Workflow records you are bringing from your Interface, so that you can separate them from the user & system required ones. Once you have this information you are able to proceed;
AL04) Review, adapt, and follow the instructions below:
1) Stop the Workflow agent (IBM TRIRIGA Admin Console -> Agent Manager);
2) Create a table wf_event_backup as a copy of the current wf_event table, then truncate wf_event (make sure you have a good database backup in place). For example, dropping any previous backup table first:
drop table wf_event_backup;
create table wf_event_backup as select * from wf_event;
truncate table wf_event;
4) Insert into the wf_event table the Workflow rows for the users that are not involved in your interface process; delete those rows from the wf_event_backup table once they have been inserted and processed.
Example (note: you need to use your own criteria here; adapt and replace the where clauses below):
insert into wf_event select * from wf_event_backup where user_id <> [user-id];
... or ...
insert into wf_event select * from wf_event_backup where event_id not like '%Associate%';
5) Once the Workflows above have been processed, insert 500 records at a time into the wf_event table from the wf_event_backup table, based on row number. From this point on you are working with the Workflow Events that came from your interface. Delete each batch of 500 from wf_event_backup after it has been inserted and processed.
Example (for Oracle; note that top(n) is SQL Server syntax, so rownum is used here):
insert into wf_event select * from wf_event_backup where event_id like '%Associate%' and rownum <= 500;
6) Repeat step 5 above until all of those records have been processed (the ones coming from Interface).
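The loop in steps 5 and 6 can be sketched as follows. This is an illustration only: it uses SQLite in place of your production database (so the rowid/LIMIT selection must be swapped for rownum, TOP, or LIMIT as appropriate), and the table and column names are taken from the examples above; adapt everything to your own schema and criteria.

```python
import sqlite3

def drain_backup_in_batches(conn, batch_size=500, pattern="%Associate%"):
    """Move interface-generated events from wf_event_backup back into
    wf_event one batch at a time, deleting each batch from the backup
    table once it has been re-inserted (steps 5-6 above)."""
    cur = conn.cursor()
    while True:
        # Select up to batch_size matching rows; SQLite rowid/LIMIT here,
        # use rownum (Oracle) or TOP (SQL Server) on a real system.
        cur.execute(
            "SELECT rowid FROM wf_event_backup WHERE event_id LIKE ? LIMIT ?",
            (pattern, batch_size),
        )
        ids = [row[0] for row in cur.fetchall()]
        if not ids:
            break  # backup table drained: all interface events re-queued
        marks = ",".join("?" * len(ids))
        cur.execute(
            f"INSERT INTO wf_event SELECT * FROM wf_event_backup "
            f"WHERE rowid IN ({marks})", ids)
        cur.execute(
            f"DELETE FROM wf_event_backup WHERE rowid IN ({marks})", ids)
        conn.commit()
        # In a real run, poll the Admin Console queue here and wait until
        # the batch has been processed before moving the next one.
```

The key design point is the same as in the manual steps: only a bounded slice of the backlog is ever visible to the Workflow agent at once, so user & system Workflows can interleave between batches.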
Posted by Fabio L Pinto
When running the IBM TRIRIGA Platform Installer, you may turn on the LAX_DEBUG parameter so that the installer runs in DEBUG mode:
LAX_DEBUG=true <installer command line>
a) LAX_DEBUG is the parameter per se;
For Linux/Unix, use bash or sh shell for executing the installer using LAX_DEBUG.
For Windows, use a command prompt and make sure you use the "Run As Administrator" right-click option when executing, so that administrator security rights are correctly set for the session.
The extra DEBUG log lines are printed to the console; the ant.log is not affected. Copy and paste the console output as text so that you can better inspect the installer tracing information. This can be very useful when troubleshooting IBM TRIRIGA installer runs, and it is part of the information IBM TRIRIGA Support will request for these cases.
What is the TRIRIGA workflow started for the Schedule Event generated by a Payment Reconciliation from a Lease with an Audit Clause?
Posted by Fabio L Pinto
The Tririga workflow fired is "tri
This is how it works:
a) When a lease is activated, the workflow "triLeaseClause - Synchronous - Create Audit Service Included from Selected" is fired on the Lease Clause to create the Payment Audit Setup record.
b) Then the "tri
on scheduled start dates.
If you see no WorkFlow being started for the Schedule Events created for the Payment Reconciliation on Leases with Audit Clauses and you have Microsoft SQL Server in place, check if you have the following fix included on your IBM TRIRIGA Platform version:
APAR #: IV76293
If you are still seeing issues with this process and they can be reproduced on a lower environment (testing, sandbox, QA), it is useful to temporarily enable Workflow Instance Recording on that lower environment to trace the Workflows and actions fired for the lease record, and to check the flow and any warning/error messages issued during the process. Note that Workflow Instance Recording can cause slowdowns and performance issues across the whole system, so it should be used only on lower environments (never on Production environments!) for temporary tracing and debugging of workflows; set it back from "Always" to "Errors Only" as soon as you are done with your analysis.
For more information on using Workflow Instance Recording, kindly review our
Posted by AcdntlPoet
How to index DOORS 9.6 artifacts in Lifecycle Query Engine - John Carolan demonstrates the steps required to enable DOORS 9.6 as a Tracked Resource Set (TRS) provider and how to add that feed to Lifecycle Query Engine (LQE) as a new Data Source.
Posted by Fabio L Pinto
a) triPeople - triRetire - Remove TRIRIGA User and Read Only Dependant Records
b) triPeople - Synchronous - Remove TRIRIGA User My Profile
This moves the records to the Retired state, meaning they are still retained in the system. The only transition able to remove them from the database is setting them to NULL.
Each People BO record occupies 50 KB on average, so if you are performing a massive deletion, for instance deleting 100,000 records, this means about 5 GB of data being processed and worked by the triRetire process at that time.
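That sizing arithmetic can be made explicit with a small helper; the 50 KB average is the figure cited above, and decimal units (1 GB = 1,000,000 KB) are assumed:

```python
def retire_volume_gb(record_count, avg_kb=50):
    """Rough volume of data the triRetire process will touch, given the
    ~50 KB average size per People BO record (decimal KB/GB)."""
    return record_count * avg_kb / 1_000_000

volume = retire_volume_gb(100_000)  # 100,000 records -> 5.0 GB
```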
Using triRetire is the only supported process for archiving triPeople BO records.
If there is a need to perform a massive retire process in the system, Data Integrator may not be a good choice. Using Web Services is a better option, and the integration could be enhanced to look at the number of workflows in the queue via "monitor.jsp" (Monitor a single value).
The web service code would parse and check for the numeric value returned from a URL like http
If the value is over a threshold (start with 9000, for example), the integration would pause and wait until the queue is halved (4500, for example).
Note that there is no direct way to call a workflow using Web Services. You cause it to be executed for a given record by performing the action (transition) that the workflow is tied to on that record. For instance, if you have a workflow tied to an 'Activate' action, then using the Web Service to activate the record will cause the required workflows to execute.
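A minimal sketch of that throttling logic is shown below. The monitor.jsp URL and its output format are site-specific, so the fetch is abstracted behind a callable you supply; the 9000 high-water mark and the halving rule follow the suggestion above, and all names here are illustrative, not a documented TRIRIGA API.

```python
import time

def throttled_submit(batches, fetch_queue_depth, submit,
                     high_water=9000, poll_seconds=60):
    """Submit integration batches, pausing whenever the Workflow Event
    queue (the numeric value parsed from monitor.jsp) exceeds high_water,
    and resuming once the queue has roughly halved."""
    for batch in batches:
        depth = fetch_queue_depth()
        if depth > high_water:
            target = depth // 2  # e.g. over 9000 -> wait until ~4500
            while fetch_queue_depth() > target:
                time.sleep(poll_seconds)
        submit(batch)
```

In practice, fetch_queue_depth would wrap an HTTP GET against your monitor URL and parse the numeric value out of the response, while submit would post one Web Services batch of retire transitions.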
More information about IBM TRIRIGA WebServices can be found on our
"You do not have permission to access this page" error message when the requester tries to open a service request record sent back from the approver for clarification
Posted by Fabio L Pinto
If you have a TRIRIGA user who successfully creates a service request record that goes through an approval workflow, but the approver sends it back requesting clarification, the user may receive an error message saying "You do not have permission to access this page. Please contact your TRIRIGA administrator. Thank you.".
This may happen when that user only has "TRIRIGA Request Central" license assigned to him/her.
If this is the case, and the Security Group setup allows it, the user can successfully create the service request record that goes through the approval workflow, and that license allows the Notification action. But if the record is sent back by the approver for clarification, this requires access to the action Item Record Type (WorkFlowActionItem).
Here is the list of IBM TRIRIGA licenses that provide access to the action Item Record Type WorkFlowActionItem:
IBM TRIRIGA Facility Management Enterprise
The solution for that error is to add to the requester user any one of the licenses above, so that the WorkFlowActionItem action can be processed properly, as designed. The rest of security relies on the Security Group setup for that user, so you can still restrict access to any BO or Form as per your business needs and requirements.
Posted by AcdntlPoet
IBM IoT Real-Time Insights – Analytics Designed for the Internet of Things: As the Internet of Things (IoT) expands rapidly, more and more “things” are reporting their properties, location, and status in near real-time. This generates a huge volume and variety of data that is under-utilized…or often not used at all! Enterprises can leverage this data to understand the state of operations and equipment to better run their businesses. The key to achieving that efficiency is to utilize IoT data effectively to drive business decisions and results... [Read more...]
IoT Real-Time Insights integrates IFTTT and Node-RED: Real-time analytics provide insights from streaming IoT data, but the key is taking the appropriate action as a result of those insights, and IoT Real-Time Insights helps you do both. Recently, we made some significant updates to the service that dramatically improve the insights (improved analytics capabilities) and the available actions allowing you to... [Read more...]
IoT Real-Time Insights consumes the data and device information, enriches that data with asset master records and weather data, and applies rules to take action when conditions warrant, enabling you to gain awareness of equipment and operations, make better decisions, improve availability, and respond more quickly to emerging conditions.