Romain_Barth
Is there a way to delete multiple linksets from a link module including all the contained links?
The DOORS UI does not provide that feature but it is possible by using DXL.
Here is a sample that will delete all the linksets and links contained in a specific link module:
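The script below is a sketch of the idea, not the exact sample from the original post; the loop and delete calls are based on the standard DXL link-module API, so verify the names against the DXL Reference Manual for your DOORS version.

```
// DXL sketch: delete every linkset, and the links it contains,
// from the current link module.
// Assumes the link module is already open for exclusive edit.
LinkModule lm = current
Linkset ls
for ls in lm do {
    // delete removes the linkset together with all the links inside it.
    // If your DOORS version objects to deleting while iterating, collect
    // the linkset source/target pairs first and delete them afterwards.
    delete ls
}
save lm
```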
First, open your link module in Exclusive Edit mode, then run the script above.
There is a potential issue with the shutdown process for TRIRIGA. This is specific to versions prior to TRIRIGA 3.5.1 running under Windows with Websphere Liberty Profile (WLP). We have found that the SHUTDOWN.BAT file may not end the session. If you encounter this issue you can resolve it by adding the path as described below.
In TRIRIGA 3.5.1, both RUN.BAT and SHUTDOWN.BAT include a step that changes directory to the location of the server.bat file, as shown below.
(Note: In Notepad the rows run together, with no spaces or carriage returns between them.)
@echo offset JAVA
In other viewers, it looks like this:
echo JAVA_HOME : %JAVA_HOME%
echo CLASSPATH : %CLASSPATH%
cd /d /hom
server.bat start tririgaServer
cd /d /hom
server.bat stop tririgaServer
However, in previous versions that path is not included:
TRIRIGA 3.4.2 (shutdown.bat)
server.bat stop tririgaServer
TRIRIGA 3.5 (shutdown.bat)
server.bat stop tririgaServer
If shutdown.bat is not working, you can resolve the issue by adding the path to server.bat, similar to the example below.
cd /d your_path_to_tririga\wlp\bin
server.bat stop tririgaServer
JohnONeill
We often get questions regarding our plans for End of Support (EOS) for TRIRIGA versions. I thought I’d share some general information and a couple of useful links that will help customers and Business Partners with planning their future activities.
We typically announce an EOS date 5 years after the release. Please note that this is not a guarantee, commitment, or promise; it is a guideline that we try to follow. There are sometimes exceptions that require retirement of a version before the originally expected date. This can occur due to technology changes outside of our control, security issues, or other reasons. When this happens, we make the announcement as soon as possible.
Please see the TRIRIGA Supported Versions page at the following link for the latest information.
For convenience, here is the link to the IBM Software Support Lifecycle site, where you can search for similar information for any IBM product.
First off, some background. I have been using and troubleshooting Data Integrator related issues for several years. It's a good tool, and quite handy for importing and updating data in the IBM TRIRIGA application. That being said, Support has received quite a bit of traffic with regard to using this tool and so I thought I would provide my take and open a dialogue for commentary.
The most pervasive issue I have seen overall has been problems with the source sheet. Typically, I start my Data Integrator process by building a spreadsheet with column headers generated by the TRIRIGA application. From the Data Integrator interface, reached from Home > Tools > Data Integrator, the Create Header action generates a base sheet: it's as easy as selecting the fields you want to use and exporting a sheet to begin working with. You also have the option to simply open Excel and type in the fields you want to use.
A known issue is encountered when copying and pasting from the application into Excel. In fact, this is one of the key points I want to make: copying data from the TRIRIGA application, or from any other tool, into Excel will almost invariably introduce formatting into the spreadsheet. This is the most common cause of issues with the upload process; HTML formatting information will break the upload.
The method I have found for getting around this restriction is to use the Copy/Paste VALUES option when pasting data into the spreadsheet. This strips the formatting tags and allows for a clean upload. At times, I have experimented with copying and pasting an entire spreadsheet into a new sheet, again using the VALUES option, to clean up an upload sheet prior to saving it as a text file. This yields good results and has solved many issues for me.
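If you want to double-check an exported tab-delimited file for leftover HTML markup before uploading, a small script can scan it. This is an illustrative sketch, not part of Data Integrator; point the path at your real export instead of the throwaway file created here.

```python
import os
import re
import tempfile

# Matches HTML-like tags (<span ...>, </b>) and entities (&nbsp;),
# the kind of formatting residue that commonly breaks uploads.
HTML_PATTERN = re.compile(r"</?[a-zA-Z][^>]*>|&[a-zA-Z]+;")

def find_html_markup(path, encoding="utf-8"):
    """Return (line_number, line) pairs that contain HTML-like markup."""
    hits = []
    with open(path, encoding=encoding) as f:
        for lineno, line in enumerate(f, start=1):
            if HTML_PATTERN.search(line):
                hits.append((lineno, line.rstrip("\n")))
    return hits

# Demonstration with a throwaway file; use your exported text file instead.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("Name\tDescription\n")
    f.write("Room 101\t<span>Corner office</span>\n")
    f.write("Room 102\tOpen plan\n")
    path = f.name

bad_rows = find_html_markup(path)
os.remove(path)
print(bad_rows)  # the <span> row is flagged
```

Running the check before every upload is cheap insurance: any row it flags is one that would likely have failed, or imported garbled data, during the Data Integrator run.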
Another area where I have encountered issues in the past is when trying to make edits to the text file after exporting it from Excel. I strongly recommend that this NOT be done. If any edits are warranted, please make the changes in the spreadsheet and re-export the text file. In fact, I would recommend deleting the original text file and doing a fresh export each time any edits are made. This eliminates the possibility of bringing in bad data, or merging unexpected edits.
Source data can introduce issues as well. Flat file outputs from third-party software may not contain all of the data needed for either upload or update. Missing column data can cause missing rows or mismatched data post-upload. Also, checking and verifying the input data can be tedious. If there is any doubt about the source data, I would recommend using one of the other integration methods, where you can engage workflow to trap erroneous or missing data.
Because Data Integrator uses a flat file transfer methodology, there is very little in the way of error trapping involved, and there are no real trigger points for validation workflows to check the data prior to saving the data. If your source data is in doubt in any way, I have to recommend using Business Connect or Data Connect to bring your data in.
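Since Data Integrator offers no workflow trigger points for validation, even a minimal pre-upload sanity check helps. The sketch below (illustrative Python, not a TRIRIGA API) confirms that every row of a tab-delimited file has the same number of columns as the header:

```python
import csv
import io

def check_column_counts(text, delimiter="\t"):
    """Return the row numbers whose field count differs from the header's.

    `text` is the contents of an exported tab-delimited upload file;
    row numbers are 1-based, with row 1 being the header.
    """
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    if not rows:
        return []
    expected = len(rows[0])  # the header defines the expected width
    return [i for i, row in enumerate(rows[1:], start=2)
            if len(row) != expected]

# Hypothetical three-column export where the last row is short one field.
sample = "Name\tArea\tFloor\nRoom 101\t200\t1\nRoom 102\t150\n"
print(check_column_counts(sample))  # row 3 is short one column
```

A check like this catches the "missing column data" problem mentioned above before the file ever reaches TRIRIGA, which is much cheaper than untangling mismatched records after the upload.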
I welcome any comments or suggestions for extending this post, and Happy Integrating!
SUPPORT NOTIFICATION for non-browser TRIRIGA clients such as CAD Integrator, BIM, and Microsoft Outlook add-in
JeffLong
IBM TRIRIGA does not support SAML (Security Assertion Markup Language) or credential-less login mechanisms such as SmartCard or CAC (Common Access Card) as a method of authentication for its non-browser clients such as CAD Integrator, BIM, and the Microsoft Outlook add-in.
SSO solutions need to provide a mechanism for basic authentication as per the documentation in the "Requirements for single sign-on requests in the TRIRIGA Application Platform" for non-browser clients. SAML and SmartCard or CAC do not support basic authentication for non-browser based clients.
The best practice, if using SAML or SmartCard/CAC, is to authenticate directly to TRIRIGA on a separate process server or integration server, as opposed to the SSO-enabled application server. (NOTE: These users will need to know their TRIRIGA user name and password to sign in using this solution.)
An alternative best practice would be to set up a separate non-SAML SSO solution for non-browser client users which can support basic or NTLM authentication. (NOTE: SmartCard/CAC users would need to know their SmartCard/CAC user name and password to sign in using this solution.)
Many times, a client may hear a support engineer say that they should upgrade to the latest version. Why do I keep hearing that, especially if upgrading will take time, money, and resources? TRIRIGA, like all other software, evolves: we continuously fix defects and add new functionality. For instance, if you are on version 3.3.2, the features you cannot take advantage of include improved logging capabilities (which make it easier for us to help you troubleshoot an issue), improved security, and, more recently, workflow versioning. Complex software can sometimes have defects, and you never know when a defect might impact your business. Why wait for it to impact you? Upgrade so that it won't. New functionality is also added to the software, and you may want to take advantage of it.
Another reason to upgrade is that software uses various technologies, which change faster than New England weather! Technology is constantly changing, and TRIRIGA must keep up in order to keep running. That technology can be operating system updates, browser versions, application servers, and Java, to name a few. Support may often recommend staying current with product releases, but it is always good to review the release notes for the current release. The release notes for each version show what has been fixed and what functionality has been added. You can find the release notes for TRIRIGA here:
Before upgrading, it is important to understand the structure of TRIRIGA, because there are two very different upgrade procedures. The two pieces are the Platform and the Applications. The TRIRIGA Platform is the Java code that IBM writes and that is installed on a server. The TRIRIGA Applications are developed using the Platform as a development tool; they are not "written" in code but are stored in the database as metadata, and are not found in the TRIRIGA directory structure.

Both the Platform and the Applications have their own version number. How do you tell which number is which? The Platform version is the lower of the two numbers associated with a TRIRIGA install, for example 3.5.1 or 3.4.2. The Application version is the larger number, for example 10.5.1 or 10.4.2. So TRIRIGA 3.5.1/10.5.1 is Platform 3.5.1 and Application 10.5.1. But this pairing can change: if IBM releases a 4.1.0/11.1.0 release, a client can upgrade to Platform 4.1.0 and leave the Application at 10.5.1. You can never upgrade the Application beyond the Platform it was built on, however. Using the 4.1.0/11.1.0 example, you could not upgrade the Application to 11.1.0 and remain on Platform 3.5.1, because the functionality required to support the new Application exists only in the new Platform.
The TRIRIGA Platform will always be there, as it is required for the database to run. The Platform never deprecates functionality, so if an application is developed on one release of the Platform, it will continue to function in future releases of the Platform. Ever since the 3.2 release, you can easily upgrade the Platform in under two hours: stop the servers, install the new platform code, start one server to perform the database updates, and when it completes, stop that server, apply the latest fixpack, start one server, and then start the rest. That's it! The Platform is upgraded: security fixes, new technology, performance enhancements, new properties, and more are ready to use. You should plan to perform a Platform upgrade at least once a year.
Applications can be substantially more complex. If you have never done one, I would strongly recommend engaging IBM Global Business Services (GBS) or one of the IBM TRIRIGA certified business partners to help you through an application upgrade. Since the applications are actually data in the database, an upgrade involves updating data, which is always a tricky task. On top of that, clients have the ability to configure and modify functionality associated with an application, and a wrong step could overwrite data and damage functionality. To add to it, application upgrades must be done one version at a time. If you are on 10.3.1 and are going to 10.5.1, you will need to upgrade to 10.3.2, then 10.4.0, 10.4.1, 10.4.2, 10.5.0, and finally 10.5.1. We strongly recommend that you plan for an Application upgrade at least once every two years to minimize the number of versions between Platform and Application. In addition, audited functionality may require Application upgrades. Customers who use the Lease functionality in TRIRIGA will know the FASB accounting rules around leases; clients who need to be compliant with FASB rules will need to be on the latest Application release.
Fixpacks are important too! If a defect is found in a release, we identify it as an APAR and develop a fix that can be applied to the installed software. We recommend that customers set aside time and resources once a quarter to apply any fixpacks.
When a support engineer recommends upgrading to the latest version, the first thing to do is plan! Even though a Platform upgrade may not take a lot of time, you should still put in the proper planning. Here are some things to ask yourself when planning your upgrade:
ScottRuch
When starting as a new user or working with a new copy of the application, there are some dependencies within the application that need to be satisfied if you want to do anything more than create a user record. In short, a newly created application has a good bit of data in it, but you will need to add more to begin using all aspects of it.
Do you want to add a Location? You'll need a Geography to assign to it.
Do you want to add an Organization? You'll need a Primary Location.
Do you want to add People? You'll want a Primary Location and Organization.
The best place to start is in the Portfolio menu. Geographies, Locations, Organizations, and People form the basic building blocks for adding record data.
1 - Geography
For Geographies, it's helpful to add at least two complete branches, one in North America and one in Europe. A basic scenario would begin with World Regions for North America and Europe and continue down to the City level for each branch. This allows for flexibility in scenarios involving multiple time zones, moves, and other time- and place-based events.
2 - Locations
For Locations, following the North America/Europe theme, it is sufficient to create two complete Location branches, much like Geography. As above, create complete branches beginning with Property and working all the way down to at least one Space record for each branch. This allows for flexibility in Space area calculations and Moves, and can also be used for Requests.
3 - Organization
There is not typically a need for as complex a structure for Organization. It is certainly possible, and some scenarios will likely require it, but for basic testing purposes a single My Company record and a single External Company record will be sufficient.
4 - People
As with the Organization structure, the user's needs will dictate what should be created. Two users for the My Company record and two users for the External Company record will allow for basic testing. Remember that, unlike the other records mentioned here, People have dependencies on Licenses and Security.
The linked IBM TRIRIGA Quick Start Guide will be of use.
Finally, I hope to open a dialogue on this topic, so if you see this post and have a question, please do not hesitate to ask, and we will provide answers.
dmmckinn
Highlighting some Rational Publishing Engine (RPE) 2.1.0 video demonstrations that were recorded a few months ago, to bring them back into the spotlight.
The following videos provide a walkthrough of the new features made available in RPE version 2.1.0.
The full playlist, which includes all of the following videos, is available in
For further information about other capabilities available in RPE 2.1.0, you may want to check out
dmmckinn
Have you checked out the Internet of Things Continuous Engineering solution Trial?
A Configuration Management (CM) enabled Internet of Things Continuous Engineering solution trial is available for users to evaluate in the cloud.
Go over to the Cloud Trial Area and give it a try. From there you can create your own project and even invite others to join.
A list of products and features that are available to test is included in the "Internet of Things Continuous Engineering solution trials are now available in the cloud" blog post.
Additional information is also available in the FAQ for the Cloud Trial Systems wiki.
AcdntlPoet
Maximo with Watson Analytics: Introduction (Part 1) - Introduces the new Watson Analytics user interface and features, provides a comparison to the previous Watson Analytics Classic features, and focuses on the new Data, Discover, and Display areas.
Continue on for the remainder of the series:
Maximo with Watson Analytics: Data (Part 2) - Highlights and demonstrates Data features of Watson Analytics, including data importing, quality and refinement. Part 2 of the demo series exploring new Watson Analytics 2.0 features.
Maximo with Watson Analytics: Discovery (Part 3) - Demonstrates discovery features of Watson Analytics, including natural language and predictive explorations, attribute influencers, and individual exploration creation. Part 3 of the demo series exploring new Watson Analytics 2.0 natural language and predictive features.
Maximo with Watson Analytics: Discovery Make your own charts (Part 4) - Continuing with an overview of the Discovery component, this demo recording highlights how you can add your own discovery charts in Watson Analytics.
Maximo with Watson Analytics: Display (Part 5) - Provides overview of the Display component of Watson Analytics by creating a dashboard using the data discoveries of the Reactive Work Order data set.