Romain_Barth
Is there a way to delete multiple linksets from a link module including all the contained links?
The DOORS UI does not provide that feature but it is possible by using DXL.
Here is a sample that will delete all the linksets and links contained in a specific project:
First, open your link module in Exclusive Edit mode, then run the script.
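The original sample is not reproduced here, but a minimal DXL sketch of the approach might look like the following. This is an illustration only, assuming the standard DXL link-module API (Linkset iteration and delete); verify it against the DOORS DXL Reference Manual before running it against production data.

```
// Minimal sketch (an assumption, not the original sample): delete every
// linkset, and therefore every link it contains, in the current
// link module. The module must be open in Exclusive Edit mode.
// In practice you may need to collect the linksets first and delete
// them after the loop, rather than deleting while iterating.
LinkModule lm = current
Linkset ls
for ls in lm do {
    delete ls    // removes the linkset together with its links
}
save lm
```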
There is a potential issue with the shutdown process for TRIRIGA. This is specific to versions prior to TRIRIGA 3.5.1 running under Windows with Websphere Liberty Profile (WLP). We have found that the SHUTDOWN.BAT file may not end the session. If you encounter this issue you can resolve it by adding the path as described below.
In TRIRIGA 3.5.1, both RUN.BAT and SHUTDOWN.BAT include a step that changes directory to the location of the server.bat file, as shown below.
(Note: In Notepad there are no spaces or carriage returns between the rows, so the file appears to run together, like this:)
@echo offset JAVA
In other viewers it looks like this:
echo JAVA_HOME : %JAVA_HOME%
echo CLASSPATH : %CLASSPATH%
cd /d /hom
server.bat start tririgaServer
cd /d /hom
server.bat stop tririgaServer
However, in previous versions that path is not included:
TRIRIGA 3.4.2 (shutdown.bat)
server.bat stop tririgaServer
TRIRIGA 3.5 (shutdown.bat)
server.bat stop tririgaServer
If you encounter an issue where shutdown.bat is not ending the session, you can resolve it by adding the path to the server.bat file, similar to the example below.
cd /d <your_path_to_tririga>\wlp\bin
server.bat stop tririgaServer
JohnONeill
We often get questions regarding our plans for End of Support (EOS) for TRIRIGA versions. I thought I’d share some general information and a couple of useful links that will help customers and Business Partners with planning their future activities.
We typically announce an EOS date 5 years after the release. Please note that this is not a guarantee, commitment, or promise; it is a guideline that we try to follow. There are sometimes exceptions that require retirement of a version before the originally expected date, due to technology changes outside of our control, security issues, or other reasons. When this happens, we make that announcement as soon as possible.
Please see the TRIRIGA Supported Versions information at the following link for the latest information.
For general convenience here is the link to the IBM Software Support Lifecycle site. Here you can search for similar information for any IBM product.
First off, some background. I have been using and troubleshooting Data Integrator related issues for several years. It's a good tool, and quite handy for importing and updating data in the IBM TRIRIGA application. That being said, Support has received quite a bit of traffic with regard to using this tool and so I thought I would provide my take and open a dialogue for commentary.
The most pervasive issue I have seen overall has been problems with the source sheet. Typically, I start my Data Integrator process by building a spreadsheet with the column headers using the TRIRIGA application. From the Data Integrator interface, reached from Home > Tools > Data Integrator, you can use the Create Header action to generate a base sheet. It's as easy as selecting the fields you want to use and exporting a sheet to begin working with. You also have the option to simply open Excel and type in the fields you want to use.
A known issue is encountered when copying and pasting from the application into Excel. In fact, this is one of the key points I want to make. Copying data from the TRIRIGA application or from any other tool, into Excel, will almost invariably introduce formatting into the spreadsheet. This is the most common cause of issues with the upload process. HTML formatting information will cause problems with the upload.
The method I have found that helps get around this issue is to use Excel's Paste Values option when pasting data into the spreadsheet. This removes the formatting tags and allows for a clean upload. At times, I have experimented with copying and pasting an entire spreadsheet into a new sheet, again using the Values option, to clean up an upload sheet prior to saving it as a text file. This yields good results and has solved many issues for me.
Another area where I have encountered issues in the past is when trying to make edits to the text file after exporting it from Excel. I strongly recommend that this NOT be done. If any edits are warranted, please make the changes in the spreadsheet and re-export the text file. In fact, I would recommend deleting the original text file and doing a fresh export each time any edits are made. This eliminates the possibility of bringing in bad data, or merging unexpected edits.
Source data can introduce issues as well. Flat file outputs from third-party software may not contain all of the data needed for either upload or update. Missing column data can cause missing rows or mismatched data post-upload. Also, checking and verifying the input data can be tedious. If there is any doubt about the source data, I would recommend using one of the other integration methods, where you can engage workflow to trap erroneous or missing data.
Because Data Integrator uses a flat file transfer methodology, there is very little in the way of error trapping involved, and there are no real trigger points for validation workflows to check the data prior to saving the data. If your source data is in doubt in any way, I have to recommend using Business Connect or Data Connect to bring your data in.
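Since Data Integrator itself does little error trapping, it can help to sanity-check an export before uploading it. The sketch below is a hypothetical pre-upload check, not part of TRIRIGA; it simply looks for the two problems described above, stray HTML formatting and inconsistent column counts, in a tab-delimited file.

```python
# Hypothetical pre-upload check for a tab-delimited export.
# Not a TRIRIGA tool; just a sketch of the validation described above.
import csv
import re

def check_export(path):
    """Return a list of problems found in a tab-delimited export file."""
    problems = []
    html_tag = re.compile(r"<[^>]+>")
    with open(path, newline="", encoding="utf-8") as fh:
        rows = list(csv.reader(fh, delimiter="\t"))
    if not rows:
        return ["file is empty"]
    width = len(rows[0])  # the header row defines the expected column count
    for lineno, row in enumerate(rows[1:], start=2):
        if len(row) != width:
            problems.append(f"line {lineno}: expected {width} columns, got {len(row)}")
        for value in row:
            if html_tag.search(value):
                problems.append(f"line {lineno}: HTML formatting found: {value!r}")
    return problems
```

An empty result means the file passed both checks; anything it returns is worth fixing in the spreadsheet and re-exporting, as recommended above.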
I welcome any comments or suggestions for extending this post, and Happy Integrating!
SUPPORT NOTIFICATION for non-browser TRIRIGA clients such as CAD Integrator, BIM, and Microsoft Outlook add-in
JeffLong
IBM TRIRIGA does not support SAML (Security Assertion Markup Language) or credential-less login mechanisms such as SmartCard or CAC (Common Access Card) as a method of authentication for its non-browser clients such as CAD Integrator, BIM, and the Microsoft Outlook add-in.
SSO solutions need to provide a mechanism for basic authentication as per the documentation in the "Requirements for single sign-on requests in the TRIRIGA Application Platform" for non-browser clients. SAML and SmartCard or CAC do not support basic authentication for non-browser based clients.
The best practice if using SAML or SmartCard/CAC is to authenticate directly to TRIRIGA on a separate process server or integration server, as opposed to the SSO-enabled application server. (NOTE: These users will need to know their TRIRIGA user name and password to sign in using this solution.)
An alternative best practice would be to set up a separate non-SAML SSO solution for non-browser client users which can support basic or NTLM authentication. (NOTE: SmartCard/CAC users would need to know their SmartCard/CAC user name and password to sign in using this solution.)
Many times, a client may hear a support engineer say that they should upgrade to the latest version. Why do I keep hearing that, especially if upgrading will take time, money, and resources? TRIRIGA, like all other software, evolves. We continuously fix defects and add new functionality. For instance, if you are on version 3.3.2, some of the features you cannot take advantage of include improved logging capabilities, which make it easier for us to help you troubleshoot an issue. Also included are improved security and, more recently, workflow versioning. Since complex software can sometimes have defects, you never know when a defect might impact your business. But why wait for it to impact you? Upgrade so that it won’t happen. New functionality is also added to the software, and you may want to take advantage of it.
Another reason to upgrade is that software uses various technologies, which change faster than New England weather! Technology is constantly changing and TRIRIGA must keep up in order to keep running. That technology can be in operating system updates, browser versions, application servers and Java, to name a few. Support may often recommend staying current with product releases, but it is always good to review the release notes for the current release. The release notes for each version show what has been fixed and what functionality has been added. You can find the release notes for TRIRIGA here:
Before upgrading, it is important to understand the structure of TRIRIGA because there are 2 very different procedures to upgrade. Those 2 structures are Platform and Application. The TRIRIGA Platform is the Java code that IBM writes and is installed on a server. The TRIRIGA Applications are developed using the TRIRIGA Platform. Applications are not “written” but are developed using the Platform as a development tool. Applications are stored in the database as metadata and are not found in the TRIRIGA directory structure. Both the Platform and the Application have their own version number. How do I tell which number is which? Platform versions are the lower of the 2 numbers associated with a TRIRIGA install, so the Platform version would be something like 3.5.1 or 3.4.2. Applications are the larger numbers associated with a TRIRIGA install, so the Application version would be something like 10.5.1 or 10.4.2. TRIRIGA 3.5.1/10.5.1 is Platform 3.5.1 and Application 10.5.1. But this can change. If IBM releases a 4.1.0/11.1.0 release, a client can upgrade to Platform 4.1.0 and leave the application at 10.5.1. But you can never upgrade the Application beyond the platform it was built on. So using the example of 4.1.0/11.1.0, you could not upgrade the application to 11.1.0 and still be on platform 3.5.1, because the functionality required to support the new Application exists only in the new Platform.
The TRIRIGA Platform will always be there, as it is required to run the database. The Platform never deprecates functionality, so if an application is developed on one release of the Platform, it will continue to function in future releases of the Platform. Ever since the 3.2 release, you can easily upgrade the Platform in under 2 hours. You simply stop the servers, install the new platform code, start one server to perform the database updates and, when that is complete, stop the server, apply the latest fixpack, start one server and then start the rest. That’s it! The Platform will be upgraded. Security vulnerabilities, new technology, performance enhancements, new properties and more are ready to use. You should plan to perform a Platform upgrade at least once a year.
Applications can be substantially more complex. If you have never done one, I would strongly recommend that you consider engaging IBM Global Business Services (GBS) or one of the IBM TRIRIGA certified business partners to help you through an application upgrade. Since the applications are actually data in the database, an upgrade involves updating data, which is always a tricky task. On top of that, clients have the ability to configure and modify functionality associated with an application. A wrong step could overwrite data and damage functionality. To add to it, application upgrades must be done one version at a time. If you are on 10.3.1 and are going to 10.5.1, then you will need to upgrade to 10.3.2, then 10.4.0, 10.4.1, 10.4.2, 10.5.0 and finally 10.5.1. We strongly recommend that you plan for an Application upgrade at least once every two years to minimize the number of versions between platform and application. In addition, audited functionality may require Application upgrades. Customers who use the Lease functionality in TRIRIGA will know the lease accounting rules issued by FASB. Clients who need to be compliant with FASB rules will need to be on the latest Application release.
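The one-version-at-a-time rule above can be sketched in a few lines of code. This is purely illustrative: the release list and the function are hypothetical, not an IBM tool, but they show how the intermediate steps fall out of an ordered list of Application releases.

```python
# Illustrative sketch of the "one version at a time" Application
# upgrade rule. The release list is an example, not an official list.
RELEASES = ["10.3.1", "10.3.2", "10.4.0", "10.4.1", "10.4.2", "10.5.0", "10.5.1"]

def upgrade_path(current, target, releases=RELEASES):
    """Return the ordered list of Application versions to step through."""
    start, end = releases.index(current), releases.index(target)
    if start >= end:
        raise ValueError("target must be a later release than current")
    # Every release between current (exclusive) and target (inclusive)
    # must be applied, in order.
    return releases[start + 1 : end + 1]
```

For example, `upgrade_path("10.3.1", "10.5.1")` yields the six upgrade steps described above, ending at 10.5.1.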
Fixpacks are important too! If a defect is found in a release, we will identify the defect as an APAR and develop a fix that can be applied to the installed software. It is recommended that customers set aside time and resources once a quarter to apply any fixpacks.
When a support engineer recommends upgrading to the latest version, the first thing to do is plan! Even though a Platform upgrade may not take a lot of time, you should still put in the proper planning. Here are some things to ask yourself when planning your upgrade:
ScottRuch
When starting as a new user or working with a new copy of the application, there are some dependencies within the application that need to be satisfied if you want to do anything more than create a user record. In short, a newly created application has a good bit of data in it, but you will need to add more to begin using all aspects of it.
Do you want to add a Location? You'll need a Geography to assign to it.
Do you want to add an Organization? You'll need a Primary Location.
Do you want to add People? You'll want a Primary Location and Organization.
The best place to start is in the Portfolio menu. Geographies, Locations, Organizations, and People form the basic building blocks for adding record data.
1 - Geography
For Geographies, it's helpful to add at least two complete branches, one for North America and one for Europe. A basic scenario would begin with World Regions for North America and Europe and continue to the City level for each branch. This allows for flexibility in scenarios involving multiple time zones, moves and other time- and place-based events.
2 - Locations
For Locations, following the North America/Europe theme, it is sufficient to create two complete Location branches, much like Geography. As above, create complete branches beginning with Property and working all the way down to at least one Space record for each branch. This allows for flexibility in Space area calculations and Moves, and can also be used for Requests.
3 - Organization
There is not typically a need for as complex a structure for Organization. It is certainly possible, and some scenarios will likely require it, but for basic testing purposes a single My Company record and a single External Company will be sufficient.
4 - People
As with the Organization structure, the user's needs will dictate what should be created. Two users for the My Company record and two users for the External Company will allow for basic testing. Remember that, unlike the other records mentioned here, People have dependencies on Licenses and Security.
The linked IBM TRIRIGA Quick Start Guide will be of use.
Finally, I hope to open a dialogue on this topic so if you see this post, and you have a question please do not hesitate to ask, and we will provide you answers.
dmmckinn
Highlighting some Rational Publishing Engine (RPE) 2.1.0 video demonstrations that were recorded a few months ago, to bring them back into the spotlight.
The following videos provide a walkthrough of the new features that were made available in RPE version 2.1.0.
The full playlist, which includes all of the following videos, is available in
For further information about other capabilities available in RPE 2.1.0, you may want to check out
dmmckinn
Have you checked out the Internet of Things Continuous Engineering solution Trial?
A Configuration Management (CM) enabled Internet of Things Continuous Engineering solution trial is available for users to evaluate in the cloud.
Go over to the Cloud Trial Area and give it a try. From there you can create your own project and even invite others to join.
A list of products and features that are available to test is included in the "Internet of Things Continuous Engineering solution trials are now available in the cloud" blog post.
Additional information is also available in the FAQ for the Cloud Trial Systems wiki.
AcdntlPoet
Maximo with Watson Analytics: Introduction (Part 1) - Introduces the new Watson Analytics User Interface and features! Provides comparison to previous Watson Analytics Classic features - and focuses on new areas of Data, Discover and Display!
Continue on for the remainder of the series:
Maximo with Watson Analytics: Data (Part 2) - Highlights and demonstrates Data features of Watson Analytics, including data importing, quality and refinement. Part 2 of the demo series exploring new Watson Analytics 2.0 features.
Maximo with Watson Analytics: Discovery (Part 3) - Demonstrates discovery features of Watson Analytics, including natural language and predictive explorations, attribute influencers and individual exploration creation. Part 3 of the demo series exploring new Watson Analytics 2.0 natural language and predictive features.
Maximo with Watson Analytics: Discovery Make your own charts (Part 4) - Continuing with an overview of the Discovery component, this demo recording highlights how you can add your own discovery charts in Watson Analytics.
Maximo with Watson Analytics: Display (Part 5) - Provides overview of the Display component of Watson Analytics by creating a dashboard using the data discoveries of the Reactive Work Order data set.
JeffLong
In order for the Maximo TRIRIGA Integrator to work properly, and to have updates made in TRIRIGA reflected in Maximo, a Maximo System Property must be set. The property name is "mxe
More information on this Maximo system property can be found here: http
To set the system property, you can follow the steps below:
Once this update is complete, test your Maximo TRIRIGA integration to make sure updates now work as they should. If they do not, please contact support.
AcdntlPoet
Introduction to Maximo Work Centers (Part 1): Business Analyst - Starting with the Maximo 184.108.40.206 release, innovative, visual work centers are available, enabling Maximo users to view and act on a variety of Maximo data and actions. This video introduces Abby, a Business Analyst. In her work center, she analyzes Maximo business data through KPIs, Charts and the powerful new integration to Watson Analytics! Created by Pam Denny, IBM Analytics Architect
Continue on to the rest of the series below, or simply play next in the embedded video above:
Maximo Work Center: Business Analyst (Part 2): Exporting data to IBM Watson - This video recording is the 2nd in a series for the Maximo Business Analyst Persona, Abby. This video explores the delivered data sets focusing on critical Maximo business areas. It then details how the data sets can be exported to CSV files, or exported to IBM's Watson Analytics. Once the Maximo data is in Watson, you can utilize the life-changing data quality, refinement, exploration and other components of Watson! Created by Pam Denny, IBM Analytics Architect
Maximo Work Center: Business Analyst (Part 3): Data Quality in IBM Watson - This video recording is the 3rd in a series for the Maximo Business Analyst Persona, Abby. It explores a Maximo data set's quality in Watson, and reviews the 1 to 100 data grading system. It then highlights how individual attributes within the data set are graded, and why data quality is critical to the success of your business. Demo by Pam Denny, IBM Analytics Architect
Maximo Work Center: Business Analyst (Part 4): Data Exploration in IBM Watson - This video recording is the 4th in a series for the Maximo Business Analyst Persona, Abby. In this video, the 'Explore' capability of Watson is demonstrated. Utilizing Watson's Natural Language features, Abby, the Business Analyst, is led through an exploration of her Maximo data sets. A variety of questions are presented to her, from which she can quickly select to view Maximo data visually and dynamically. Abby can modify chart types, add filters and save content for future re-use.
Maximo Work Center: Business Analyst (Part 5): Data Influencers in IBM Watson - In part 5 of this series, the predictive component of Watson is highlighted with the reactive work order data set. Influencers of individual data attributes are identified enabling a business analyst additional data points for her analysis. Demo by Pam Denny, IBM Analytics Architect
Maximo Work Center: Business Analyst (Part 6): Data Assembly in IBM Watson - The last component of the video series highlights the Assembly component in Watson. Using the reactive work order data set exported from Maximo to Watson, this video highlights how you can quickly create visual, dynamic dashboards with new or saved chart content. You can then save and share the data analysis content. Demo by Pam Denny, IBM Analytics Architect
doboski
Have you ever had performance issues with loading data into your location hierarchies? Or making large changes to hierarchical data? Are you reorganizing your company, adding new departments, moving or combining others? Is it taking a long time to process these changes?
When an update is made to the hierarchy, the entire tree is rebuilt. So if you are making multiple updates, the entire tree is rebuilt after each one. If you have a rather large tree with many layers or branches, this can be quite time consuming and frustrating while you wait for it to update. Do not fear! There are some things that can be done to make it less time consuming!
One of the things to look at is in your Admin Console. You would go to Cache Manager and look for System Cache Processing Mode. By default this is set to Normal.
You would want to set this to Data Load Mode and then click on Change Cache Processing. It is worth noting that while in Data Load Mode the tree is not updated, so processing is faster because the tree is not rebuilt after every single change. Once the process is done, the tree can be rebuilt once rather than after every update.
You don’t necessarily have to go into the Console to set that every time you are adding something to a hierarchy or making an update. If you have a workflow that currently processes your hierarchy inserts and/or updates, you can add a custom task to turn on Data Load Mode and then turn it back to Normal after your processing is complete.
To set it to Data Load Mode, in the custom task, you would set the class name to com.
To set it back to normal, in the custom task, you would set the class name to com.
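The pattern those custom tasks implement can be sketched generically. The code below is not the TRIRIGA API; the `CacheManager` class and the rebuild counter are hypothetical stand-ins that just illustrate why batching updates under Data Load Mode saves so many tree rebuilds.

```python
# Generic sketch of the Data Load Mode pattern (hypothetical classes,
# not TRIRIGA code): switch to a bulk-load mode, apply all hierarchy
# updates, then restore Normal mode and rebuild the tree once.
from contextlib import contextmanager

class CacheManager:
    def __init__(self):
        self.mode = "Normal"
        self.rebuilds = 0
    def apply_update(self, update):
        if self.mode == "Normal":
            self.rebuilds += 1  # Normal mode rebuilds the tree per update
    def rebuild(self):
        self.rebuilds += 1

@contextmanager
def data_load_mode(cache):
    cache.mode = "Data Load"
    try:
        yield cache
    finally:
        cache.mode = "Normal"
        cache.rebuild()         # a single rebuild after the whole batch

cache = CacheManager()
with data_load_mode(cache):
    for update in range(1000):
        cache.apply_update(update)
# cache.rebuilds is now 1 instead of 1000
```

The context manager also mirrors the advice above: even if the batch fails partway, the mode is restored to Normal so the system is never left in Data Load Mode.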
For additional information regarding custom tasks, you can reference the following wiki:
AcdntlPoet
IBM is making support content even better, with your help! Our new "Community Discussion" feature is now live on select support content.
dmmckinn
Information regarding planned End of Support (EOS) dates for IBM products is documented in Software Support Lifecycle. The Announcement letters are also published to provide advanced notification of upcoming EOS updates.
EOS information is available in advance of planned dates to provide you with sufficient time to start planning your upgrades to newer product versions. When planning your upgrades, you should review the product versions you currently have installed and check to see if they have reached (or are nearing) their end of support. Below is a list of the most recently announced End of Support dates for some of our Watson Internet of Things products:
Further information about the withdrawal from support dates for the above listed products is available in Software withdrawal and service discontinuance: IBM Middleware, IBM Security, IBM Analytics, IBM Storage Software, and IBM z Systems select products - Some replacements available.
To help you get a better understanding of the stages of the IBM Product Lifecycle, including End of Support, refer to this video: Wha
AcdntlPoet
IBM Service Request Quick Start - This 3 video playlist, beginning with the topic "Site Technical Contact 101" is provided to help you navigate the Service Request tool on IBM.com for opening PMRs electronically. The next two videos on the list are: "Using IBM Service Request on your mobile device" and "Creating reports about software service requests with Service Request (SR)". Start with the first in the series below and follow the prompts at the end to continue watching the rest in the same window:
AcdntlPoet
World of Watson incorporates the kind of information you gained from IBM Insight — the tools to get the best out of your data — and raises the game. You’ll see how Watson’s capabilities give you an unprecedentedly broad view of your business, its competitive landscape and what it takes to make your customers act.
Key topics include:
EXPERIENCE innovation you can touch!
Where else will you see a cognitive dress, use your own cognitive concierge or kick the tires of a Watson self-driving car? Get inspired by case studies and groundbreaking research, then see what Watson can do for your business and its customers.
Register today so you don't miss out: http
Romain_Barth
When DOORS 9 is integrated with Rational Quality Manager, Rational Design Manager or Rational DOORS Next Generation, it uses Discovery Links to link DOORS requirements to other artifacts.
This means that DOORS does not store any of those links; instead, it queries the other tools to determine whether there are links to DOORS requirements. Here are more details about discovery links: http
Baselines are one of the main features in DOORS. Since those OSLC links are not stored in DOORS, what about those links in baselines?
Unfortunately those OSLC links are not displayed in DOORS baseline:
Baselined objects have different URIs than the original object URIs. When creating a baseline in a module, DOORS does not create new links from the baselined objects to the OSLC provider. Even if DOORS created the links, there is no way to make these links immutable.
To summarize, OSLC integration doesn't support baselines.
JeffLong
If you have installed IBM TRIRIGA with WebSphere Liberty Profile, you may find a need in the future to update the Java version you are running in WebSphere Liberty Profile. If you need guidance on how to do so, please refer to the following documentation:
If you need additional assistance please contact support.
AcdntlPoet
IBM Maximo Supervisor Work Center - New Maximo Work Centers are an innovative approach to work management. Learn how the new supervisor work center enables you to visualize work, optimize work in process, focus on flow, and enable continuous improvement in your organization.