IBM Rational Quality Manager
is collaborative, web-based quality management software for comprehensive test
planning and test asset management throughout the software lifecycle.
Built on the Jazz™ platform, it is intended for test teams of all sizes and supports a variety
of user roles, such as test manager, test architect, test lead, tester, and lab
manager, as well as roles outside the test organization.
Rational Quality Manager enables you to manage and run automated test scripts
created with other test tools.
The test scripts that you create are references to the actual tests created in the
other testing tools.
This blog covers Worksoft Certify as an example. You
will see how to get the tools integrated and how they work with Rational
Quality Manager to help you better manage and understand the status of your
tests. Worksoft Certify is an automated functional testing solution for SAP lifecycle management and
cross-platform business process validation.
Worksoft Certify eliminates the custom
coding and programming that most legacy test automation
products require, making it fast and easy to implement and maintain. Using an
object-driven approach rather than generating scripts or code, Worksoft Certify
validates business process workflows using a data model of fields, screens, and
transactions, making it easy to keep pace with dynamic changes.
Steps to change the polling interval
There are two options for changing the polling interval:
· Option 1 – While the adapter is NOT running, change the "sleepTime" value in config.ini, which configures the polling interval. Then run the adapter using "Start.cmd".
· Option 2 – Change the polling interval in the adapter UI.
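Option 1 can be sketched in code. This is a minimal illustration only: it assumes config.ini uses simple key=value entries and that sleepTime is a numeric millisecond value, which may differ in the actual adapter.

```java
import java.io.StringReader;
import java.util.Properties;

public class AdapterConfig {
    // Parse the adapter's config.ini (key=value format assumed) and return
    // the polling interval, falling back to a default if the key is absent.
    static long pollingIntervalMs(String configText, long defaultMs) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader(configText));
        return Long.parseLong(props.getProperty("sleepTime", Long.toString(defaultMs)));
    }

    public static void main(String[] args) throws Exception {
        // In the real adapter this text would be read from config.ini on disk.
        String ini = "sleepTime=5000\n";
        System.out.println(pollingIntervalMs(ini, 10000)); // prints 5000
    }
}
```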
Below are methodologies that have proven highly successful in RFT automation projects.
Steps in designing the architecture
First step: Organize the code into a layered architecture per the ITCL architecture.
1. First tier – AppObjects: all of the application's objects reside in the AppObject scripts, and all of the finders live in this package/script.
2. Second tier – Tasks: methods built around the AppObjects to perform a task, e.g., logging into an application.
3. Third tier – Test cases: the scripts that actually verify and test the application. Test cases are abstracted from the AppObjects, which makes this architecture robust.
Here, the test cases talk to tasks, and tasks in turn talk to AppObjects.
Advantages of this architecture:
1. Highly organized code.
2. Ease of debugging.
3. If an application object changes, the tester does not need to update every test script that uses it; modifying the affected AppObject updates all the test cases automatically, because the test cases are abstracted from the AppObjects.
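The three tiers can be sketched as plain Java classes. All names here (LoginPage, LoginTask, LoginTest) are illustrative, not part of ITCL or the RFT API; the "finders" are stand-in locator strings.

```java
// Tier 1: AppObjects -- holds the finders for the application's objects.
class LoginPage {
    String userField()     { return "id=username"; }  // locator, stand-in for an RFT finder
    String passwordField() { return "id=password"; }
    String loginButton()   { return "id=login"; }
}

// Tier 2: Tasks -- methods built around AppObjects to perform a business task.
class LoginTask {
    private final LoginPage page = new LoginPage();
    String login(String user, String pass) {
        // A real task would drive the UI; here we just report what would be done.
        return "typed " + user + " into " + page.userField()
             + ", clicked " + page.loginButton();
    }
}

// Tier 3: Test cases -- talk only to tasks, never to AppObjects directly.
public class LoginTest {
    public static void main(String[] args) {
        System.out.println(new LoginTask().login("alice", "secret"));
    }
}
```

Because LoginTest never touches LoginPage directly, a locator change only requires editing the AppObject tier, which is the maintenance advantage described above.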
Second step: For Web/Dojo applications, use BLUE and the techniques below.
If the application is a Dojo application, you can further use the BLUE framework on top of ITCL; BLUE contains very robust methods for Dojo.
For Web applications, if RFT does not recognize the objects beneath the layers/pages, you can use the IE developer tools to find the unique properties of the object and feed them to RFT's dynamic find.
You can make the code browser-independent by making the document page, rather than the browser, the top parent; all of the underlying code then has the document page as its top parent.
Third Step: Scripted Verifications
Use scripted verifications instead of recorded verifications to verify the state of an application; this makes the automation suite very robust and resilient to application changes. For instance, a verify_table method fetches the contents of a table/grid during the first run and writes it to an Excel file in an expected-results baseline. During subsequent runs, it again fetches the contents of the table/grid and writes them to an actual-results Excel file. The method then compares the two files and logs the result.
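A minimal sketch of this baseline-compare idea, using in-memory rows rather than Excel files; the real verify_table's file handling and grid fetching are omitted, and the method name is only illustrative.

```java
import java.util.List;

public class TableVerifier {
    // First run: the fetched table contents become the expected baseline.
    // Later runs: fetch again and compare against the baseline, logging the result.
    static boolean verifyTable(List<String> expectedRows, List<String> actualRows) {
        if (!expectedRows.equals(actualRows)) {
            System.out.println("FAIL: table contents differ from baseline");
            return false;
        }
        System.out.println("PASS: table matches baseline");
        return true;
    }

    public static void main(String[] args) {
        List<String> baseline = List.of("id,name", "1,alice");
        System.out.println(verifyTable(baseline, List.of("id,name", "1,alice"))); // true
        System.out.println(verifyTable(baseline, List.of("id,name", "1,bob")));   // false
    }
}
```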
Fourth Step: Appobject Unit Tests
Keeping AppObject unit tests in place builds confidence in the automation code. Once the build is ready, the AppObject unit tests are run to ensure all the application objects are recognized; if object properties have changed, the tester updates the recognition properties. This further eases the debugging effort, because you can then be sure that test cases fail due to a bug, defect, or functionality change, not due to object recognition.
Fifth Step: Multi Client Test Script Maintenance
Using a property file can enable a single test script to run on multiple clients. This is done by defining a property in the properties file that causes the objects specific to the named client type to be instantiated.
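A sketch of this selection mechanism: the clientType key and the client classes below are assumed names for illustration, not from any real project.

```java
import java.io.StringReader;
import java.util.Properties;

// A "clientType" property (name assumed) decides which client-specific
// AppObjects get instantiated, so one test script serves multiple clients.
interface Client { String name(); }
class WebClient implements Client { public String name() { return "web"; } }
class DesktopClient implements Client { public String name() { return "desktop"; } }

public class ClientFactory {
    static Client fromProperties(String propsText) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader(propsText));
        String type = props.getProperty("clientType", "web");
        return type.equals("desktop") ? new DesktopClient() : new WebClient();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fromProperties("clientType=desktop").name()); // desktop
    }
}
```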
IBM Rational Functional Tester
(RFT) is a solution for testing Java, Web, Eclipse, Flex, Dojo, Ajax, SAP, Adobe, and Microsoft
Visual Studio .NET WinForms-based applications, among others. Support is also available
for testing 3270/5250 terminal-based applications and Siebel 7.7.
Before actually starting to automate, please go through the above link to check whether your environment is supported by RFT; otherwise RFT will not run. Rational
Functional Tester plugs into IBM's open-source Integrated Development
Environment (IDE), Eclipse. By embracing IBM's IDE, Rational Functional
Tester sits alongside other development tools created by IBM Rational and
other vendors, allowing easy tool integration into a common interface. RFT is
the QSE (Quality Software Engineering) recommended tool for GUI automation.
Scripting with RFT
RFT allows you to program in standard Java or in Visual Basic .NET. The fact
that RFT uses these programming languages provides two advantages.
One advantage is that the languages are standard, so the learning curve is smaller
than if the script developer had to learn both the tool and the language.
Another advantage is the flexibility the two languages offer: programmers have
the choice of familiar languages, so they can write programs in the one at
which they are most adept or which they feel best suits their needs.
GUI test automation tools
also feature libraries of functions useful for testing, such
as click, select, or verify. In RFT, you can add to this library
("wrapping" several lines of code to perform a single operation
is one useful technique) to provide functions useful to all RFT programmers.
When developers change properties of UI elements, existing scripts potentially
can fail. How do you maintain your scripts if developers keep changing
properties that identify the user interface elements, such as the position of
the object or its name? This is an inevitable part of development, but how can
your scripts keep up?
This is one of the most
important issues to think about when selecting a tool and creating your
scripts, because if your scripts take too much effort to maintain, they cease
to be a cost-effective and efficient solution for testing.
RFT uses two technologies to
address this problem.
The core technology is the RFT Object Map feature.
RFT Object Map feature is enhanced by ScriptAssure.
The RFT Object Map stores
information about GUI objects and their properties during test development.
This information is used to find GUI objects during test execution. Some of the
properties that identify an object include color, size, position, state (such
as checked and unchecked), text label, and logical name (the name you assign to
the object). Object maps are often shared across multiple tests.
The purpose of ScriptAssure is
to eliminate the need to update the script when the objects in the user
interface change. ScriptAssure accomplishes this by allowing you to weight
the different properties used to identify a UI element. You determine the
most important characteristics for identifying the object. When one property
changes, ScriptAssure can still identify the object based on the other
properties. No single change to any object prevents an RFT script from running.
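The weighting idea can be illustrated with a small sketch. The weights, threshold, and scoring formula here are purely illustrative; they are not RFT's actual ScriptAssure algorithm or its default values.

```java
import java.util.Map;

public class WeightedMatcher {
    // Each recognition property carries a weight; a candidate object is
    // accepted if the weighted score of its matching properties clears a
    // percentage threshold, so one changed property need not break recognition.
    static boolean matches(Map<String, String> recorded,
                           Map<String, String> candidate,
                           Map<String, Integer> weights,
                           int thresholdPercent) {
        int score = 0, total = 0;
        for (Map.Entry<String, Integer> w : weights.entrySet()) {
            total += w.getValue();
            if (candidate.getOrDefault(w.getKey(), "").equals(recorded.get(w.getKey()))) {
                score += w.getValue();
            }
        }
        return total > 0 && score * 100 / total >= thresholdPercent;
    }

    public static void main(String[] args) {
        Map<String, String> recorded = Map.of("name", "okButton", "label", "OK", "class", "Button");
        Map<String, String> renamed  = Map.of("name", "confirmButton", "label", "OK", "class", "Button");
        Map<String, Integer> weights = Map.of("name", 50, "label", 30, "class", 20);
        // The developer renamed the button, but label + class still score 50/100.
        System.out.println(matches(recorded, renamed, weights, 50)); // true
    }
}
```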
Another way that
RFT reduces script maintenance is with the Object Map update
tool. The tool enables you to globally update a centralized object map.
Rational Quality Manager has the following characteristics:
· It is based on the Jazz platform and inherits many characteristics from that platform.
· It is a collaborative, web-based tool.
· It offers comprehensive test planning, test construction, and test artifact management functions throughout the software development lifecycle.
· It is designed to be used by test teams and supports a variety of user roles, such as test manager, test architect, test lead, tester, and lab manager.
You can download CLM 4.0 and try it for 60 days from the jazz.net link given below:
Once you have downloaded it, you can follow the link for installing, configuring, and deploying CLM 2012 from the following CLM 4.0 help link: https://jazz.net/help-dev/clm/index.jsp?re=1&scope=null
Have a look at what's new in CLM 2012 by downloading the PDF from the following link. It includes a rich UI, improved features, manual script recording with the RFT adapter, and more.
The following article shows how to integrate custom tools that do not provide out-of-the-box integration with RQM, using Command Line Adapters.
For all CLM 4.0-related queries, please join the Jazz forum by registering here: https://jazz.net/pub/user/RegisterEntry.action Once your registration is complete, log in to jazz.net and go to the forum at https://jazz.net/forum. Ask questions, get answers, share your thoughts and ideas, and become an expert.
Below are the most common causes of errors observed with Rational® Performance Tester, during both recording and playback.
Browser and System Port(s)
Cause 1: The browser could already be in use during recording.
Additional details: Rational® Performance Tester cannot record scripts by starting a parallel browser session, because RPT needs to create a proxy server port before it can start listening to the traffic exchanged between the client and the server.
Remedy: Close all browser sessions before attempting a recording.
Cause 2: Proxy settings in the browser might prevent RPT from controlling the browser.
Additional details: The proxy settings in the browser might hinder RPT from taking control of the browser and proceeding with the recording.
Remedy: Remove all proxy settings in the browser; also disable any automatic configuration scripts referenced in the proxy settings.
However, in scenarios where proxy settings are mandatory, please use the steps described in the technote.
Cause 3: Port 10002 (used by RPT to interact with the in-tool agent controller) might be in use.
Additional details: This port is used by Rational® Performance Tester to interact with the Agent Controller (integrated or remote) to run load tests.
Remedy: Check whether port 10002 is already in use; if it is, retry after ensuring that the application using it has released the port. Alternatively, restart the machine to allow the port to return to LISTENING mode.
You can use the netstat command to confirm that 10002 is a free port.
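As a generic alternative to netstat (this check is not part of RPT), a short program can test whether the port is free by attempting to bind it:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    // Returns true if the given TCP port can be bound locally, i.e. is free.
    static boolean isFree(int port) {
        try (ServerSocket s = new ServerSocket(port)) {
            return true;
        } catch (IOException inUse) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("port 10002 free: " + isFree(10002));
    }
}
```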
Unsupported network traffic / protocol
Cause: If the network traffic used by the application does not fall into the supported-protocols category, there might be unexpected behavior.
Rational® Performance Tester includes built-in recorders for the following network protocols: HTTP, SAP, TCP Socket, Citrix, Web Services (SOA), and Siebel.
Remedy: Support for additional protocols can be built using the IBM Rational® Performance Tester Extensibility Software Development Kit (SDK); here is the link to the documentation on this.
Best Practices – Rational® Performance Tester Recording
Script maintenance is a crucial part of an effective recording process, ensuring that the scripts generated in Rational® Performance Tester are optimal and in sync with the test plans. Some of the main points to be aware of while recording are:
· Always return to the main screen or page at the end of each scenario or before a log-out; having common beginning and ending screens is reliable.
· Apart from the formal change/version control processes, document dependencies in the environment that can impact the scripts. This helps avoid deficiencies in the test environment and thus helps achieve the targets assigned to the recording process.
· Always document dependencies and other information about your split test modules; this helps with test script maintenance and reusability. Use comments in the test or schedule, or the properties description field associated with each test.
If the details provided above do not help address the concerns, please contact IBM Rational Client Support with the details on the issue.
IBM Rational® solutions for collaborative design management enable an organization to bring together a broad set of stakeholders—customers, line-of-business managers, operations, development teams, and others—to contribute to and influence the design of products, software and systems. This new level of collaboration can help to drive improved quality and achieve successful business outcomes.
Find out more about Rational Team Concert at: http://ibm.com/rational/rtc
Explore Rational Team Concert tutorials, demos, trials, and other developer learning resources at: https://jazz.net/projects/rational-team-concert/
Check out the ROI Calculator at http://www.ibm.com/rational/rtc/roi and download and recommend the presentation on 'Agile Development with Jazz & Rational Team Concert'.
DOWNLOAD HERE
Those who attended the session and wish to obtain a participation certificate can Request Here. Watch out for more on the subject, and feel free to give your feedback, post your doubts, or share your knowledge with us.
So you are smart and have a heart of Gold. You want to use your talent for social good but do not know how or where to start.
How about we give you a chance to showcase your smartness for a social cause and let you win dollars too!
Sounds exciting? Read on.
Natural or man-made, disasters bring chaos. developerWorks gives you a chance to put your coding and testing skills to use and become part of the rescue team from miles away. Participate in the contest and help ease the communication and logistics breakdown that happens when disaster strikes. Use your own idea, or adapt and improvise on someone else's.
Every line of code you create comes with a complexity cost. How can you tame this complexity for your large source base? One way is to streamline your delivery turnaround time for enhancements and fixes by visualizing your projects' source code—after all, "a picture is worth…” Some major financial systems are driven by industry standard visual representations that allow those projects to meet agile DevOps demands. Join this session to learn about productivity gains and improve your continuous delivery of software.
Roger C. Snook Sales&TechSales Enabler: DevOps/Mobile/MobileFirst & Hybrid Cloud
Roger brings twenty-five years of software product innovation and consultative engagements across several industries.
Roger is an OMG Certified UML Professional, IBM and Open Group Certified Specialist and has been a featured global speaker on Cloud Software, DevOps for Mobile, SOA, Design, and Rational software topics.
Roger is also a volunteer for the American Youth Soccer Organization that promotes a fun, family soccer experience for 1,000 kids in the Eastern Panhandle of West Virginia.
***Dial in codes will be sent a few minutes before the webcast and posted in the online meeting.
By registering for this webcast you are allowing the GRUC to provide your information to IBM and/or webcast sponsors for direct contact regarding IBM products and promotions. You will also receive a complimentary membership to the Global Rational User Community.
Model Driven Development (MDD), together with its associated UML-based tools, has been around for more than a decade now. Several advanced organizations have successfully used MDD to substantially increase their competitive edge and market share through improved productivity, quality, and time-to-market.
Generating COBOL from UML is a two-step process: first, model the data structures and programs by using Rational® Software Architect; then generate the COBOL source code from the output of this model by using Rational Developer for System z®.
Rational® Software Architect is required to begin the UML-to-COBOL process.
Note: Rational Developer for System z provides an extension that is installed with Rational Software Architect that lets you develop a COBOL model. To use this capability, download and install the "UML Profiles for COBOL Development" extension.
The final goal of the UML-to-COBOL process is to generate COBOL source code that can be enhanced further within the development environment of Rational® Developer for System z®.
At this point in the process, the system architect who has been modeling the programs and data structures has been using the capabilities of Rational Software Architect, enhanced with profiles containing additional stereotypes, primitive types, and patterns. The second modeling transformation generates the output that is used with Rational Developer for System z to continue COBOL development for System z.
Rational Asset Analyzer provides in-depth insight into dependencies within and among mainframe and distributed applications. Rational Asset Analyzer can assist you with the maintenance, extension, and reuse of existing mainframe and web applications.
Rational Asset Analyzer can simplify software projects by:
· delivering up-to-date knowledge of application assets from the code itself
· making the same application insight available to all team members through a web browser interface
· taking an inventory of mainframe and distributed application assets
· analyzing the impact of a change on mainframe and distributed application assets
· improving the reuse of assets and sharing of knowledge throughout the development cycle
During the inventory process for software assets, users of the distributed-usage tooling face many challenges in offloading code from the mainframe and processing it accordingly. To overcome this, RAA has been enabled to scan source code directly from RTC repositories.
Use the Rational Team Concert™ build engine to launch and record Rational® Asset Analyzer scans to provide ongoing analysis as part of your software development and change lifecycle.
Rational Asset Analyzer software analyzes source code artifacts, such as COBOL or JCL, and subsystem information, such as resources defined in IBM® CICS® or IBM DB2® software. To analyze source code, you point Rational Asset Analyzer software to the source code and scan. You can include this source code analysis as part of your ongoing software change process. Do this by associating a Rational Team Concert build definition with a Rational Asset Analyzer scan request. The Rational Team Concert build extracts source code to a location specified by the build definition and then makes a request to Rational Asset Analyzer software to scan from that location. This solution uses standard capabilities of both products.
Please find a detailed demo at the following URL:
We are pleased to invite you to the Global Rational User Community (GRUC) virtual webcast on DevOps.
With the unprecedented explosion of technology around us and increasing customer expectations, speed of delivery becomes a key differentiator. Over the last few years, the importance of a DevOps approach to software delivery has been gaining more and more traction. Organizations now recognize that DevOps is a business capability that brings value to their business. They are seeking to understand how they can adopt DevOps to become more efficient, deliver higher quality product, be more agile and innovate.
Government agencies are constantly seeking ways to reduce unnecessary overhead and non-value-added work and to transform their operations.
This webcast will help you adopt lean thinking to identify and address delivery bottlenecks.
Take this opportunity to listen to and interact with Sanjeev Sharma, a DevOps thought leader from IBM.
DevOps is a set of principles and practices designed to help development, test, and operations teams work together to deploy code more frequently and to ensure a more effective feedback loop. The practices include iterative development, deployment automation, test automation, release coordination, monitoring and optimization, and many more. This article describes the factors to consider as you build a deployment automation solution for an enterprise that has applications that run on multiple platforms, including the mainframe.
Manage the deployment of multi-platform applications
Although DevOps principles apply across all platforms, the shared nature of the IBM z/OS environment has shaped, and sometimes constrained, the deployment process. In the current z/OS environment, deployment is generally automated consistently across all environments. However, this capability cannot extend to other platforms because the tools are specific to the z/OS platform. In the z/OS environment, the tools that manage source code also provide the build and deployment capabilities. Because these tools have been in place for many years, they have been significantly customized. In the current multi-platform environment, composite applications drive the need to coordinate the deployment of the entire application across various platforms. The deployment capability in place for the z/OS environment does not coordinate well with other environments. A comprehensive and automated deployment solution is not available. At the heart of a multi-platform application deployment system, you might expect to see a consolidated inventory view, which shows you the application with all its components and subsystems, mapped to the deployment environment.
Manage the environment
A software project typically has a set of deployment environments such as development, quality assurance (QA), and production. An environment is a collection of resources as a deployment target. A resource can be a physical server, a logical partition (LPAR), a virtual machine, or a subset of a cloud. It can also be a logical deployment target, such as an IBM® CICS® region, a database, or an application server platform. The deployment system needs to understand and be able to model the environment before it can create and maintain an inventory of application versions mapped to environments. In the distributed platforms, most IT organizations use application-specific environments, but multi-tenant servers can be the targets of multiple applications. The mainframe environment is typically highly shared. Approvals and
team processes are typically scoped to environments.
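The environment-and-inventory model described above can be sketched as a small data structure. All type names, resource kinds, and version strings here are illustrative, not from any specific deployment product.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// An environment is a named collection of resources used as a deployment
// target; the inventory maps (environment, application) to a deployed version.
record Resource(String name, String kind) {}
record Environment(String name, List<Resource> resources) {}

public class DeploymentInventory {
    private final Map<String, String> deployedVersion = new HashMap<>();

    void recordDeployment(Environment env, String app, String version) {
        deployedVersion.put(env.name() + "/" + app, version);
    }

    String versionIn(Environment env, String app) {
        return deployedVersion.get(env.name() + "/" + app);
    }

    public static void main(String[] args) {
        // A QA environment mixing distributed and mainframe-style resources.
        Environment qa = new Environment("QA",
                List.of(new Resource("LPAR01", "lpar"), new Resource("CICSA", "cics-region")));
        DeploymentInventory inv = new DeploymentInventory();
        inv.recordDeployment(qa, "payments", "2.4.1");
        System.out.println(inv.versionIn(qa, "payments")); // 2.4.1
    }
}
```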
CLM has been developed to support transparent, agile development in a collaborative fashion; that was the main focus from the beginning, and it is designed for transparency across the stakeholders. Gradually, enterprise customers wanted to give the external world restricted access to their work items. The question is how to implement this without sacrificing benefits such as usability and performance. One way is to place a reverse proxy outside the firewall. Another is to provide customers a way to reach the server(s), e.g., through a VPN tunnel. Or you can open the firewall so that the public URIs are reachable. In this article we explain one way of providing external access to CLM (through a reverse proxy).
JKE, as a product company, wanted a mechanism to expose the CCM server to its end customers so they can create new Features in the existing products list and track their progress. One CLM instance/server is allocated exclusively to the JKE internal team for its product development lifecycle (name: CLM). A second instance of CLM gives access to the external users, isolating the production server from the external world (name: CLMEXT). One user ID is provided for each customer, with restricted access, to fulfill the following conditions.
One user id for each customer and one JKE id for the sales team
Each customer will get access only to their specific features and its workflow.
Only common features are accessible to all the customers.
Sales team should be in a position to respond to the queries from the customers through pre-configured responses.
A reverse proxy has to be configured on the JKE reverse proxy server to provide access to the external customers. (The reverse proxy can also be configured within CLM.)
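As an illustration only, such a reverse proxy might look like the following fragment for Apache httpd with mod_proxy. The hostnames, ports, and context root are assumptions, not JKE's actual topology, and a real CLM deployment has additional proxy requirements documented by IBM.

```apache
# Illustrative reverse-proxy fragment (requires mod_proxy and mod_proxy_http).
<VirtualHost *:443>
    ServerName clmext.example.com
    SSLProxyEngine On
    ProxyPreserveHost On
    # Forward external requests to the internal CLMEXT application server.
    ProxyPass        /ccm https://clmext-internal.example.com:9443/ccm
    ProxyPassReverse /ccm https://clmext-internal.example.com:9443/ccm
</VirtualHost>
```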
Once the features have been created in CLMEXT by the customers, the JKE team can validate them, and a new work item is added manually to the CLM server. (Note: work items can be added to the CLM server automatically using cross-server communication.)
A link from the CLM work items to the CLMEXT Feature is created and is visible only from CLM, to the JKE team.
The JKE team has to manually update the progress status of the new Features in CLMEXT.
The complete document can be found in Rational User Group India chapter:-