IBM Rational Functional Tester
(RFT) is a solution for testing applications built with Java, web technologies, Eclipse, Flex, Dojo, Ajax, SAP, Adobe, and Microsoft
Visual Studio .NET WinForms, among others. Support is also available
for testing 3270/5250 terminal-based applications and Siebel 7.7.
Before starting to automate, please check the supported-environments documentation to confirm that RFT supports your environment; otherwise RFT will not run. Rational
Functional Tester plugs into IBM's open-sourced Integrated Development
Environment (IDE), Eclipse. By embracing this IDE, Rational Functional
Tester sits alongside other development tools created by IBM Rational and
other vendors, allowing easy tool integration into a common interface. RFT is
the QSE (Quality Software Engineering) recommended tool for GUI automation.
Scripting with RFT
RFT allows you to program in standard Java or in Visual Basic .NET. The fact
that RFT uses these programming languages provides two advantages.
One advantage is that the languages are standard, so the learning curve is smaller
than if the script developer had to learn both the tool and the language.
The other advantage is the flexibility the two languages offer: programmers have
the choice of familiar languages, so they can write scripts in the one at
which they are most adept or which they feel best suits their needs.
GUI test automation tools
also feature libraries of functions useful for testing, such
as click, select, or verify. In RFT, you can add to this library
("wrapping" several lines of code to perform a single operation
is one useful technique) to provide functions useful to all RFT programmers.
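As an illustration of the wrapping technique, a helper can bundle a click and a verification into a single reusable call. This is a sketch only: `UiControl` here is a hypothetical stand-in for an RFT test object, not the actual RFT API.

```java
// Sketch only: UiControl is a hypothetical stand-in for an RFT test object;
// a real shared helper would call the RFT API instead.
public class UiHelpers {
    public interface UiControl {
        void click();
        String getText();
    }

    // "Wraps" several steps -- click the control, read back its text,
    // verify the result -- into one operation usable by all script writers.
    public static boolean clickAndVerify(UiControl control, String expectedText) {
        control.click();
        return expectedText.equals(control.getText());
    }
}
```

Collecting helpers like this in one shared class means every script gains the same behavior, and a change to the interaction logic is made in one place.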
When developers change properties of UI elements, existing scripts potentially
can fail. How do you maintain your scripts if developers keep changing
properties that identify the user interface elements, such as the position of
the object or its name? This is an inevitable part of development, but how can
your scripts keep up?
This is one of the most
important issues to think about when selecting a tool and creating your
scripts, because if your scripts take too much effort to maintain, they cease
to be a cost-effective and efficient solution for testing.
RFT uses two technologies to
address this problem.
The core technology is the RFT Object Map feature.
RFT Object Map feature is enhanced by ScriptAssure.
The RFT Object Map stores
information about GUI objects and their properties during test development.
This information is used to find GUI objects during test execution. Some of the
properties that identify an object include color, size, position, state (such
as checked and unchecked), text label, and logical name (the name you assign to
the object). Object maps are often shared across multiple tests.
The purpose of ScriptAssure is
to eliminate the need to update the script when the objects in the user
interface change. ScriptAssure accomplishes this by allowing you to weight
the different properties used to identify a UI element. You determine the
most important characteristics for identifying the object. When one property
changes, ScriptAssure can still identify the object based on the other
properties. No single change to any object prevents an RFT script from running.
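The weighted recognition described above can be sketched as a scoring function: each property carries a weight, and an object is accepted if enough high-weight properties still match. This is an illustrative model of the idea only; the property names, weights, and threshold are invented for the example, not RFT's actual values.

```java
import java.util.Map;

// Illustrative sketch of weighted object recognition, loosely modeled on
// the idea behind ScriptAssure. Weights and threshold are invented values.
public class WeightedMatcher {
    // Recognition weights: higher means the property matters more.
    static final Map<String, Integer> WEIGHTS =
        Map.of("name", 50, "class", 30, "label", 15, "position", 5);

    /** Score how well a candidate's properties match the recorded object map entry. */
    static int score(Map<String, String> recorded, Map<String, String> candidate) {
        int total = 0;
        for (Map.Entry<String, Integer> w : WEIGHTS.entrySet()) {
            String key = w.getKey();
            if (recorded.containsKey(key)
                    && recorded.get(key).equals(candidate.get(key))) {
                total += w.getValue();
            }
        }
        return total;
    }

    /** Accept the candidate if enough high-weight properties still match. */
    static boolean matches(Map<String, String> recorded,
                           Map<String, String> candidate, int threshold) {
        return score(recorded, candidate) >= threshold;
    }
}
```

With such weights, a button that moved to a new position still matches, because name and class carry most of the score.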
Another way that
RFT reduces script maintenance is with the Object Map update
tool. The tool enables you to globally update a centralized object map.
IBM Rational Quality Manager
is collaborative, Web-based, quality management software for comprehensive test
planning and test asset management throughout the software life cycle. Built
on the Jazz™ platform, it is designed for test teams of all sizes and supports a variety
of user roles, such as test manager, test architect, test lead, tester, and lab
manager, as well as roles outside of the test organization.
Rational Quality Manager enables you to manage and run automated test scripts
created with other test tools. The
test scripts that you create in Rational Quality Manager are references to the actual tests created in the
other testing tools.
This blog covers Worksoft Certify as an example. You
will see how to get the tools integrated and how they work with Rational
Quality Manager to help you better manage and understand the status of your
tests. Worksoft Certify is an automated functional testing solution for SAP lifecycle management and
cross-platform business process validation.
cross-platform business process validation.
Worksoft Certify eliminates custom
coding and programming, a requirement of most legacy test automation
products, making it fast and easy to implement and maintain. Using an
object-driven approach rather than generating scripts or code, Worksoft Certify
validates business process workflows using a data model of fields, screens, and
transactions, making it easy to keep pace with dynamic changes.
Steps to change the polling interval
The Worksoft Adapter includes the files referenced below, among them config.ini and Start.cmd.
There are two options for changing
the polling interval:
·Option 1 – While the
adapter is NOT running, change the config.ini “sleepTime” config value.
This is the configuration for the polling interval. Then start the
adapter using “Start.cmd”.
·Option 2 –
Change the polling interval in the UI.
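Under Option 1, the config.ini entry might look like the following sketch. The key name comes from the steps above; the value and its unit are assumptions, so check the adapter documentation.

```ini
; sleepTime controls the adapter's polling interval.
; Example value only -- the unit (e.g., seconds) is an assumption.
sleepTime=30
```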
So you are smart and have a heart of gold. You want to use your talent for social good but do not know how or where to start.
How about we give you a chance to showcase your smartness for a social cause and let you win dollars too!
Sounds exciting? Read on.
Natural or man-made, disasters bring chaos. developerWorks gives you a chance to use your coding and testing skills to become part of the rescue team from miles away. Participate in the contest and help ease the communication and logistics breakdown that happens when disaster strikes. Use your own idea, or improvise on someone else's.
Every line of code you create comes with a complexity cost. How can you tame this complexity for your large source base? One way is to streamline your delivery turnaround time for enhancements and fixes by visualizing your projects' source code—after all, "a picture is worth…” Some major financial systems are driven by industry standard visual representations that allow those projects to meet agile DevOps demands. Join this session to learn about productivity gains and improve your continuous delivery of software.
Roger C. Snook, Sales & Tech Sales Enabler: DevOps/Mobile/MobileFirst & Hybrid Cloud
Roger brings twenty-five years of software product innovation and consultative engagements across several industries.
Roger is an OMG Certified UML Professional, IBM and Open Group Certified Specialist and has been a featured global speaker on Cloud Software, DevOps for Mobile, SOA, Design, and Rational software topics.
Roger is also a volunteer for the American Youth Soccer Organization that promotes a fun, family soccer experience for 1,000 kids in the Eastern Panhandle of West Virginia.
***Dial in codes will be sent a few minutes before the webcast and posted in the online meeting.
By registering for this webcast you are allowing the GRUC to provide your information to IBM and/or webcast sponsors for direct contact regarding IBM products and promotions. You will also receive a complimentary membership to the Global Rational User Community.
Model Driven Development (MDD), together with its associated UML-based tools, has been around for more than a decade now. Several advanced organizations have successfully used MDD to substantially increase their competitive edge and market share through improved productivity, quality, and time-to-market.
Generating COBOL from UML is a two-step process: modeling data structures and programs by using Rational® Software Architect, and generating COBOL source code from the output of that model by using Rational Developer for System z®.
Rational® Software Architect is required to begin the UML-to-COBOL process.
Note: Rational Developer for System z provides an extension that is installed with Rational Software Architect that lets you develop a COBOL model. To use this capability, download and install the "UML Profiles for COBOL Development" extension.
The final goal of the UML-to-COBOL process is to generate COBOL source code that can be enhanced further within the development environment of Rational® Developer for System z®.
At this point in the process, the system architect who has been modeling the programs and data structures has been using the capabilities of Rational Software Architect, enhanced with profiles containing additional stereotypes, primitive types, and patterns. The second step, the transformation, generates the output that is used with Rational Developer for System z to continue COBOL development for System z.
Rational Asset Analyzer provides in-depth insight into dependencies within and among mainframe and distributed applications. Rational Asset Analyzer can assist you with the maintenance, extension, and reuse of existing mainframe and web applications.
Rational Asset Analyzer can simplify software projects by:
delivering up-to-date knowledge of application assets from the code itself
making the same application insight available to all team members through a web browser interface
taking an inventory of mainframe and distributed application assets
analyzing the impact of a change on mainframe and distributed application assets
improving the reuse of assets and sharing of knowledge throughout the development cycle
During the inventory process for software assets, users of the distributed tool face many challenges in offloading code from the mainframe and processing it accordingly. To overcome this, RAA has been enabled to scan source code directly from RTC repositories.
Use the Rational Team Concert™ build engine to launch and record Rational® Asset Analyzer scans to provide ongoing analysis as part of your software development and change lifecycle.
Rational Asset Analyzer software analyzes source code artifacts, such as COBOL or JCL, and subsystem information, such as resources defined in IBM® CICS® or IBM DB2® software. To analyze source code, you point Rational Asset Analyzer software to the source code and scan. You can include this source code analysis as part of your ongoing software change process. Do this by associating a Rational Team Concert build definition with a Rational Asset Analyzer scan request. The Rational Team Concert build extracts source code to a location specified by the build definition and then makes a request to Rational Asset Analyzer software to scan from that location. This solution uses standard capabilities of both products.
A detailed demo is available at the following URL:
We are pleased to invite you to the Global Rational User Community (GRUC) virtual webcast on DevOps.
With the unprecedented explosion of technology around us and increasing customer expectations, speed of delivery becomes a key differentiator. Over the last few years, the importance of a DevOps approach to software delivery has been gaining more and more traction. Organizations now recognize that DevOps is a business capability that brings value to their business. They are seeking to understand how they can adopt DevOps to become more efficient, deliver higher quality product, be more agile and innovate.
Government agencies are constantly seeking ways to reduce unnecessary overhead and non-value-added work and to transform their operations.
This webcast will help you adopt lean thinking to identify and address delivery bottlenecks.
Take this opportunity to listen to and interact with Sanjeev Sharma, a DevOps thought leader from IBM.
DevOps is a set of principles and practices designed to help development, test, and operations teams work together to deploy code more frequently and to ensure a more effective feedback loop. The practices include iterative development, deployment automation, test automation, release coordination, monitoring and optimization, and many more. This article describes the factors to consider as you build a deployment automation solution for an enterprise that has applications that run on multiple platforms, including the mainframe.
Manage the deployment of multi-platform applications
Although DevOps principles apply across all platforms, the shared nature of the IBM z/OS environment has shaped, and sometimes constrained, the deployment process. In the current z/OS environment, deployment is generally automated consistently across all environments. However, this capability cannot extend to other platforms because the tools are specific to the z/OS platform. In the z/OS environment, the tools that manage source code also provide the build and deployment capabilities. Because these tools have been in place for many years, they have been significantly customized. In the current multi-platform environment, composite applications drive the need to coordinate the deployment of the entire application across various platforms. The deployment capability in place for the z/OS environment does not coordinate well with other environments. A comprehensive and automated deployment solution is not available. At the heart of a multi-platform application deployment system, you might expect to see a consolidated inventory view, which shows you the application with all its components and subsystems, mapped to the deployment environment.
Manage the environment
A software project typically has a set of deployment environments such as development, quality assurance (QA), and production. An environment is a collection of resources that serves as a deployment target. A resource can be a physical server, a logical partition (LPAR), a virtual machine, or a subset of a cloud. It can also be a logical deployment target, such as an IBM® CICS® region, a database, or an application server platform. The deployment system needs to understand and be able to model the environment before it can create and maintain an inventory of application versions mapped to environments. On distributed platforms, most IT organizations use application-specific environments, but multi-tenant servers can be the targets of multiple applications. The mainframe environment is typically highly shared. Approvals and team processes are typically scoped to environments.
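The environment and inventory model described above can be sketched minimally as follows. This is an illustrative data model only; the type and field names are invented for the example, not taken from any product.

```java
import java.util.*;

// Illustrative sketch: environments as collections of resources, plus an
// inventory mapping application versions to environments. Names are invented.
public class DeploymentInventory {
    // A resource can be a server, an LPAR, a VM, a CICS region, and so on.
    record Resource(String name, String kind) {}
    record Environment(String name, List<Resource> resources) {}

    // environment name -> (application name -> deployed version)
    private final Map<String, Map<String, String>> deployed = new HashMap<>();

    void recordDeployment(Environment env, String app, String version) {
        deployed.computeIfAbsent(env.name(), k -> new HashMap<>())
                .put(app, version);
    }

    Optional<String> versionIn(String envName, String app) {
        return Optional.ofNullable(
                deployed.getOrDefault(envName, Map.of()).get(app));
    }
}
```

A consolidated view like this is what lets a deployment system answer "which version of the application is in QA right now?" across platforms.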
CLM was developed to support transparent, agile development in a collaborative fashion; transparency across stakeholders was its main design focus from the beginning. Gradually, enterprise customers wanted to give the external world access to work items in a restricted fashion. The question is how to implement this without sacrificing benefits such as usability and performance. One way is to place a reverse proxy outside the firewall. Another is to give customers a way to reach the servers, for example through a VPN tunnel. Or you can open the firewall so that the public URIs are reachable. In this article, we explain one way of providing external access to CLM: through a reverse proxy.
JKE, as a product company, wanted a mechanism to expose the CCM server to its end customers so that they can create new Features for the existing product list and track their progress. One CLM instance is exclusively allocated to the JKE internal team for their product development lifecycle (name: CLM). A second instance is used to give access to external users, isolating the production server from the external world (name: CLMEXT). One user ID is provided per customer, with restricted access, to fulfill the following conditions:
One user id for each customer and one JKE id for the sales team
Each customer will get access only to their specific features and its workflow.
Only common features are accessible to all the customers.
Sales team should be in a position to respond to the queries from the customers through pre-configured responses.
Reverse proxy has to be configured in JKE reverse proxy server to provide access to the external customers. (Reverse proxy can be configured within CLM also.)
Once the features have been created in CLMEXT by the customers, the JKE team can validate them, and a new work item will be added manually to the CLM server. (Note: work items can be added to the CLM server automatically by using cross-server communication.)
A link from the CLM WIs to CLMEXT Feature will be created and will be visible only from CLM by the JKE team.
The JKE team has to manually update the status of the progress of the new Features in CLMEXT.
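As a sketch of the reverse proxy mentioned above, an IBM HTTP Server / Apache configuration in front of CLMEXT might look like the following. All host names, the port, and the context roots are invented for the example and must match your actual topology and public URIs.

```apache
# Example only: host names, port, and context roots are assumptions.
# External requests arrive here and are forwarded to CLMEXT; the internal
# CLM production server is never exposed.
<VirtualHost *:443>
    ServerName clm.example.com
    SSLEngine on
    ProxyPreserveHost on
    ProxyPass        /ccm https://clmext.internal.example.com:9443/ccm
    ProxyPassReverse /ccm https://clmext.internal.example.com:9443/ccm
</VirtualHost>
```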
The complete document can be found in the Rational User Group India chapter:
Many teams find it challenging to get a project started quickly, to get team members oriented, to set up and configure tools, and to take advantage of proven patterns of success to do their jobs. Many other teams are required to document their processes for compliance reasons and to show that they follow that process.
Jazz allows us to create a new process template, or to customize an existing one, in RRC, RTC, and RQM.
The Jazz tools (RRC, RTC, and RQM) have predefined process templates that can be modified according to project needs. A process template has different components, such as Overview, Timelines, Roles, Permissions, Access Control, Configuration Data, and Process Description. Each of these components can be customized according to your project needs.
Managing software and product lifecycle integration has always been a challenge and with the rate of the new demands on the enterprise the challenges are increasing. Leaders from different standards organizations and industry will lead interactive discussions on the importance of open technologies to help enterprises manage the lifecycle activities within their environments. Learn about the direction lifecycle integration is taking as a result of the inclusion of open standards and the importance of this work to you. You will also hear how you can bring forward your requirements and influence the supporting work activities.
The Open Lifecycle Summit will feature short lightning talks and panel discussions with industry leaders such as OASIS CEO Laurent Liscia, Tasktop CEO Mik Kersten, Opscode VP of Solutions George Moberly, IBM Fellows Michael Kaczmarski and Kevin Stoodley, and IBM VP of Standards and IBM Cloud Labs Dr. Angel Diaz.
The Summit is free to attend for all those attending IBM Innovate. Join us for an exciting session and refreshments to start your attendance at Innovate 2013. For more information and to RSVP visit http://ibm.co/16jTusU