Tuesday, June 30
6:30 a.m. ET
Every line of code you create comes with a complexity cost. How can you tame this complexity for your large source base? One way is to streamline your delivery turnaround time for enhancements and fixes by visualizing your projects' source code—after all, "a picture is worth…” Some major financial systems are driven by industry standard visual representations that allow those projects to meet agile DevOps demands. Join this session to learn about productivity gains and improve your continuous delivery of software.
Roger C. Snook
Sales&TechSales Enabler: DevOps/Mobile/MobileFirst & Hybrid Cloud
Roger brings twenty-five years of experience in software product innovation and consultative engagements across several industries.
Roger is an OMG Certified UML Professional and an IBM and Open Group Certified Specialist, and he has been a featured global speaker on Cloud Software, DevOps for Mobile, SOA, Design, and Rational software topics.
Roger is also a volunteer for the American Youth Soccer Organization that promotes a fun, family soccer experience for 1,000 kids in the Eastern Panhandle of West Virginia.
You can connect with Roger on LinkedIn at http://www.linkedin.com/in/rogersnook.
***Dial-in codes will be sent a few minutes before the webcast and posted in the online meeting.
By registering for this webcast you are allowing the GRUC to provide your information to IBM and/or webcast sponsors for direct contact regarding IBM products and promotions. You will also receive a complimentary membership to the Global Rational User Community.
Model Driven Development (MDD), together with its associated UML-based tools, has been around for more than a decade now. Several advanced organizations have successfully used MDD to substantially increase their competitive edge and market share through improved productivity, quality, and time-to-market.
Generating COBOL from UML is a two-step process: modeling data structures and programs by using Rational® Software Architect, and generating COBOL source from the output of this model by using Rational Developer for System z®.
Developing a COBOL model by using Rational Software Architect
Rational Software Architect is required to begin the UML-to-COBOL transformation process.
Generating COBOL source from the output of the transformation
The final goal of the UML-to-COBOL process is to generate COBOL source code that can be enhanced further within the development environment of Rational Developer for System z.
Note: Rational Developer for System z provides an extension that is installed with Rational Software Architect that lets you develop a COBOL model. To use this capability, download and install the "UML Profiles for COBOL Development" extension.
At this point in the process, the system architect who has been modeling the programs and data structures has been using the capabilities of Rational Software Architect, enhanced with profiles that contain additional stereotypes, primitive types, and patterns. The second modeling transformation generates the output that is used with Rational Developer for System z to continue COBOL development for System z.
Rational Asset Analyzer provides in-depth insight into dependencies within and among mainframe and distributed applications. Rational Asset Analyzer can assist you with the maintenance, extension, and reuse of existing mainframe and web applications.
Rational Asset Analyzer can simplify software projects by:
- delivering up-to-date knowledge of application assets from the code itself
- making the same application insight available to all team members through a web browser interface
- taking an inventory of mainframe and distributed application assets
- analyzing the impact of a change on mainframe and distributed application assets
- improving the reuse of assets and sharing of knowledge throughout the development cycle
During the inventory process for software assets, users often face many challenges in offloading code from the mainframe so that it can be processed in a distributed tool. To overcome this, Rational Asset Analyzer has been enabled to scan source code directly from RTC repositories.
Use the Rational Team Concert™ build engine to launch and record Rational® Asset Analyzer scans to provide ongoing analysis as part of your software development and change lifecycle.
Rational Asset Analyzer software analyzes source code artifacts, such as COBOL or JCL, and subsystem information, such as resources defined in IBM® CICS® or IBM DB2® software. To analyze source code, you point Rational Asset Analyzer software to the source code and scan. You can include this source code analysis as part of your ongoing software change process. Do this by associating a Rational Team Concert build definition with a Rational Asset Analyzer scan request. The Rational Team Concert build extracts source code to a location specified by the build definition and then makes a request to Rational Asset Analyzer software to scan from that location. This solution uses standard capabilities of both products.
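The scan request itself can be as simple as an HTTP call made from a post-build step. Below is a minimal Java sketch of such a step; the servlet path and query parameter are hypothetical placeholders, not the documented Rational Asset Analyzer interface, and must be replaced with the scan request your installation actually exposes.

```java
// Hypothetical post-build step: ask Rational Asset Analyzer to scan the source
// that the RTC build extracted. Replace the base URL, servlet path, and query
// parameter with the scan request your RAA installation actually expects.
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class RequestRaaScan {
    public static void main(String[] args) throws Exception {
        String raaBaseUrl = args.length > 0 ? args[0] : "http://raa.example.com:8080/dmh";
        String sourceDir  = args.length > 1 ? args[1] : "/builds/app-1.2.3/source";

        // Hypothetical scan-request URL built from the build's extract location.
        String url = raaBaseUrl + "/scan?location="
                + URLEncoder.encode(sourceDir, StandardCharsets.UTF_8);

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println("RAA scan request returned HTTP " + response.statusCode());
    }
}
```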
A detailed demo of this is available at the following URL:
DevOps: Think Lean - Eliminate Bottlenecks!
We are pleased to invite you to the Global Rational User Community (GRUC) virtual webcast on DevOps.
With the unprecedented explosion of technology around us and increasing customer expectations, speed of delivery becomes a key differentiator. Over the last few years, the DevOps approach to software delivery has been gaining more and more traction. Organizations now recognize that DevOps is a business capability that brings value to their business. They are seeking to understand how they can adopt DevOps to become more efficient, deliver higher-quality products, be more agile, and innovate.
Government agencies are constantly seeking ways to reduce unnecessary overhead and non-value-added work and to transform their operations.
This webcast will help you adopt lean thinking to identify and address delivery bottlenecks.
Take this opportunity to listen to and interact with Sanjeev Sharma, a DevOps thought leader from IBM.
Webcast Details
Registrations Open!
Date: Tuesday, 25 November 2014
Time: 6:00PM - 7:00PM India Time
Registration Link: bit.ly/1oKmGnl
IBM Worldwide Lead - DevOps Technical Sales,
IBM Software Group
DevOps is a set of principles and practices designed to help development, test, and operations teams work together to deploy code more frequently and to ensure a more effective feedback loop. The practices include iterative development, deployment automation, test automation, release coordination, monitoring and optimization, and many more. This article describes the factors to consider as you build a deployment automation solution for an enterprise that has applications that run on multiple platforms, including the mainframe.
Manage the deployment of multi-platform applications
Although DevOps principles apply across all platforms, the shared nature of the IBM z/OS environment has shaped, and sometimes constrained, the deployment process. In the current z/OS environment, deployment is generally automated consistently across all environments. However, this capability cannot extend to other platforms because the tools are specific to the z/OS platform. In the z/OS environment, the tools that manage source code also provide the build and deployment capabilities. Because these tools have been in place for many years, they have been significantly customized. In the current multi-platform environment, composite applications drive the need to coordinate the deployment of the entire application across various platforms. The deployment capability in place for the z/OS environment does not coordinate well with other environments. A comprehensive and automated deployment solution is not available. At the heart of a multi-platform application deployment system, you might expect to see a consolidated inventory view, which shows you the application with all its components and subsystems, mapped to the deployment environment.
Manage the environment
A software project typically has a set of deployment environments such as development, quality assurance (QA), and production. An environment is a collection of resources that serves as a deployment target. A resource can be a physical server, a logical partition (LPAR), a virtual machine, or a subset of a cloud. It can also be a logical deployment target, such as an IBM® CICS® region, a database, or an application server platform. The deployment system needs to understand and be able to model the environment before it can create and maintain an inventory of application versions mapped to environments. On distributed platforms, most IT organizations use application-specific environments, but multi-tenant servers can be the targets of multiple applications. The mainframe environment is typically highly shared. Approvals and team processes are typically scoped to environments.
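One way to picture the consolidated inventory described above is as a simple data model: environments own resources, and the inventory records which application version is deployed to which environment. The Java sketch below is purely illustrative; the type and field names are not taken from any IBM product.

```java
// Illustrative data model only: an environment is a named collection of resources,
// and the inventory records which application version is deployed to which environment.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class Resource {
    final String name;   // e.g. an LPAR, a virtual machine, a CICS region, or a database
    final String kind;   // physical, logical, cloud, ...
    Resource(String name, String kind) { this.name = name; this.kind = kind; }
}

class Environment {
    final String name;   // e.g. development, QA, production
    final List<Resource> resources = new ArrayList<>();
    Environment(String name) { this.name = name; }
}

// Consolidated inventory: application -> version, per environment.
class DeploymentInventory {
    private final Map<String, Map<String, String>> deployed = new HashMap<>();

    void record(Environment env, String application, String version) {
        deployed.computeIfAbsent(env.name, e -> new HashMap<>()).put(application, version);
    }

    String versionOf(Environment env, String application) {
        return deployed.getOrDefault(env.name, Map.of()).get(application);
    }
}
```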
CLM was developed to support transparent, agile development in a collaborative fashion; transparency across stakeholders was the main focus from the beginning. Gradually, enterprise customers wanted to give the external world access to their work items in a restricted fashion. The question is how to implement this without sacrificing benefits such as usability and performance. One way is to have a reverse proxy sitting outside the firewall. Another is to give customers a way to reach the server(s), for example through a VPN tunnel. Otherwise, you have to punch a hole in the firewall so that the public URIs can be reached. In this article we explain one way of providing external access to CLM (through a proxy).
JKE, a product company, wanted a mechanism to expose its CCM server to end customers so that they can create new Features against the existing product list and track their progress. One CLM instance/server is allocated exclusively to the JKE internal team for its product development lifecycle (name: CLM). A second instance of CLM is used to give access to external users, isolating the production server from the external world (name: CLMEXT). One user ID is provided for each customer, with restricted access, to fulfill the following conditions.
One user ID for each customer and one JKE ID for the sales team.
Each customer gets access only to their specific Features and their workflow.
Only common Features are accessible to all the customers.
The sales team should be able to respond to customer queries through pre-configured responses.
A reverse proxy has to be configured on the JKE reverse proxy server to provide access to the external customers. (The reverse proxy can also be configured within CLM.)
Once Features have been created in CLMEXT by the customers, the JKE team can validate them, and a new work item is added manually to the CLM server. (Note: work items can also be added to the CLM server automatically by using cross-server communication.)
A link from the CLM work items to the CLMEXT Feature is created and is visible only from CLM, by the JKE team.
The JKE team has to manually update the progress status of the new Features in CLMEXT.
The complete document can be found in the Rational User Group India chapter:
Did you know? Your experiments on Bluemix could win you movie tickets.
You heard it right! All you need to do is build an app on Bluemix, share it on your social network, and wait to get lucky.
Many teams find it challenging to get a project started quickly, to get team members oriented, to set up and configure tools, and to take advantage of proven patterns of success to do their jobs. Many other teams are required to document their processes for compliance reasons and to show that they follow that process.
Jazz allows us to create a new process template or customize an existing process template in RRC, RTC, and RQM.
Jazz tools (RRC, RTC, and RQM) have predefined process templates that can be modified according to project needs. A process template has different components, such as Overview, Timelines, Roles, Permissions, Access Control, Configuration Data, and Process Description. Each of these components can be customized according to your project needs.
Read the article "How to customize process and project templates in Rational Requirements Composer 4.0" to learn the steps for customizing process templates in Jazz and how these templates can be shared across the project team. The article also describes the benefits of customizing process templates and project templates in RRC.
Although the article talks about customizing RRC process templates, the same approach applies to customizing Rational Team Concert and Rational Quality Manager process templates.
Managing software and product lifecycle integration has always been a challenge, and with the rate of new demands on the enterprise, the challenges are increasing. Leaders from standards organizations and industry will lead interactive discussions on the importance of open technologies in helping enterprises manage lifecycle activities within their environments. Learn about the direction lifecycle integration is taking as a result of the inclusion of open standards and the importance of this work to you. You will also hear how you can bring forward your requirements and influence the supporting work activities.
The Open Lifecycle Summit will feature short lightning talks and panel discussions with industry leaders such as OASIS CEO Laurent Liscia, Tasktop CEO Mik Kersten, Opscode VP of Solutions George Moberly, IBM Fellows Michael Kaczmarski and Kevin Stoodley, and IBM VP of Standards and IBM Cloud Labs Dr. Angel Diaz.
The Summit is free to attend for all those attending IBM Innovate. Join us for an exciting session and refreshments to start your attendance at Innovate 2013. For more information and to RSVP visit http://ibm.co/16jTusU
I'm working on…
I am Guru Prasad, working for Tata Consultancy Services. I work as a Rational Product Specialist and am part of the TCS Rational CoE team. I have conducted several internal webinars and workshops on various Rational tools, including Rational Software Architect, Rational Requirements Composer, Rational RequisitePro, and others. I always like to develop custom extensions for Rational tools.
Keeping myself skilled and updated on Rational trends
I always focus on educating myself on IBM Rational tools. I spend most of my time browsing IBM Jazz.net and IBM developerWorks, going through the latest releases of the tools, discussion forums, and articles on Rational tools. I also attend most of the webinars conducted by the IBM Rational User Group. I write Rational articles and publish them on internal sites. I also assist my team with POC activities and with publishing white papers on Rational in external forums such as IBM developerWorks and jazz.net.
Since most of the Rational tools are Eclipse based, Rational users can easily develop custom extensions using the Eclipse plug-in development features. IBM provides REST APIs and Java APIs to extend the features of Jazz-based tools. Users can connect to jazz.net and the RUG forum to discuss queries specific to such implementations.
I am an active member of the IBM Rational User Group and an IBM Certified Deployment Professional and Solution Designer. I have developed several plug-ins for Rational tools on demand.
Below are methodologies that have proven highly successful in RFT automation projects.
Steps in designing the architecture
First Step: Organize the code into a layered architecture as per the ITCL architecture
1. First Tier: AppObjects --> All the application objects of the application reside in the AppObjects scripts. All the finders are in this package/script.
2. Second Tier: Tasks --> Methods built around these AppObjects to perform a task, e.g., logging into the application.
3. Third Tier: Test Cases --> The scripts that actually verify and test the application. Test cases are abstracted from the AppObjects, which makes this architecture robust.
Here, the test cases talk to tasks, and tasks in turn talk to AppObjects.
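To make the layering concrete, here is a minimal Java sketch of the three tiers; the class, method, and data names are hypothetical, and the actual RFT object-lookup calls are only indicated by comments inside the AppObjects tier.

```java
// Minimal sketch of the three ITCL tiers; names are hypothetical.

// First tier: AppObjects -- only object identification lives here.
class LoginAppObjects {
    void typeUserName(String text) { /* locate the field via the object map and type into it */ }
    void typePassword(String text) { /* locate the field via the object map and type into it */ }
    void clickLoginButton()        { /* locate the button via the object map and click it */ }
}

// Second tier: Tasks -- business-level actions composed from AppObjects.
class LoginTasks {
    private final LoginAppObjects app = new LoginAppObjects();
    void login(String user, String password) {
        app.typeUserName(user);
        app.typePassword(password);
        app.clickLoginButton();
    }
}

// Third tier: Test cases -- talk only to Tasks, never to AppObjects directly,
// so a change to an application object touches only the first tier.
public class LoginTestCase {
    public static void main(String[] args) {
        new LoginTasks().login("tester", "secret");
        // scripted verification of the resulting page would go here
    }
}
```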
Advantages of this architecture
1. Highly Organized code
2. Ease in debugging
3. If an application object changes, the tester does not need to update all the test scripts that use it; only the affected AppObject needs to be modified, and all the test cases pick up the change automatically. This is because the test cases are abstracted from the AppObjects.
More info on ITCL framework is found in this article: http://www.ibm.com/developerworks/rational/library/06/0822_goel/
Second Step: For Web/Dojo applications, use BLUE and the hacks below
- If the application is a Dojo application, we can further use the BLUE framework on top of ITCL. BLUE contains very robust methods for Dojo.
- Also, for web applications, if RFT does not recognize the objects beneath the layers/pages, we can use the IE developer tools to find the unique properties of the object and feed them to RFT's dynamic find (see the sketch after this list).
- One can make the code browser independent by making the document page the top parent instead of the browser. All the underlying code will then have the document page as its top parent.
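As an illustration of the dynamic find mentioned above, the sketch below locates a control by a property taken from the page source. It assumes the standard RFT dynamic find API (RootTestObject.find with SubitemFactory.atDescendant); the property name and value are hypothetical and would come from the IE developer tools inspection.

```java
// Sketch of a dynamic find: locate a control by a property value taken from the
// page source. ".id" and the value passed in are illustrative.
import com.rational.test.ft.object.interfaces.RootTestObject;
import com.rational.test.ft.object.interfaces.TestObject;
import static com.rational.test.ft.script.SubitemFactory.atDescendant;

public class DynamicFinder {

    // Returns the first object whose ".id" property matches, or null if none is found.
    public static TestObject findById(String htmlId) {
        RootTestObject root = RootTestObject.getRootTestObject();
        TestObject[] matches = root.find(atDescendant(".id", htmlId));
        return (matches != null && matches.length > 0) ? matches[0] : null;
    }
}
```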
Third Step: Scripted Verifications
Use scripted verifications instead of recorded verification points to verify the state of the application; this makes the automation suite very robust and resilient to application changes.
For instance, a verify_table method fetches the contents of a table/grid during the first run, writes it to an Excel file, and places that file in an expected-results baseline. During subsequent runs, it again fetches the contents of the table/grid and writes them to an actual-results Excel file. The method then compares the two files and logs the result.
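A simplified sketch of that idea follows; it uses plain CSV text files instead of Excel for brevity, and the method and path names are hypothetical.

```java
// Simplified scripted verification in the spirit of verify_table. It stores the
// captured grid as CSV text rather than an Excel workbook.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TableVerification {

    // First run: save the captured grid as the expected baseline.
    // Later runs: capture again, write the actual file, and compare with the baseline.
    public static boolean verifyTable(String gridAsCsv, Path baselineDir, String testName)
            throws IOException {
        Path expected = baselineDir.resolve(testName + "_expected.csv");
        Path actual   = baselineDir.resolve(testName + "_actual.csv");

        if (!Files.exists(expected)) {             // first run: create the baseline
            Files.createDirectories(baselineDir);
            Files.writeString(expected, gridAsCsv);
            return true;
        }
        Files.writeString(actual, gridAsCsv);       // subsequent runs: record the actual result
        boolean match = Files.readString(expected).equals(gridAsCsv);
        System.out.println("verifyTable(" + testName + "): " + (match ? "PASS" : "FAIL"));
        return match;
    }
}
```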
Fourth Step: AppObject Unit Tests
Keeping the AppObject unit tests in place gives you confidence in the automation code. Once the build is ready, the AppObject unit tests are run to ensure that all the application objects are recognized. If object properties have changed, the tester updates the recognition properties. This further eases the debugging effort, because we can be sure the test cases are failing due to a bug, defect, or functionality change and not due to object recognition.
Fifth Step: Multi-Client Test Script Maintenance
Using a property file, a single test script can run against multiple clients. This is done by defining a property in the properties file that determines which client-specific objects are instantiated, as sketched below.
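In the sketch, the property name (client.type), the file name (run.properties), and the client classes are all hypothetical.

```java
// Sketch of driving one test script against multiple clients from a properties file.
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class ClientFactory {

    interface ClientAppObjects { void openApplication(); }

    static class WebClientObjects implements ClientAppObjects {
        public void openApplication() { /* start and attach to the browser-based client */ }
    }

    static class EclipseClientObjects implements ClientAppObjects {
        public void openApplication() { /* start and attach to the rich-client application */ }
    }

    // Reads client.type from run.properties and returns the matching AppObjects set.
    public static ClientAppObjects forConfiguredClient() throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("run.properties")) {
            props.load(in);
        }
        String type = props.getProperty("client.type", "web");
        return "eclipse".equalsIgnoreCase(type)
                ? new EclipseClientObjects()
                : new WebClientObjects();
    }
}
```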
IBM Rational Functional Tester (RFT) is a solution for testing Java, web, Eclipse, Flex, Dojo, Ajax, SAP, Adobe, and Microsoft Visual Studio .NET WinForm-based applications, among others. Support is also available for testing 3270/5250 terminal-based applications and Siebel 7.7.
The supported domains for functional testing with RFT are listed here: http://www-01.ibm.com/support/docview.wss?uid=swg27027053
Before actually starting to automate, please go through the above link to check whether your environment is supported by RFT; otherwise RFT will not run.
Functional Tester plugs into IBM's open-sourced Integrated Development Environment (IDE), Eclipse. By embracing this IDE, Rational Functional Tester sits alongside other development tools created by IBM Rational and other vendors, allowing easy tool integration into a common interface. RFT is the QSE (Quality Software Engineering) recommended tool for GUI automation.
Scripting with RFT
RFT allows you to program in standard Java or with Visual Basic .NET. The fact that RFT uses these programming languages provides two advantages.
- One advantage is that the languages are standard, so the learning curve is smaller than if the script developer had to learn both the tool and the language.
- Another advantage is the flexibility the two languages offer. Programmers have the choice of familiar languages, so they can write programs in the one at which they are most adept or which they feel best suits their needs.
GUI test automation tools also feature libraries of functions useful for testing, such as click, select, or verify. In RFT, you can add to this library ("wrapping" several lines of code to perform a single operation is one useful technique) to provide functions useful to all RFT programmers.
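For example, a small wrapper such as the hypothetical clickWhenReady method below rolls a wait, a click, and a log entry into one reusable call; it assumes the standard RFT API (waitForExistence, click, logInfo), while the class and method names are illustrative.

```java
// Hypothetical wrapper that rolls a wait, a click, and a log entry into one call.
import com.rational.test.ft.object.interfaces.GuiTestObject;
import com.rational.test.ft.script.RationalTestScript;

public class ScriptHelper extends RationalTestScript {

    // Waits for the control, clicks it, and records the action in the test log.
    protected void clickWhenReady(GuiTestObject control, String description) {
        control.waitForExistence();   // wait until the object appears on screen
        control.click();
        logInfo("Clicked: " + description);
    }
}
```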
When developers change properties of UI elements, existing scripts potentially can fail. How do you maintain your scripts if developers keep changing properties that identify the user interface elements, such as the position of the object or its name? This is an inevitable part of development, but how can your scripts keep up?
This is one of the most important issues to think about when selecting a tool and creating your scripts, because if your scripts take too much effort to maintain, they cease to be a cost-effective and efficient solution for testing.
RFT uses two technologies to address this problem.
- The core technology is the RFT Object Map feature.
- The Object Map feature is enhanced by ScriptAssure.
The RFT Object Map stores information about GUI objects and their properties during test development. This information is used to find GUI objects during test execution. Some of the properties that identify an object include color, size, position, state (such as checked and unchecked), text label, and logical name (the name you assign to the object). Object maps are often shared across multiple tests.
The purpose of ScriptAssure is to eliminate the need to update the script when the objects in the user interface change. ScriptAssure accomplishes this by allowing you to weight the different properties used to identify a UI element. You determine the most important characteristics for identifying the object. When one property changes, ScriptAssure can still identify the object based on the other properties. No single change to any object prevents an RFT script from running.
Another way that RFT reduces script maintenance is with the Object Map update tool. The tool enables you to globally update a centralized object map.
Rational Quality Manager has the following characteristics:
- Rational Quality Manager is based on the Jazz platform and inherits many characteristics from that platform
- IBM Rational Quality Manager is a collaborative and web-based tool
- Offers comprehensive test planning, test construction, and test artifact management functions throughout the software development lifecycle
- Is designed to be used by test teams
- Supports a variety of user roles, such as test manager, test architect, test lead, tester, and lab manager
You can download CLM 4.0 and try it for 60 days from the jazz.net link given below:
Once you download it, you can follow the instructions for installing, configuring, and deploying CLM 2012 at the following CLM 4.0 help link:
Have a look at what's new in CLM 2012 by downloading the PDF from the following link. It includes a rich set of UI and feature improvements, manual script recording with the RFT adapter, and more.
The following article shows how to integrate custom tools that do not provide out-of-the-box integration with RQM by using command-line adapters.
For all CLM 4.0 related queries, please join the Jazz forum by registering here:
Once your registration is complete, log in to Jazz.net and go to the forum at https://jazz.net/forum. Ask questions, get answers, share your thoughts and ideas, and become an expert.