Rajeshavanthi
Jazz is an initiative to transform software and systems delivery by making it more collaborative, productive and transparent, through integration of information and tasks across the phases of the lifecycle. The Jazz initiative consists of three elements: Platform, Products and Community.
Jazz Team Server (JTS) provides the foundational services which enable a group of tools to work together as a single logical server, and includes any number of Jazz Team Server Extensions that provide the tool-specific functionality. All of the foundation and tool-specific services are RESTful web services.
Technically, the Jazz Foundation Services (JFS) are a concrete set of RESTful web services (REST APIs) for user and project administration, security, collaboration, query, and other generic cross-tool capabilities.
The Jazz platform is designed to support any industry participant who wants to improve the software and systems lifecycle and break down walls between tools. The platform is built on architectural principles that represent a key departure from approaches taken in the past. Unlike the monolithic, closed products of the past, Jazz takes an innovative approach to integration based on open, flexible services and Internet architecture.
Organizations need to flexibly assemble their software delivery environment, using their preferred tools and vendors. Since this environment is not static, organizations need to evolve their environment as their needs change at their own pace.
The Jazz platform provides the technical foundation for several types of lifecycle tool integration. This platform consists of an architecture and a set of application frameworks and toolkits.
There are two principal facets of the Jazz architecture:
1) Linked Lifecycle Data, applying the W3C “Linked Data” standard to the realm of lifecycle data (e.g., requirements, change requests, test plans, code, etc.)
2) Integration services, providing cross-cutting capability for any lifecycle tool (e.g., user admin, project admin, lifecycle query) to enable the "system" of tools to work well together
The Jazz architecture addresses the tool-integration problem by providing standard interfaces and methods for tools to establish links to data housed and managed by other tools, possibly those built on widely varying technologies. Jazz embraces the linked data approach as implemented in the Open Services for Lifecycle Collaboration (OSLC) initiative. Linked data is the fundamental architectural principle of OSLC, making it "the community and specifications for Linked Lifecycle Data."
Integration services are general purpose cross-tool capabilities that enable the whole (a set of Jazz products) to be greater than the sum of its parts.
Some integration services (e.g., user identity management) provide a capability that all tools can use, delivering a predictable in-tool experience, and simplifying cross-tool interactions. Other integration services (e.g., dashboards or lifecycle project admin) are implemented by several tools, and knit together to provide a coherent overall integrated capability.
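The linked-data idea above can be sketched in a few lines: each lifecycle artifact is identified by an HTTP URI, and tools integrate by storing links between those URIs rather than copying data between repositories. This is only an illustration of the principle; the URIs and the predicate name below are hypothetical, not actual OSLC vocabulary.

```python
# Minimal sketch of Linked Lifecycle Data: artifacts are identified by
# HTTP URIs, and integration means storing (subject, predicate, object)
# links between them. All URIs and the predicate are hypothetical.

# A change request in one tool linking to a requirement in another tool.
links = [
    ("https://ccm.example.com/changerequests/42",
     "implementsRequirement",
     "https://rm.example.com/requirements/7"),
]

def linked_artifacts(links, subject, predicate):
    """Return the URIs that a subject links to via the given predicate."""
    return [obj for (subj, pred, obj) in links
            if subj == subject and pred == predicate]
```

Because the link is just a pair of URIs, either tool can resolve the other's artifact over HTTP without knowing how it is stored.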
AcdntlPoet
Rational Integration Tester in 9 minutes flat - Learn how to record a simple test, play it back, and view the results in under ten minutes. Learn more about Rational Integration Tester on IBM Knowledge Center at
Rajeshavanthi
In a typical scenario, to create a test, many testers sit at a PC and alternate between completing an action in the application under test and writing the step on a notepad. When the test is complete on paper, the tester (or in some cases an administrative assistant for the testing group) types the steps into a standard test template, frequently using either Microsoft Word or Microsoft Excel. Following this procedure, a single manual test is essentially written twice.
Eliminating the need to write each test twice can save you time for other testing.
In addition to writing tests twice, much time is spent adapting existing tests for similar new tests. For example, a tester might create a test to log into the application under test as an administrator, and then adapt that test to log into the application under test as a regular user. This adaptation typically takes one of two forms:
• Open the existing test, edit steps as needed, and save the test with a new filename.
• Open a new test template, copy steps from the existing test, edit steps as needed, and save the new test.
Reducing duplication among many tests reduces the time you spend maintaining test scripts.
Testers typically receive many new software builds during a development cycle, often at an increasingly rapid rate toward the end of the cycle.
When a build contains a new feature or a fix that requires modifying the steps of a test, all of the tests that relate to the new feature or fix must be updated to reflect the change in the application under test. Although this update process is not difficult to manage when only a few tests require updating, when dozens (or hundreds) of tests are affected by changes to a commonly used area of the application under test, such as a login screen, updating can be very time-consuming. Eliminating or reducing the work that is required to keep many tests up to date can save you time for additional testing.
As you increase the efficiency of your testing effort, you can use the saved time to conduct additional tests. Rational Quality Manager can help you work more efficiently in each of the testing activities:
• Creating tests: Rational Quality Manager helps you to manage reusable content and use it in similar tests. As a result, you spend less time authoring tests.
• Running tests: Rational Quality Manager associates text to be typed in the application under test with an execution step. Rational Quality Manager also associates verification text with a test step, and then compares the actual results to that expected verification text. Rational Quality Manager provides test data variables so that you can define a test once and run it many times to accommodate different data input values. These features save you time typing in the application under test, comparing actual to expected results, and testing different data values.
• Reporting test results: Rational Quality Manager provides customizable reports that reflect test result data from all phases of the project. These reports save time that you would have spent manually tabulating test results for reporting.
• Maintaining tests: Rational Quality Manager reduces the amount of required test maintenance by storing reusable content only once. As a result, you spend less time updating many similar tests.
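The test data variables idea above can be illustrated with a short sketch: the login test is authored once, and each data row supplies different input values, so an administrator run and a regular-user run are the same test with different data. The application under test is simulated here, and all account names and roles are made up for the example.

```python
# Hedged sketch of data-driven testing: one test definition, many data
# rows. The login() function is a stand-in for the application under
# test; the accounts and roles below are hypothetical.

def login(username, password):
    """Simulated application under test: returns the role on success."""
    accounts = {"admin": ("secret", "administrator"),
                "jdoe": ("pass123", "regular user")}
    stored_password, role = accounts.get(username, (None, None))
    return role if stored_password == password else "login failed"

# Data rows: (username, password, expected result).
test_data = [
    ("admin", "secret", "administrator"),
    ("jdoe", "pass123", "regular user"),
    ("jdoe", "wrong", "login failed"),
]

def run_login_test(rows):
    """Run the single test once per data row; return any failing rows."""
    return [(user, login(user, pw), expected)
            for (user, pw, expected) in rows
            if login(user, pw) != expected]
```

Adding a new scenario means adding a data row, not copying and editing a whole test, which is exactly the maintenance saving described above.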
JackSchneiderCO
Live tweeting during the webcast! Use #GRUCSAFe to follow and submit questions and comments to our speaker.
Are you looking for tooling to enable your Scaled Agile Framework™ (SAFe) transformation and confused about your options? Come learn about the new SAFe support in the CLM 6.0 release, which allows you to set up a SAFe Program with SAFe Teams quickly and easily, providing work item types, attributes and workflows, reports, plans, queries, and process mentoring to help drive adoption of the SAFe methodology. The solution comes out of the box with IBM Rational Team Concert 6.0 (RTC), one of three products included in CLM 6.0, and provides all of the collaborative capabilities of the IBM Rational solution for Collaborative Lifecycle Management (CLM) for SAFe Programs. It enables you to support both new and existing projects, track work across multiple interdependent teams, and gain visibility into software development and delivery across your executing program.
Amy Silberbauer, Executive IT Specialist, Solution Specialist, Enterprise Scaled Agile, SAFe, DevOps Steer
Amy Silberbauer is an Executive IT Specialist in the IBM Rational organization. She currently serves as a Solution Specialist responsible for defining and driving Rational's DevOps Steer and Scaled Agile solutions. She has hands-on experience leading an internal SAFe (Scaled Agile Framework) transformation project, evangelizing SAFe concepts within Rational and beyond to other teams in IBM, and working with customers considering similar transformations. She is a Certified SAFe Program Consultant. She is a recognized subject matter expert on software development lifecycle solutions, including Enterprise Modernization and SOA.
Jean-Louis Marechaux, Worldwide Technical Enablement Lead for CLM-IT segment
Jean-Louis Marechaux (aka JL) is the Worldwide Technical Enablement Lead for the “CLM-IT” segment. He works for IBM Rational Software and has 18+ years of experience with software development. In his technical leadership role, JL focuses on IBM DevOps Services, Application Lifecycle Management (ALM), and Agile practices. Before joining the Rational group, Jean-Louis worked as a Solution Architect for IBM Global Business Services and other IT organizations. He has been involved in architecture, design, development, and methodologies. Follow Jean-Louis on his blog: Pragmatic Architecture - http
***Dial-in codes will be sent a few minutes before the webcast and posted in the online meeting.
By registering for this webcast you are allowing the GRUC to provide your information to IBM and/or webcast sponsors for direct contact regarding IBM products and promotions. You will also receive a complimentary membership to the Global Rational User Community.
Soumya Y Shanthimohan
While working on a specific scenario, I noticed that three different playbacks had three different Page Element Status code success rates.
Below are my observations after analyzing the playback data.
The Page Element Status code was 100% successful for the first run.
[Bar chart indicating the overall success of the run.]
Here, the page element success means that the response code verification point passed for that request. If a request has no verification points, success means that the server received the request and returned a response with a status code in the 200 or 300 category or returned an expected response with a status code in the 400 or 500 category.
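The success rule just described can be sketched as a small function. This is a simplified illustration of the rule, not the tool's actual implementation: a verification point result, when present, decides the outcome; otherwise a 200/300-range status counts as success, and a 400/500-range status counts only if it matches an expected status supplied by the test.

```python
# Hedged sketch of the page element success rule described above.
# Not the product's actual logic; inputs are simplified for illustration.

def page_element_success(status_code, vp_passed=None, expected_status=None):
    """Decide whether a page element counts as a status-code success.

    vp_passed: result of a response-code verification point, or None
    when the request has no verification point.
    expected_status: a 400/500-range code the test expects, if any.
    """
    if vp_passed is not None:
        # A verification point, when present, decides the outcome.
        return vp_passed
    if 200 <= status_code < 400:
        # No verification point: 200/300-range responses are successes.
        return True
    # A 400/500-range response succeeds only when it was expected.
    return status_code == expected_status
```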
Here, in the 'Server Health Summary' section, the total number of page element attempts is equal to the total number of page element Status code successes. Hence the Page Element Status code success rate is 100%.
The Page Element Status code was 99.26% successful for the second run.
In the 'Server Health Summary' section, the total number of page element attempts is NOT equal to the total number of page element Status code successes. Hence the Page Element Status code success rate is 99.26%.
The Page Element Status code was 84.44% successful for the third run.
In the 'Server Health Summary' section, the total number of page element attempts is NOT equal to the total number of page element Status code successes. Hence the Page Element Status code success rate is 84.44%.
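The success rates above are simply status-code successes divided by total attempts. The counts below are hypothetical, chosen only to reproduce the reported percentages; the report did not state the actual attempt counts.

```python
# The Page Element Status code success rate is successes / attempts,
# expressed as a percentage. Counts below are hypothetical examples.

def success_rate(successes, attempts):
    """Success rate as a percentage, rounded to two decimal places."""
    return round(100.0 * successes / attempts, 2)

run1 = success_rate(270, 270)    # attempts equal successes -> 100%
run2 = success_rate(268, 270)    # a couple of failed attempts
run3 = success_rate(1140, 1350)  # many failed attempts
```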