We have moved to the developerWorks platform for our blog. We hope our readers have found this blog useful. Please share your comments and suggestions on the blog content, as well as any specific topics in requirements management that you would like us to focus on in 2013.
Last but not least, as we mentioned in the last post, the call for speakers is now open for Innovate 2013 - The IBM Technical Summit. Submit your abstracts before January 14, 2013, for a chance to present at a conference with 4000+ attendees and to earn a free conference pass!
Innovate 2013 – The IBM Technical Summit is here. The 2013 event promises to be even more exciting, with top-notch keynotes, over 450 breakout sessions, labs, certifications and our biggest exhibit hall ever. As in previous events, Requirements Management is one of the key areas of interest at Innovate, attracting speakers and attendees from across the globe representing a wide range of industries. In 2012, we had two Requirements Management tracks of sixteen sessions each, one focusing on IT and the other on Systems Engineering. We had 14 real-life case studies, 2 panel discussions and 4 instructor-led sessions.
Managing requirements has always been a cornerstone of both software and systems development. The importance of the discipline continues to grow and is expected to take a leading role in the coming years. This is an opportunity to showcase your thoughts on the discipline, and on how requirements management tools like DOORS or Requirements Composer can help manage requirements effectively for project success. Here are some of the topics from last year and an expected list of topics:
Requirements Management in Agile Projects
Requirements Management for Mobile Development
Managing requirements in developing Safety Critical Systems
Developing and managing requirements specifications for contract agreement
Requirements Driven Development: Understanding requirements and work items
Requirements engineering and supporting layered requirements and models
Delivering a specification perfect requirements set (document generation)
Requirements Reuse: Methods and best practice
Requirements management for complex systems and teams
Using traceability to expose gaps/change to other requirements and across the lifecycle
Requirements engineering for projects with complex systems and software
Requirements definition and management case studies
Requirements definition and management across the software lifecycle
Elicitation techniques for requirements and use cases
Agile software development and requirements modeling
Requirements management for outsourced projects
Defining and managing requirements across geographically distributed teams
Metrics and analysis used in requirements management
Integrating requirements with project and portfolio management
Implications of regulatory compliance on the requirements management process
Business specification-centric approaches
Best practices in aligning business goals and IT
Value-based requirements engineering
Business modeling in requirements definition
Requirements prioritization best practices and choosing your methodology
Incorporating industry standards as reusable requirements
Effective reporting using requirements and CLM information
DOORS, Requirements Composer and other Rational products best practices
Requirements engineering and product lifecycle management
Some session topics from Innovate 2012:
Iterative Requirements Analysis: Implementing Lean and Agile principles for Software Requirements Analysis (Nationwide Case Study)
Visual definition in the requirements lifecycle: a conceptual framework
How IBM Rational DOORS Helps JPL Get to Mars and Beyond: Best Practices in Metrics, Verification and Traceability
Integrating IBM Rational DOORS with IBM Rational Team Concert – Lessons Learned at Raytheon
Integrating Requirements and Models with IBM Rational DOORS and IBM Rational Rhapsody: Lessons Learned at Lockheed Martin MS2
Writing Verifiable Requirements Is Not Easy
Share your experience, thoughts and best practices on requirements at an event attended by industry experts and IBM core development teams. Here are the top three reasons why you should submit your paper for Innovate 2013:
Explore new areas - A free conference pass opens the doors to 450+ sessions, labs and demo booths
Network with experts and peers - Over 4000 professionals expected to attend the event
Sharpen your technical know-how - Learn from product and domain experts and from IBM core developers
As we reach the close of a year and move into a new one, it's often time to take stock of what we've been doing and to make plans for the New Year. We look at what we've been doing right and what we could change and improve. So, since this is a requirements management blog, I thought it would be worth posing the question, and giving an opinion, on whether requirements management (in the domain of systems and product development, my focus area) is still relevant today and as we move into 2013.
I'm not really sure when requirements management as a formally recognized discipline can be said to have come into being, but I do believe that it really started to take shape in the early 90's, primarily based on work coming out of the aerospace industry, and that's when commercial specialized tools for requirements management, such as IBM Rational DOORS (then known simply as DOORS, from a company called QSS), first emerged. In 2005, I was leading a team working on a campaign to promote the value of requirements management to a wider audience than a core set of requirements specialists. We declared 2005 the 'Year of Requirements Management' because of the discipline's increased recognition and the emergence of greater tool capabilities for making requirements more easily accessible to a wider set of stakeholders.
So as we move towards 2013, is requirements management still
as relevant? Do we still have further to go on becoming more effective at it?
In a recent Aberdeen Group report, 'Managing Systems Design Complexity: 3 Tips to Save Time' by Michelle Boucher, which reports a survey of the effectiveness of the systems engineering capabilities of system and product development organizations, two of the three key recommendations made are directly related to requirements management, in the areas of visual requirements definition and requirements traceability. In the other recommendation, on improving change management across engineering disciplines, Michelle says that impact analysis is core to such improvement, and that's enabled by requirements management and traceability. From the study, a clear link can be shown between more effective requirements management and traceability and business benefits such as reduced cycle times, improved quality and increased product revenues. I also recently heard from another analyst that one of the key challenges they are hearing from product development organizations is getting a better handle on interrelationships between requirements across engineering disciplines, so they can respond more effectively to changes.
So my answer to the question I posed, is requirements management still relevant, is a resounding YES! We've made significant progress, but the complexity of the systems we build has also increased and we need to keep pace with changes in practices and technologies. So I expect effective requirements management to remain a cornerstone of successful product development, and for practices and supporting tooling to continue to evolve.
But what do you think? Will requirements management be as important in the future? How will it, or should it, change?
Last week the UK chapter of INCOSE (International Council on Systems Engineering) held their annual systems engineering conference on the Warwick University campus. I'd like to share some of what I heard during the conference, both on systems engineering in general, and more specifically on requirements management practices in the systems engineering domain.
One of the keynote speakers was Dr Sandy Wilson, President & Managing Director, General Dynamics UK. Dr Wilson spoke about the key challenges in the defense industry - the rate of change in threats and technology and the need to lower costs. He challenged the V model - he said it's a nice diagram but its linearity is an issue: the world is not linear or rigid, but the SE V diagram is. He spoke about the need for the defense industry to become more agile, but said that today change is cumbersome due to contractual issues and governance constraints. There are two main types of defense procurement done in the UK - longer-term needs are met by EPs (Equipment Programmes) and urgent tactical needs by UORs (Urgent Operational Requirements). The former is bogged down in top-level scrutiny and check boxes; the latter is helped by the top-level sense of urgency and support. An example of a UOR was the decision to implement the multinational no-fly zone over Libya. Dr Wilson proposed that all defense projects should become more like UORs - more agile. He said that "an 80% solution delivered 1 year earlier is better than 90% delivered 4 years late". I heard that delivering incremental capability needs asset management and tracking, configuration management and a more agile approach to systems engineering - valuing "Product over Process". As well as changes in the way companies deliver capabilities, a change is needed in the way the customers (governments) do their acquisition and contracting in order to enable more agility.
Dr Jeremy Dick of Integrate Systems Engineering, and co-author of the book 'Requirements Engineering', presented a case study in the aerospace industry on developing the assurance case for a (safety) critical system in parallel with requirements analysis, design, and verification & validation, using an extension of his technique for documenting the rationale for traceability relationships known as 'rich traceability'. In addition to developing a requirements 'flow-down' (through levels of requirements to design), the 'evidence' supporting the flow-down is documented. The evidence in the early stages can be how you expect the lower level requirements or design elements to satisfy the higher level, and your evidence to suggest that your argument is sound. In parallel, your verification & validation strategies should be evolved, including an argument and supporting evidence for how the test(s) will prove the requirement(s) is/are met. Jeremy was asked how the textual requirements, arguments and evidence would fit with an MBSE (Model-Based Systems Engineering) approach. Jeremy answered that he favours (and in fact came up with the concept of - ref: "The Systems Engineering Sandwich: Combining Requirements, Models and Design", Jeremy Dick, Jonathon Chard, INCOSE International Symposium, Toulouse, July 2004) the sandwich model - interleaved layers of requirements and modeling used to decompose a system specification and design (you can read more on that concept in the post 'Food for thought: The Systems Engineering Club Sandwich').
Chris Rolison, CEO of Comply Serve, continued the theme of progressive assurance with a focus on the rail industry. Chris highlighted the complexity challenges in major rail infrastructure projects, and the issues presented by paper-based systems, silos in organization structures, and the supply chain. Chris said that "up to 80% of the engineering requirements can change during design & build" - not because the customer changes their mind but because of all the external factors involved in building a rail system. Chris went on to describe a more collaborative, requirements-driven design approach where systems engineering principles are applied, supported by a collaborative platform (ComplyPro, which is based on IBM Rational DOORS).
Alastair Mavin of Rolls Royce 'lent' us his EARS (Easy Approach to Requirements Syntax) approach (the link is to an IEEE publication; sign-in required), an application of a template with an underlying rule set on how to describe requirements using natural language but in a more structured, consistent way. He described the latest version of the template, EARS+ (or, as he nicknamed it, 'Big EARS'!), and the benefits of the approach - simplicity and structure combined.
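For readers unfamiliar with EARS, the published rule set defines a small family of sentence templates (ubiquitous, event-driven, state-driven, unwanted-behaviour, optional-feature). As a rough sketch, not Mavin's or Rolls Royce's tooling and with illustrative example text, the basic patterns can be encoded as simple format strings:

```python
# Illustrative sketch of the basic EARS sentence templates as format strings.
# (Hypothetical helper, not part of any real requirements tool.)
EARS_TEMPLATES = {
    "ubiquitous":   "The {system} shall {response}.",
    "event_driven": "When {trigger}, the {system} shall {response}.",
    "state_driven": "While {state}, the {system} shall {response}.",
    "unwanted":     "If {condition}, then the {system} shall {response}.",
    "optional":     "Where {feature}, the {system} shall {response}.",
}

def ears(kind, **parts):
    """Render a requirement sentence from one of the EARS templates."""
    return EARS_TEMPLATES[kind].format(**parts)

print(ears("event_driven",
           trigger="the landing gear button is pressed",
           system="control system",
           response="lower the landing gear"))
# When the landing gear button is pressed, the control system shall lower the landing gear.
```

The value is less in the code than in the discipline: every requirement is forced into one of a handful of consistent, testable sentence shapes.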
I could go on for pages about all of the great content shared at this excellent event but I'll leave it there with the main requirements related topics, except to quote from the keynote speaker on day 2: "The core of Systems Engineering is defining requirements and delivering against them". I'd put it this way - you can't have successful systems engineering without effective requirements management.
Neal Middlemore has over 14 years of experience in requirements management, encompassing the associated disciplines of requirements change management and validation/verification. Neal comes from an avionics systems engineering background and has been working with the DOORS product for over 10 years.
One of the most fundamental benefits that businesses want to get from using requirements management tools is consistent traceability. It doesn't matter whether it is an IT system being developed or an aircraft carrier: the levels of complexity involved mean that traceability across multiple levels of requirements, from stakeholder requests to detailed implementation, is not simple to maintain manually.
Further hurdles are put in our way by the need to comply with legislative requirements; many different industries these days have requirements placed upon them by government and international standards bodies, along with internal corporate standards.
To prove compliance with these legislative rules, it is necessary for projects not only to prove that requirements are being managed but also to provide the how and where of their management. Which features relate to which stakeholder request? What aspects of the solution satisfy safety regulations? Has the realization of the requirement been tested, and by whom, on what platform?
To answer these questions it is necessary to utilize the traceability capabilities provided by modern tools. But it's not enough to let a tool decide how things relate to each other, nor to let individual users decide these relationships on their own. Traceability needs to be considered in the larger context of how you report on it to answer the very questions that led you to consider traceability in the first place. It's more than 'does object A relate to object B?'.
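As a toy illustration of that point (hypothetical identifiers and link types, not the data model of DOORS or any other tool), what turns raw 'A relates to B' pairs into answers is giving every link a type and then querying by type:

```python
# Minimal sketch: typed trace links stored as (source, link_type) -> targets,
# so compliance questions become simple queries. Illustrative only.
from collections import defaultdict

class TraceGraph:
    def __init__(self):
        self._links = defaultdict(set)  # (source_id, link_type) -> {target_ids}

    def add_link(self, source, link_type, target):
        self._links[(source, link_type)].add(target)

    def targets(self, source, link_type):
        # Sorted for stable reporting output.
        return sorted(self._links[(source, link_type)])

graph = TraceGraph()
graph.add_link("FEAT-12", "satisfies", "STK-3")      # feature -> stakeholder request
graph.add_link("FEAT-12", "complies_with", "REG-1")  # feature -> safety regulation
graph.add_link("TC-7", "validates", "FEAT-12")       # test case -> feature

# "Which stakeholder requests does FEAT-12 satisfy?"
print(graph.targets("FEAT-12", "satisfies"))  # ['STK-3']
```

A report then answers 'what satisfies the safety regulations?' by filtering on the link type, which is exactly the distinction a bare relationship cannot make.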
An initiative to define an information architecture for your governance solution will assist in defining your process artifacts and their relationships to one another; traceability becomes an asset rather than an overhead. A good way to begin such an initiative is to hold a workshop attended by all project stakeholders, i.e. those who have a vested interest in ensuring project success. Most likely these attendees will have a good understanding of the process and what the project needs to deliver. Undoubtedly each will also bring differing perspectives and understandings of what information is required. The test manager may want to see traceability from individual test artifacts through to the requirements being tested, or to determine whether regulatory compliance has been achieved and whether verification methods have been agreed.
A subject matter expert (SME) may want to see the design in the context of the system-level requirements and how it relates to stakeholder requirements. Everybody wants to see something that is applicable to their job role, and may even wish to see things outside of their typical discipline domain. By asking the questions and documenting the answers you can start to put together an information architecture that makes sense for your project. It's likely that many information architectures will exist: not every project is the same, and each will have differing sets of required attributes, views and reports of project information, and an agreed structure of inter-artifact relationships.
Modern tools can often create templates for this kind of information to allow the deployment of additional artifact containers as and when required. These can even enforce traceability to ensure integrity across the project. Above all, it is vital that the needs of the project are documented and communicated, alongside the information architecture, to all project members.
Last week I was really fortunate to spend a couple of days in London presenting to and talking to clients, business partners and industry analysts. It's always so good to hear what's really going on out there and to get many different perspectives on what's important today and for the future. The first day was at IBM's Innovate UK 2012 event where I was fortunate to be asked to present on all the really exciting new stuff we've done in the past year to help organizations building today's and the next generation of smarter products and systems, with particular focus on providing solutions for systems engineers and embedded software developers. You can catch the absolute latest news on our recent launch webpages. That session included a whistle-stop tour of the developments in requirements management for complex systems with Rational DOORS 9.4 and our plans for DOORS Next Generation. Whistle-stop because we also had so much news to get through in architecture & design, planning, change & configuration management and quality management, as well as industry specific solutions for A&D, automotive, medical devices and electronic design. And because on the following day at IBM's Southbank facility we had a whole day dedicated to topics related to DOORS.
At the DOORS customer day we had attendees from across several industry sectors including transportation, aerospace & defense, banking & mail services. The day kicked off with Morgan Brown presenting the latest on IBM's requirements management and DOORS strategy. Morgan told us how the DOORS 9.x series is and will continue to be developed and enhanced to meet the needs of the large install base, in parallel with the introduction of DOORS Next Generation (DOORS NG). DOORS NG is planned to take the best paradigms for managing structured requirements from DOORS 9.x and marry those with the requirements management and team collaboration capabilities that have been developed on the Jazz collaborative lifecycle management platform over the last 4 or so years (and are in use in the form of Rational Requirements Composer). The development of DOORS NG is out in the open on jazz.net where milestone builds can be downloaded, discussions held, defects/enhancements raised and plans viewed. DOORS NG has gone through four beta releases and is expected to be released in late November. Morgan explained that in its first release, DOORS NG is not intended to replace the DOORS 9.x product line, but it is expected that existing DOORS customers will try out DOORS NG on pilot and new projects, and will use the interoperability capabilities of the ReqIF data exchange and cross-tool traceability linking to exchange and/or link data between DOORS 9.x and DOORS NG. DOORS NG will also appeal to those looking for a requirements management tool that is on an integrated platform with design, test management and task/change management capabilities. Morgan reminded the audience of an IBM statement released earlier this year that existing DOORS customers with active support & subscription would be entitled to use both DOORS 9 and next generation capabilities when they become available. 
This was well received by the attendees since it means that they can try out DOORS NG when it ships without the need for an additional purchase.
Of course a day of technology insights never goes past without some piece of tech throwing an unexpected spanner in the works. This time it was the projector and the next presenter's Apple Mac that refused to talk to each other, so instead of flowing into a demo of DOORS NG, next up was Neal Middlemore to tell us about the improved integration of requirements and quality management with DOORS 9.4 and Rational Quality Manager (RQM) 4.0. This release was a significant enhancement that brings the integration in line with IBM's strategy to support OSLC - Open Services for Lifecycle Collaboration. OSLC is a new approach to tool integration that is open and vendor neutral. What's really different about OSLC is that data no longer needs to be copied or synchronized between tools in order to create cross-tool or cross-discipline visibility or relationships. So now quality professionals working in RQM can see requirements in DOORS and create links between test cases (and now, because some organizations require it, test steps) and the requirements they are validating; and requirements professionals in DOORS can see linked test case information, including test results, without the need for either to leave the comfort of their familiar tool or for data to be copied between the two tools. Neal demonstrated the value of the integration to requirements & quality professionals and showed how RQM can be used to manage manual testing or hook up with a number of IBM and partner solutions for various forms of test automation. You can also see a demo of the DOORS - RQM integration on YouTube.
So, technical issue solved, it was back to Jon Walton to give a demo of DOORS Next Generation using the Beta 4 release. Jon spent most of his time in the web client, highlighting the support for key DOORS paradigms such as hierarchical structured requirements documents, and showed off the plethora of new capabilities provided by the Jazz platform such as database-wide requirement reuse, a graphical traceability view, requirements definition techniques (use case diagrams, storyboards), cross-discipline dashboards (containing requirements project info mashed up with info from design, quality and task management) and task management. Jon also showed the desktop client of DOORS NG, which will look very familiar to existing DOORS users, with some twists (reuse of requirements across documents, for one) - the desktop client will primarily be for users who need to do extensive editing of large requirements documents. If you're currently using DOORS 9.x, this YouTube video gives a quick preview of DOORS NG and how it's both similar to and different from DOORS 9.x. Watch this space for more to come on DOORS NG later this month.
Back to the earlier lifecycle integration theme started by Neal, next to present was Steve Rooks on how to use DOORS with IBM's solution for model-based systems engineering and model-driven embedded software development, Rational Rhapsody, to link requirements and design activities. Rational Rhapsody enables elaboration of requirements and construction of systems and software architectures using SysML and UML. Rhapsody Design Manager provides an additional level of design collaboration capabilities. Models can be published to and/or stored and managed in a central repository, making them more easily accessible to a wider set of stakeholders so that designs can be better communicated and understood by all those involved in specifying, designing, building and validating a product or system. Rhapsody Design Manager uses OSLC to facilitate linking of design elements to other lifecycle artifacts - requirements, test cases, work items, etc. As with the DOORS-RQM scenario described above, a systems engineer or software architect working in Rhapsody can see requirements in DOORS and easily create links between requirements and design model elements. Requirements and requirement links can even be included in model diagrams. And of course a DOORS user can see links to design elements without leaving DOORS, or can navigate into Rhapsody Design Manager to participate in design reviews. You can read more about linking requirements and design and the DOORS-Rhapsody Design Manager integration in my recent post 'The House That Paul Built', which covers a recent webcast on the topic.
After lunch, an IBM business partner Kovair was invited to present on how their Kovair Omnibus solution provides bridges, synchronization and workflow support across multiple tools from multiple vendors. It's a common situation to find yourself trying to enact processes and workflows when you have a diverse set of tools. Kovair talked about their support for OSLC to be able to widen the number of tools they can help link together, but also highlighted scenarios where you would still want to copy or transform data between tools - it's not a choice of Link or Sync, it's Link and Sync as appropriate.
The next session was presented by Paul Fechtelkotter, market manager for energy & utilities at IBM Rational. Paul gave a really interesting presentation on the challenges of complex systems development for nuclear power plants and how the nuclear industry is now adopting systems engineering best practices starting with requirements management to enable them to get better change management, traceability, impact analysis and compliance support. You can learn more about how IBM Rational is helping the nuclear industry on our dedicated web page.
Unfortunately I had to leave after Paul's session and didn't catch the remainder of the afternoon, but as you can see it was a day packed full of information. I hope you find my summary and links for more information useful. If you have any questions or comments on any of the topics I've covered here or indeed anything on IBM's requirements management strategy, Rational DOORS and the lifecycle integrations, please don't be afraid to ask by using the blog comments facility.
Eric has worked in the software development industry for over 20 years and is co-author of UML for Database Design and UML for Mere Mortals, both published by Addison-Wesley. Eric is currently responsible for capabilities marketing of Rational's application lifecycle management solutions, including Agile Software Delivery, Quality & Test Management, Requirements Management and Collaborative Lifecycle Management. He rejoined IBM in 2008 as the team leader for InfoSphere Optim Solutions and later was responsible for Information Governance Solutions. Prior to rejoining IBM, he worked for Ivar Jacobson Consulting as VP of Sales and Marketing. Before joining Ivar Jacobson, he was director of product marketing for CAST Software. In his earlier time at IBM, Eric held several roles within the Rational Software group, including program director for business, industry and technical solutions, product manager for Rational Rose and team market manager for the Rational desktop products. He also spent several years with Logic Works Inc. (acquired by Platinum Technologies and CA) as product manager for ERwin.
As I think about IT today, I see in some ways a rebirth of the importance of architecture and requirements. We are in an era of “ANY” -- meaning that applications and data can be accessed from anywhere, by anyone, and at any time.
Looking back at the applications of yesteryear (two or three years ago), we didn’t expect much from the web or mobile-based applications. We could view, run some reports or do some basic tasks, but to do the real work, we needed to go to the fat-client. Now, in today’s era of any, the user interface may look different, but the capabilities had better be the same since we expect near full capabilities no matter our device or interface.
This puts a newfound set of requirements on applications and their development, and is making modeling and requirements (analysis and design) relevant again, but with a new twist – AGILITY. It is no longer a question of “what platform am I developing for” – the question is how quickly we can get it up and running on the latest version of Apple, Android, HTML 5 and whatever other platforms our clients expect the application to run on … and it had better run on all of the latest versions, with no delays, when updated operating systems come out.
And the question that I often receive now is “can I be agile and meet these needs at the same time?” The plain answer is yes, you can. However, agility doesn’t mean you can ignore requirements and design. I am not talking about write-once, run-anywhere, but rather about understanding the true requirements so that the various development teams can articulate them in code, brought to life as features for the users as they expect to see them. Users are looking for the application to be specific to their hardware/OS (iPad/AppleOS, Droid/Android…) as the hardware has become the platform for not just running the application, but for its expected look, feel and usability too. This often means different developers for different deployment platforms, certainly at the user interface level.
Designing applications requires that we are prepared. Architectures must be solidified and communicated. Requirements must be consistent and shared. We must model architectures so that developers can build to the designs and not recreate their own, wasting time and resources, and we must share those designs across the team.
Does this get in the way of agility? NO, it will speed agility. By sharing designs and assigning tasks based on architecture needs, we can speed time to market and our ability to deliver high quality software. In the era of any, we may have multiple teams working on the same front-end capabilities for different platforms even though the back-end is the same. The more they can share, the faster they can deploy; and with the right requirements from users, the more satisfied those users will be. We see people changing their desired platform as employers, vendors and suppliers change requirements, so we need to be prepared for the customer who is using an iPad today to be using an Android device tomorrow, with the same requirements on the application. Just look at how the world of Blackberry has evolved.
So, as you think about your next project, don’t skimp on requirements and architectures or you may be limiting your agility in the future rather than speeding your time to satisfied clients.
Nowadays, software is present everywhere and software projects are becoming more complex in terms of scope, time and cost. With this change comes an increase in the potential failure rate of software projects. How can these potential failures be avoided? While a guarantee may not be possible, adequate investment in managing the risk of failure goes a long way. A typical textbook definition of software risk management is the identification of risks, the analysis of identified risks and the establishment of plans to address those risks. The important goal of risk management is to avoid the occurrence of such risks. As with requirements management, risk management needs to start early in the development life cycle.
ISO/IEC 16085:2006 defines risk as a combination of the probability of an event and its consequence. What are the major sources of risks in a software project? An obvious answer to that question today would be the prevailing uncertainty added by time and budget pressures. Inaccurate requirements capture is another important reason for increased risks in the later stages of the life cycle. Boehm has done some phenomenal work in managing risks in software projects. He essentially identifies ten risk aspects: personnel shortfalls; unrealistic schedules and budgets; development risks (building the wrong functions, properties or user interfaces); adding unnecessary features; continuing requirements changes; shortfalls in externally furnished components and externally performed tasks; performance shortfalls; and technological strains.
So how do you best manage the risks? Boehm divides the first level of activities into assessment and control. Assessment essentially consists of identifying the risks, analyzing the identified risks and finally prioritizing them. The control aspect deals with planning, resolution of identified risks and monitoring. If you consider the Top 10 items he has identified, requirements mismatches, requirements changes and architecture performance & quality are among the top. Various techniques are discussed in the risk management literature, which is beyond the scope of this blog post; these range from basic ones like maintaining a risk register, to decision tree analysis, to risk exposure profiling. Murray Cantor, a Distinguished Engineer at IBM, regularly writes about risks in his blog here.
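Risk exposure profiling follows directly from the ISO/IEC definition quoted earlier: exposure is probability times consequence, and a register sorted by exposure is a prioritized register. A back-of-the-envelope sketch (the entries below are invented for illustration, not from any real project):

```python
# Sketch of risk exposure prioritization: exposure = probability x loss.
# Register entries are illustrative; "loss" is in arbitrary cost units.
register = [
    {"risk": "continuing requirements changes", "probability": 0.6, "loss": 50},
    {"risk": "unrealistic schedule",            "probability": 0.4, "loss": 80},
    {"risk": "personnel shortfalls",            "probability": 0.2, "loss": 90},
]

# Compute exposure for each entry in the register.
for entry in register:
    entry["exposure"] = entry["probability"] * entry["loss"]

# Prioritize: highest exposure first.
for entry in sorted(register, key=lambda e: e["exposure"], reverse=True):
    print(f'{entry["risk"]}: exposure = {entry["exposure"]:.0f}')
# unrealistic schedule: exposure = 32
# continuing requirements changes: exposure = 30
# personnel shortfalls: exposure = 18
```

Note how a lower-probability risk can still top the list when its potential loss is large, which is exactly why exposure, not probability alone, drives prioritization.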
What are some generic strategies for managing risks? The predominant one is to buy more information; for example, early in the development cycle you can use prototyping to make sure you and your client share the same understanding. This also helps reveal the possible root causes of risks. Other options are to avoid the risk by de-scoping requirements, to transfer it (for example, outsourcing the component to an expert vendor or sub-contractor), to have mitigation plans, or, as the last resort, to accept the risk and have a Plan B. ISO 31000:2009, a relatively new risk management standard introduced in 2009, provides a generic framework for a risk management process that a team can consider implementing.
How can tools help manage risks? Risk includes both opportunities and threats; that is, a risk can have both a positive and a negative effect. Tools help implement an integrated risk process that maximizes value creation, resulting in faster time to market and improved productivity, while avoiding the threats of cost and time overruns and project closures. Tools can help significantly in two ways: conducting the qualitative and quantitative risk analysis activities, and actually implementing the outcomes for managing risks. Check this case study of Chubb Insurance, which manages its risk effectively using IBM Rational Focal Point. And finally, here is a developerWorks article on how to calculate your return on investment for software and systems.
You've bought the plot of land for your dream home. You have your list of requirements - 4 bedrooms, 3 bathrooms, spacious kitchen, 2 living rooms, 2 garages, landscaped gardens, etc. Would you be happy to simply hand that list to the builders and let them start work? Unlikely, I think. Typically, you call in an architect, who can take your quantitative requirements and qualitative desires and produce a blueprint, the architectural design that incorporates your wishes where feasible and adds creative flourish based on the architect's knowledge of house design. The blueprint enables you and the builders to have a much clearer picture of the desired end result than that original list of requirements. And it affords you the opportunity to influence the architecture, and for the builders to question and look at feasibility & cost options, before the foundations are dug and the first bricks are laid.
The same applies in product development. Systems engineers who are responsible for the holistic product specification and design don't just use textual requirements lists to capture the problem domain and describe the proposed solution. They analyze the requirements, identifying integrated scenarios, and often depict those using modeling languages such as UML or SysML. These modeled scenarios are easier to discuss and review with all stakeholders, and as the systems engineer evolves the proposed architecture (also in the same modeling language) they can run the scenarios against the architecture in model simulations to find inconsistencies or gaps in the requirements and flaws in the design, long before any software is coded, circuit boards are soldered or metal is welded.
So what value are our textual requirements lists - should we throw them away in favor of models? Well, not everything can be expressed in the model, and not everyone involved in a development effort may be using models. Going back to the house building analogy, there are contracts, numerous standards and regulations to be adhered to, and simply details that would make the blueprints unreadable. The various contractors (and I know from recent experience that sub-contracting is the name of the game in house building these days!) involved in the building process need to ensure that they can meet the contractual and regulatory demands while delivering against the architecture. Again this is the same in product development, except in many cases, particularly safety-critical systems, traceability and demonstration of conformance to requirements and compliance to standards & regulations are demanded. This requires the ability to integrate requirements and modeling workflows, easily link requirements and design elements, and report on that linked information.
The need and solutions for this capability are nothing new. Integrations between requirements management and modeling tools have existed for many years (I think I started using such an integration in the early 90's and I'm sure they preceded that time). But I know from first-hand experience of using and indeed writing such integrations that they've not always been optimal in the way integration is performed and in the workflow that is supported. Typically it's meant synchronizing (i.e. copying) data between tools in order to create the traceability links in one of the tools. This brings up all sorts of issues like 'which tool is the master?', 'am I looking at the latest data?', 'what happens when information is deleted?', etc.
With Open Services for Lifecycle Collaboration (OSLC) we now have a much better way to link data across product development and operations tools, even when the tools may be from different vendors, open source, or in-house. OSLC has learnt from the principles of the World Wide Web and enables tool data to be shared and linked where it resides (called a ‘Linked Data’ approach). OSLC provides a common vocabulary for ‘resources’ in particular domains, i.e. what a requirement, test case, design element, change request, work item, etc. looks like, so that regardless of tool, technology, or vendor, tools implementing OSLC specifications can share and link data.
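To make the ‘common vocabulary’ idea concrete, here is a minimal sketch that parses an OSLC Requirement expressed in RDF/XML and pulls out its title and a validatedBy link. The resource URIs are hypothetical and the representation is heavily trimmed compared to what a real server would return; the namespaces and property names are the standard OSLC/Dublin Core ones.

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical OSLC Requirement resource in RDF/XML
RDF_XML = """<rdf:RDF
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:oslc_rm="http://open-services.net/ns/rm#">
  <oslc_rm:Requirement rdf:about="https://example.com/rm/requirements/42">
    <dcterms:title>The brakes shall engage within 100 ms</dcterms:title>
    <oslc_rm:validatedBy rdf:resource="https://example.com/qm/testcases/7"/>
  </oslc_rm:Requirement>
</rdf:RDF>"""

NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dcterms": "http://purl.org/dc/terms/",
    "oslc_rm": "http://open-services.net/ns/rm#",
}

root = ET.fromstring(RDF_XML)
req = root.find("oslc_rm:Requirement", NS)
title = req.find("dcterms:title", NS).text
# The link is just a URI: the test case lives in another tool and stays there
validated_by = req.find("oslc_rm:validatedBy", NS).get(
    "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}resource")

print(title)
print(validated_by)
```

The key point of the Linked Data approach is visible in the last lines: the requirement does not contain a copy of the test case, only a link to where it lives.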
With Rational DOORS 9.4 and Rational Rhapsody 8.0 with Design Manager 4.0, IBM is using OSLC to provide a simplified workflow for linking requirements analysis and design. On September 20, Paul Urban (if you've been wondering about this blog post title, now you know which Paul I'm speaking of), Market Manager for IBM Rational Rhapsody, presented this simplified workflow and its benefits on an IEEE Spectrum webcast sponsored by IBM. You can watch and listen to the replay at your leisure here. I hope you enjoy it - please let Paul and me know what you think by leaving feedback on this blog post.
The importance of communication and collaboration in developing and managing good requirements was discussed in our earlier post on How to enable effective requirements communication and collaboration. In this guest blog post, Melissa Robinson - a Senior Technical Specialist at IBM - writes about how Rational DOORS addresses this aspect with Discussions.
Melissa started her career at Telelogic, enabling product management with technical support around requirements management. She spent three years at Telelogic supporting clients getting started with requirements management. After IBM acquired Telelogic in 2008, Melissa transitioned roles to support clients with Enterprise Architecture initiatives. She received the Carnegie Mellon certification in Enterprise Architecture in 2008 and is TOGAF certified. Melissa now supports clients getting started with evaluating and implementing both requirements management and enterprise architecture solutions.
Note: Please click on the screenshots for a better view
Why did we make this decision? Who made this decision? Who approved this requirement?
These are some of the questions we can help answer with effective collaboration messaging in DOORS. Collaboration messaging is now enabled in DOORS and DOORS Web Access (DWA) with the addition of DOORS Discussions. Discussions let users add comments to requirement objects or requirement modules - even to baselined or read-only requirements - so they offer a way to hold a conversation directly on a requirement and really break the communication barrier. Discussions can be created in DOORS or DWA and viewed in both. Both the DWA Editor and DWA Reviewer roles can contribute to Discussions. Because Discussions capture comments, you can later review this ancillary information about your requirements, and everyone involved can contribute to a full understanding of them.
Here is a simple scenario for using DOORS Discussions. A DWA Reviewer user creates a Discussion on a requirement. A DOORS user then reviews this comment and contributes a comment on the requirement. The DWA user reviews the latest comment and closes the Discussion.
A DOORS user, Susan, reviews the current Discussion created by a DWA Reviewer user, Kavita. Susan can open the requirements module with a pre-created Discussion view to review the Discussions. Below, Susan reviews the Discussion on requirement AMR-STK-66.
Susan can contribute a comment to the open Discussion.
Kavita reviews the new Discussion comment in DWA. Notice that Kavita is a Reviewer in DWA. As a Reviewer, she can create and add comments to Discussions. Kavita can also close Discussions that she started. Later, Kavita can contribute another comment to the open Discussion.
As the person who first opened the Discussion, Kavita can close it. Later, in DOORS, Susan can review the latest status of the Discussion using the Discussion Thread view. With the Database Manager role, Susan can choose to re-open the closed Discussion at any time.
Discussions open up the communication thread between several different types of DOORS users. Discussions allow requirements reviewers to exchange views and comments about the content of a requirements module or the content of a requirement object in a module.
We believe this post gave you a sneak preview of how DOORS Discussions help various stakeholders collaborate and communicate effectively during requirements management. Feel free to contact melissarobinson[at]us.ibm.com if you have any queries about the topic. Melissa will be discussing the topic in detail in an upcoming webcast on October 5, 2012. Don't miss the opportunity to watch the action live. Register now @ http://bit.ly/DOORS_Discussions
In this post Jim Hays writes about the various options available in testing how the requirements are met in IBM Rational DOORS.
Jim works as a Senior Systems Engineer at IBM. He started his career in 1982 working for software providers. Jim’s career history is an interesting one: he hasn’t worked for many software companies over the course of his career, yet most of them were eventually acquired. He worked for Applied Data Research (7 years), which was eventually bought by Computer Associates. He then moved to Goal Systems, which was bought by Legent (6 years); Legent, in turn, was bought by Computer Associates. He then spent almost 10 years at Sterling Commerce, which was recently bought by IBM. After Sterling Commerce he joined Telelogic, where he got into the ALM market; Telelogic was eventually purchased by IBM Rational (7 years).
I’ve been involved with DOORS for over 7 years, and absolutely love the tool. My job at IBM is to technically support our sales team and our customers, not only for DOORS but for many other solutions we offer. I have had a lot of experience working with our DOORS customers to understand how they use DOORS, and even though DOORS is a requirements management solution, customers put other types of information into it besides requirements. One example is putting test data into a DOORS module to enable easy linking between requirements and their related validation/verification results.
Note: Please click on the screenshots for a better view
Provided below is an example showing that. In this DOORS View we see traceability between 4 modules:
User Requirements>Functional Requirements>Functional Test Plan>Functional Test Cases
For years DOORS has included a capability called the Test Tracking Toolkit. It lets you capture test results in a DOORS module by duplicating attributes: each new test run gets its own copy of the result attributes so that every run's results are stored uniquely. Over time this creates a lot of attributes. Both of these usages - capturing tests and capturing test results - enable quite easy linking between DOORS requirements and their related tests and test results. Below are the options available using the Test Tracking Toolkit.
(Read clockwise from top left)
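A rough way to picture the toolkit's per-run attribute duplication (a sketch in Python, not DOORS DXL, and with illustrative attribute names rather than the toolkit's actual naming scheme):

```python
# Sketch of how per-run result attributes accumulate on a test-case object
test_case = {"ID": "TC-12", "Description": "Verify login timeout"}

def record_run(obj, run_number, result, tester):
    """Duplicate the result attributes for a new run, keeping earlier runs intact."""
    obj[f"Run {run_number} Result"] = result
    obj[f"Run {run_number} Tester"] = tester

record_run(test_case, 1, "Fail", "Susan")
record_run(test_case, 2, "Pass", "Kavita")

# Two runs -> four extra attributes; ten runs would mean twenty
run_attrs = [k for k in test_case if k.startswith("Run ")]
print(run_attrs)
```

This illustrates why the approach works well for a modest number of runs but grows unwieldy as the run count climbs, which is the trade-off discussed below.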
So what are the positives and negatives of these usages of DOORS modules to capture test, validation, and/or verification information? The positive is the ability to store and easily link requirements to testing results using standard DOORS linking. If requirements change that are linked to test-based modules, standard DOORS “suspect links” notify the folks maintaining the test-based DOORS modules of the requirement change, so they can see if they need to update their test plan or retest the test case. One question is who maintains and updates the DOORS test-based modules. Are the actual testers going into DOORS to update the test results? Or are the testers capturing results in a spreadsheet and handing it to a DOORS user to enter? In my opinion, either of these scenarios is fine for projects that only do manual testing and don’t have a lot of testing (i.e. test runs) to perform. Another issue is that the actual testing can occur in different environments: for example, if you built a web-based application that runs on different operating systems, you would need to test all of those configurations. If you are building an embedded software device or a software application, you might want not just manual testing, but automated execution and capturing of results via automated testing.
In my opinion, a solution like DOORS is great for requirements management, whereas the folks in charge of quality and/or performing the testing should have a solution suited to the role they play in a project: test management. So the final option for testing I will discuss is how DOORS (managing requirements) can integrate with IBM’s test management solution, Rational Quality Manager (RQM). RQM provides a nice environment that supports both manual and automated testing.
Provided below are an example “dashboard” that users can configure based on what they would like to see, and an example of an RQM Test Plan.
Provided below is an example of a DOORS View showing the requirements from DOORS that are known to this particular RQM project. Requirements-driven testing enables requirements from DOORS to be used to automatically generate test cases and build specific links between the DOORS requirements and those test cases. The screenshot on the right shows the result of that automated link creation: traceability between test cases and DOORS requirements, which could also include development software assets.
The integration between DOORS and RQM uses OSLC (Open Services for Lifecycle Collaboration). Below is a “rich hover” showing details about linked items without actually having to navigate the link. One can also see the results of the test execution (pass or fail).
As the testers do their work, data from RQM can be mapped into DOORS-based attributes on the requirements. Below is an example showing the traceability between DOORS requirements and the testing side of things in DOORS; I can see that the latest test case run passed. Also provided below are screenshots showing coverage analysis relating the DOORS requirements to the Test Plan and associated test cases.
Finally, the screenshot below shows the test case execution results that were recorded via RQM. These are mapped to DOORS attributes via the bi-directional integration, and regular DOORS sorting and filtering can be used - for example, to see which test cases failed or passed.
Hope the blog post was useful. Feel free to contact Jim @ haysji[at]us.ibm.com if you have any queries regarding the options for testing in DOORS.
Also, Jim will be hosting a webinar on the same topic in which he will go in depth into the ideas presented in this post about the options for testing in conjunction with DOORS. Register for the session here - http://bit.ly/DOORS-Testing_Options
There is no doubt that the evolution of computing has moved into the era of the mobile device and any business that wants to remain competitive in an increasingly difficult economic environment needs to embrace this with the care and attention it deserves.
With over a million devices being activated every single day it is predicted that by 2013 mobile phones will overtake the PC as the most common web access device worldwide!
Companies that simply try to tweak their existing web sites to run in mobile browsers are missing a trick, as users expect very sleek interfaces that make use of their devices' capabilities such as geo-location and the camera. With this in mind, the perceived quality of an application comes from both its functionality and, perhaps more importantly, its design. The design of the application's user interface is also critical when trying to improve brand loyalty, and therefore businesses are keen to see their brand image extended to mobile devices where they can reach a much wider audience.
So where does Requirements Management fit in?
Well, a good requirements management process that is incorporated into the wider development life cycle will allow the business to communicate exactly what is required of the mobile application; enabling the development team to be clear on what needs to be created, and for testers to begin writing test cases earlier in the lifecycle. With the need for good user interface design it is important that requirements are not purely textual and so with tools like Rational Requirements Composer, Business Analysts can model Use Cases, Business Processes and also visualize the user interface through UI Sketches, Screenflows and Storyboards. This means the business can be very clear on what the expected result should be and remove any unnecessary ambiguity that only slows down development and ultimately prevents applications going to market quickly.
One significant trend in the development of mobile apps is the adoption of Agile methodologies such as Scrum, which fits in well with the short timescales and rapid change and release management that mobile apps have. This makes it even more important for the Requirements Management process to also be agile and allow greater collaboration not only within the Business Analysts team, but also with the other teams involved such as development and testing. A web based requirements management tool like RRC encourages wider engagement with stakeholders and also provides dashboards with live information, collaborative reviews and reporting to promote visibility and allow decisions to be made quickly and effectively.
Traceability is one of the cornerstones of good Requirements Management, because without it you cannot determine whether the resulting product has actually satisfied the original requirements outlined by the business. RRC is part of the wider solution called Collaborative Lifecycle Management and allows requirements to be linked to the resulting development work items and associated test plans, test cases, and defects. This means that from any given requirement you can see exactly which development task is going to implement it, which test case is going to test it, and what defects were found in relation to it.
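The requirement-to-work-item-to-test-to-defect chain can be pictured as a small link graph. The sketch below uses invented artifact IDs and plain dictionaries to walk the links downstream from one requirement:

```python
# Hypothetical lifecycle links: requirement -> implementing work items,
# requirement -> validating test cases, test case -> defects found
implemented_by = {"REQ-101": ["TASK-7"]}
validated_by   = {"REQ-101": ["TC-33"]}
defects_for    = {"TC-33": ["DEF-5", "DEF-9"]}

def trace(requirement):
    """Collect everything downstream of a requirement across the lifecycle."""
    tests = validated_by.get(requirement, [])
    return {
        "work_items": implemented_by.get(requirement, []),
        "test_cases": tests,
        "defects": [d for tc in tests for d in defects_for.get(tc, [])],
    }

print(trace("REQ-101"))
```

An empty "work_items" or "test_cases" list in such a traversal is exactly the coverage gap that traceability reporting is meant to expose.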
In summary, mobile application development is an exciting and important part of any business plan and due to its inherent complexity you really need the right tools to make it a success.
I am sure you have seen a graphic similar to this depicting the communication gap between stakeholders in a project and its consequences. Today projects are getting increasingly complex, the teams involved are often distributed, and delivery timelines are shrinking. Getting to market faster has become the major contributing factor to success for most companies, and being unable to finish development on time and on budget - and thus missing opportunities - is a vexing problem for organizations.
Clearly articulating stakeholder business objectives and requirements for application and product development gives the much needed head start to optimize end results; however, tackling the challenge of managing effective communication between development teams and providing a mutually supportive collaborative environment helps ensure successful project completion. Studies conducted by IBM have shown an improvement of 15-35% in team productivity with the help of effective collaboration. There are two aspects to communication: how to engage stakeholders, and how to manage internal team communication.
Managing stakeholder communication
It’s imperative to engage stakeholders early on to get the requirements right. If you are in an agile project environment, consistent stakeholder involvement becomes even more important and challenging. Providing stakeholders access to the project environment, with appropriate levels of access, enables this. Thorough requirements definition practices involve understanding as many specifics as possible and should start at the very beginning of the project.
Managing inter-team and stakeholder communication
Unifying stakeholders and the project team helps ensure that project goals are met and averts the potential impact that being late to market can have on the bottom line. Up-front visual and textual requirements elicitation techniques to build stakeholder consensus, coupled with full requirements traceability across the life cycle, help cut risk and the cost of rework from unclear, ambiguous, or changing requirements. In the end, this can improve time to value and quality. Ralph R. Young (Effective Requirements Practices) identifies three root causes of requirements-related issues: the wide disparity between the requirements customers state and the real requirements they expect; ineffective requirements practices in the supply chain; and the lack of joint customer/supplier responsibility for project success.
While personal communication tactics like brainstorming meetings and knowledge-sharing sessions can add immense value, today's globally distributed environment requires more closely knit day-to-day solutions for communication. The requirements tool used should ideally be able to address a wide set of requirements information beyond the requirements themselves: business context, aspirations, considerations, and business and technical constraints. Capabilities like a shared repository, a simultaneous view of what team members are working on, open issues, group conversations about requirements, and online reviews & approvals can significantly improve communication in the team and increase productivity. Linking requirements artifacts to related information in a repository, and embedding artifacts into documents and user interface sketches (empowering teams to rapidly refine ideas), have significant advantages. This broad and flexible approach also enables teams anywhere in the world to collaborate, clarify, and achieve consensus quickly about the requirements as they develop business-driven solutions.
Clearly, with geographically distributed teams, teamwork has new dimensions. If an organization gets collaboration right, it can potentially drive higher levels of productivity and innovation.
In my career I’ve been deeply involved with both modeling and requirements management disciplines and tools, so it always intrigues me when I hear debates over whether largely textual based (sometimes referred to as ‘traditional’ or ‘document-based’) or model-based approaches to defining and managing requirements are the right way to go.
We’ve all heard the argument that a picture paints a thousand words, but I’ve always vividly remembered something I heard at a conference some years ago which was “I’d have taken a 1000 words over this one unreadable diagram.”
My belief is that it is not an either-or decision. You need both. Models can add clarity to requirements specifications and can bring together a more holistic understanding of what’s expressed in the requirements. Models can be walked through with stakeholders, and with the right language and tools (like SysML or UML in IBM Rational Rhapsody), they can even be run to validate that what is captured in the model is correct, consistent, and complete. But what if you have contractual requirements to manage, documents of regulations or standards to comply with, or complex performance or availability constraints? You don’t want to clutter your model with so much detail that it becomes unusable.
My preference is for a combination of textual requirements and models, which can be described by the ‘Systems Engineering Club Sandwich’ (references 1 & 2): textual requirements form the layers of bread - maybe a bit dry on their own - and are supplemented by models that form the layers of filling - richer and more expressive - together forming a tasty combination that helps explore and elaborate requirements, perform decomposition and allocation, and maintain traceability. I recently got together with my colleague Paul Urban to record a 30-minute webcast entitled ‘The Tasty Way to Tackle Complexity - The Systems Engineering Club Sandwich of Requirements & Models’ where we look at some engineering challenges, where requirements work goes wrong, how the club sandwich approach works, and how to use requirements and models together effectively. So if this hors d'oeuvre has made you hungry for more, please take a look. Paul and I are really interested to hear what you think.
References:
1. "The Systems Engineering Sandwich: Combining Requirements, Models and Design", Jeremy Dick, Jonathon Chard, INCOSE International Symposium, Toulouse, July 2004.
2. Requirements Engineering, Hull, Jackson & Dick, Springer 2004.
I was lucky enough last week to travel to the INCOSE (International Council on Systems Engineering) International Symposium 2012 near Rome, Italy - an excellent opportunity to meet the systems engineering community and hear about their interests and concerns. We had lots of traffic to the very stylish IBM booth, where we talked about the IBM Rational solutions for systems engineering and the latest from IBM Research on tool interoperability and design optimization & trade-off. I’d like to claim the traffic was due to my presence, but in fact there was lots of excitement and interest in the must-have giveaway of the conference, the IBM Limited Edition of the Systems Engineering for Dummies book (if you weren’t there and don’t have a copy, you can download a PDF version).
Being at the INCOSE event reminded me of the very active and interesting discussion I recently provoked on the INCOSE LinkedIn group by posting the link to my previous blog post ‘Traceability – How Much is Enough?’. It’s a great read, with some very provocative statements that traceability is not useful at all and is the root cause of failure on projects that overrun and overspend, versus those who say it’s absolutely vital on safety-critical systems or where the project is contract-driven. In the end I think some consensus was reached between these two camps: ‘just enough’ traceability to keep a project on track, provide customer/market need context to engineers, facilitate impact analysis, and (if needed) meet industry standards and regulations, is sufficient. Any more is excessive and wasteful and likely to bog down progress towards delivering innovative products and systems.
During a quiet time at the IBM booth, I also had a chance to chat with my colleague Brian Nolan (marketing manager for the aerospace & defense industry at IBM Rational) about effective traceability, since Brian is very interested in this topic and has presented on a Dr. Dobb's webcast on ‘3 Ways to Improve Traceability and Impact Analysis’. Brian believes in what I would describe as ‘traceability by design’, meaning that traceability is automatically established while you decompose your system design (for example, use case to use case realization to sequence diagram and so on). This discussion also reminded me of what another colleague, Greg Gorman (program director for IBM systems and software engineering solutions and the INCOSE Corporate Advisory Board member from IBM), described several years ago as ‘link while you think’, meaning traceability is created by the tools while you are performing requirements decomposition, design, and development, rather than as an overhead activity afterwards.
I think we’ve now moved some way beyond ‘link while you think’. An information model with ‘just enough’ traceability for your project needs is still essential to keep traceability from spiraling out of control, but new approaches such as Linked Lifecycle Data from the OSLC (Open Services for Lifecycle Collaboration) community, and tools that recognize implicit traceability, provide new ways to visualize lifecycle traceability and perform effective impact analysis. With these, we can make traceability work for us, helping engineering become more agile while staying within cost and schedule and producing innovative, higher quality products and systems.
We had an interesting webinar with Mary Gorman of EBG Consulting on whether business analysis is required in agile projects. Mary discussed many concepts and made her case for how business analysis is essential to agile success. If you didn't attend the webinar, an on-demand replay is available here. Also read our interview with Mary on this topic here.
Mary shared with us five agile business analysis actualities that are key to agile project success. Two of them stuck in my mind - delivering valued products and product partnerships. We often forget to ask hard questions like: to whom are our products most valuable, and what is their potential? Mary Gorman shared a simple value equation with us:
Value = Increased Revenue + Avoided Cost + Improved Service, weighed against Risk
If we are to provide value to our customers, we have to concentrate on each of the factors above. Increasing the revenue stream can be through either creating new streams or protecting the existing ones. Software tools can play a role in avoiding cost through increasing operational efficiency to improve time to market. Improving the service could mean increased usability and accuracy. Coupling these factors with risk and dependencies and looking at the balance is very critical for providing a valuable product.
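Treating the equation as a rough scoring heuristic (my interpretation for illustration, not Mary's formal model, and with invented numbers), one could weigh the positive factors against risk like this:

```python
# Illustrative value score: the three positive factors weighed against risk.
# In practice each factor needs its own estimation model and common units.
def value_score(increased_revenue, avoided_cost, improved_service, risk):
    return increased_revenue + avoided_cost + improved_service - risk

# A feature worth 100 in new revenue, saving 40 in cost, adding 20 in
# service improvement, but carrying 30 of risk-adjusted downside:
print(value_score(100, 40, 20, 30))
```

Such a score is only useful for comparing candidate features against one another, which is exactly the balancing act described above.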
The role of partnership and collaboration is becoming increasingly important, and Mary stressed this again with her view that analysis is everyone’s job and a continuous process. At IBM we believe that a collaborative requirements definition & management capability, linked with other critical lifecycle disciplines such as test management, is essential for success in agile business analysis, particularly where you may have large, distributed teams and/or regulated environments.
The webcast also discussed other agile business realities like the importance of discovering product needs just in time, structured conversations and confirmations through examples, prototypes and documentation. Watch the complete webcast here.
A lot has been said about CLM 2012 and Rational Requirements Composer 4.0 since their release in June 2012. RRC 4.0 is now available for download from jazz.net. I am sure most of you have already downloaded it and started playing around with it. In this post, I will briefly touch on the enhancements we made in RRC 4.0 and also provide resource links to downloads, demos, and other tutorials.
Essentially, the improvements in RRC 4.0 have been in the following areas:
Combined Definition and Management - Clear and centralized requirements management eliminates redundancy and enables real-time development. In RRC 4.0 we have moved the sketching and diagramming capabilities to the Web client to allow creation of rich text, diagrams, visual use cases, storyboards, and rich-text-based requirements.
Lifecycle Solutions & Collaboration - Improved Word/Excel migration and improved collaboration capabilities.
Improved Planning & Visibility - Enhanced visibility using traceability across requirements, test and development. Visual and textual scenarios are now supported. Through the customized dashboard, users can see what is in progress and quickly navigate to information that is being traced or discussed at the present time. Data can also be organized in dynamic analysis views to help answer project questions and drive reports. In RRC 4.0 you can now use the RTC planning capabilities to help prioritize requirements before they are defined. This capability allows requirements to be directly aligned with priority and planning.
Support for ReqPro data migration has been improved significantly. Stay tuned for another post on this topic; meanwhile, I encourage you to watch some videos that explain this in detail. The major improvement in RRC 4.0 is in traceability: we have further expanded both requirements and lifecycle traceability capabilities. A graphical traceability analysis and usage tool has been included in the latest version that enables you to analyze requirements in detail with enhanced filtering options. Suspect link analysis has also been improved significantly. A suspicion profile can be set that allows you to select the requirement types you want to assess. This helps weed out false positives and discover changes across trace links among requirements and other lifecycle elements. Watch the demos provided below to see how this works.
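To make the suspect link idea concrete, here is a minimal illustrative sketch in Python. This is not RRC's actual implementation and all names are hypothetical; it simply shows the underlying idea: a link becomes suspect when its source requirement has been revised since the link was last reviewed, and a suspicion profile restricts which requirement types are assessed.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    rid: str
    rtype: str          # e.g. "stakeholder", "system", "software"
    revision: int = 1

@dataclass
class TraceLink:
    source: Requirement
    target: Requirement
    cleared_at: int = 1  # source revision seen when the link was last reviewed

def suspect_links(links, profile):
    """Return links whose source changed since the last review,
    considering only requirement types named in the suspicion profile."""
    return [l for l in links
            if l.source.rtype in profile
            and l.source.revision > l.cleared_at]

# Usage: an upstream change flags the link, unless the profile excludes its type.
hi = Requirement("REQ-1", "stakeholder")
lo = Requirement("REQ-2", "software")
link = TraceLink(source=hi, target=lo)
hi.revision += 1  # the upstream requirement changes
print([l.source.rid for l in suspect_links([link], {"stakeholder"})])  # ['REQ-1']
print(suspect_links([link], {"software"}))                             # []
```

Filtering by requirement type is what lets you suppress false positives: revisions to types you do not care about never raise suspicion in the first place.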
Some other feature level enhancements in RRC 4.0 are
As we mentioned in an earlier blog post, DOORS 9.4 and DOORS Web Access (DWA) 1.5 were released during Innovate 2012. This blog post provides insight into what's changed in this release of DOORS and some of the significant new features. I have also provided a few resources where you can learn more about this release.
DOORS–RQM Integration based on OSLC
The most significant changes in DOORS 9.4 are the improvements to OSLC-based integrations. A new integration based on OSLC has been provided for Rational Quality Manager (RQM). Let's see how it differs from the existing point-to-point integration (RQMI). Provided below is a simplified representation of how RQMI works for the DOORS-RQM integration, contrasted with the new OSLC-based integration.
As you can clearly see, the integration has been made much simpler in terms of software and storage, yet more powerful. The new integration provides a stable architecture for future enhancements and provides an automated migration. On the installation and configuration side, the new integration no longer requires the server and Java client components.
What does this mean to a typical DOORS user? It enables real-time lifecycle traceability to RQM test cases, either through the hover-over menu in DOORS that displays RQM artifacts or directly from RQM.
What does this mean to a typical RQM user? The real-time integration enables the RQM user to review and edit automatically created draft test cases (with the new requirement reconciliation wizard) based on new requirements, and trace them back to DOORS. Full test coverage while requirements are changing is enabled with features such as:
Automatic display of requirements not covered by test cases in current test plan
Provision for linking existing test cases to new requirements
Display of modified and removed requirements
Enhanced suspect link analysis
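The first of those features is essentially a coverage-gap check. As a rough illustration (not the actual RQM implementation, and with invented data), finding requirements in the current test plan that no test case validates boils down to a set difference:

```python
def uncovered_requirements(requirements, test_cases):
    """Return requirements that no test case traces back to."""
    covered = set()
    for tc in test_cases:
        covered.update(tc["validates"])
    return sorted(set(requirements) - covered)

# Usage with hypothetical IDs: REQ-2 has no validating test case.
reqs = ["REQ-1", "REQ-2", "REQ-3"]
tests = [{"id": "TC-1", "validates": ["REQ-1"]},
         {"id": "TC-2", "validates": ["REQ-1", "REQ-3"]}]
print(uncovered_requirements(reqs, tests))  # ['REQ-2']
```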
Another important improvement in this release is the enhanced traceability to meet regulatory requirements. This enables
Linking of one or more requirements to each test step of a manual test script
Managing the association of requirements to related test cases
Display of links during test execution and in test case results
Apart from this, an OSLC-based integration with Design Manager for RSA and Rhapsody (beta) is also available in DOORS 9.4. However, Design Manager is still in beta and this integration will only be available later this year; for more details, visit jazz.net. The data exchange mechanism has also been upgraded from RIF to the latest version of the OMG (Object Management Group) ReqIF (Requirements Interchange Format), which improves the communication of requirements between organizations in a supply chain. Support for data exchange and linked data between DOORS 9.x and DOORS Next Generation is included, as is reporting across the two products.
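Because ReqIF is an XML-based format, exchanged requirements can be processed with standard XML libraries. Below is a hedged sketch that extracts SPEC-OBJECT identifiers from a deliberately simplified ReqIF-style fragment; a real ReqIF file also carries a header, spec types and attribute definitions, so treat this only as an illustration of the general shape.

```python
import xml.etree.ElementTree as ET

REQIF_NS = "http://www.omg.org/spec/ReqIF/20110401/reqif.xsd"

# A simplified fragment in the spirit of ReqIF (not a complete, valid file).
sample = f"""
<REQ-IF xmlns="{REQIF_NS}">
  <CORE-CONTENT><REQ-IF-CONTENT><SPEC-OBJECTS>
    <SPEC-OBJECT IDENTIFIER="_req1" LAST-CHANGE="2012-06-01T00:00:00Z"/>
    <SPEC-OBJECT IDENTIFIER="_req2" LAST-CHANGE="2012-06-02T00:00:00Z"/>
  </SPEC-OBJECTS></REQ-IF-CONTENT></CORE-CONTENT>
</REQ-IF>"""

root = ET.fromstring(sample)
# Element names are namespace-qualified, so include the namespace when iterating.
ids = [obj.get("IDENTIFIER")
       for obj in root.iter(f"{{{REQIF_NS}}}SPEC-OBJECT")]
print(ids)  # ['_req1', '_req2']
```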
There are some changes in licensing when using DOORS with Rational Publishing Engine (RPE). We have removed the requirement for an RPE license when using RPE custom templates directly within DOORS. You still need a license to create new custom templates; however, one is not required to run the reports.
Let us briefly look at the usability improvements that have gone into DOORS 9.4. Many of them reduce the need to write custom DXL scripts. In DOORS 9.4, we have provided stronger support for defining and managing how more than one user can work on a module simultaneously. It is controlled with a widget that lets you set a sharing level for editing, as shown below.
Views now support color coding, and the user can control the background color of attributes. Views have also been extended to 128 columns.
Another small yet significant usability improvement is the ability to remove multiple views in a single selection. And finally, DOORS now supports rich text export to Microsoft Excel.
We had a series of announcements earlier this month during Innovate 2012. The major announcements in the Requirements Management space were RRC 4.0, DOORS 9.4 and the DOORS Next Generation Beta. This blog post aims to dive a little deeper into these and provide a few helpful links...so read on!
Introducing Rational Requirements Composer 4.0
Collaborative Lifecycle Management (CLM) 2012 was announced on June 12 on the jazz.net website, and Rational Requirements Composer (RRC) 4.0 continues to be one of its major pillars. CLM 2012 aims to provide the best integration experience available in the industry today between requirements, source control and quality management. For a detailed post on CLM 2012, see Robin Garside's post on jazz.net.
To improve on the existing requirements management solution, RRC 4.0 brings a whole new set of improvements and features. The new and improved capabilities in this release will help project teams understand the impact of project change, with downstream visibility using graphical and suspect traceability across requirements, test, and development items. Information access can now be controlled at a granular level - giving you the peace of mind of knowing exactly which requirements information needs to be modified. Project template upload and download, as well as the Requirements Interchange Format (ReqIF), are now available to give administrators and team leads easier control of multiple project environments. Additionally, project teams have greater cross-project visibility through project dashboards. Some other significant highlights of RRC 4.0 are:
Enhanced enterprise deployment and scalability, supporting high availability via clustering
Extended and refined data access control and automatic requirements identification
More solutions for analyzing traceability through graphical explorers and suspect link change identification
Extended CLM lifecycle integration for Rational Design Manager (Beta) to trace and report requirements and models/elements
Watch out for another blog post detailing these improvements in RRC 4.0 soon…
Introducing Rational DOORS 9.4
The systems space was equally active this month, with lots of announcements from IBM Rational. With the help of Open Services for Lifecycle Collaboration (OSLC), continued effort has been put into unifying lifecycle disciplines across systems and software engineering. DOORS 9.4 and DOORS Web Access 1.5 are the latest Requirements Management offerings for the systems space. DOORS 9.4 boasts significant server-side enhancements and new OSLC-based integrations with Rational Quality Manager, the Rational Software Architect Design Manager beta and the DOORS Next Generation beta.
Significant usability enhancements have been made to both the rich and web clients. Custom templates designed using Rational Document Studio are now supported by DOORS 9.4. As an added bonus, producing documentation from DOORS without an additional Rational Publishing Engine license is now possible. For a detailed account of the improvements in DOORS 9.4, see the announcement letter here.
Download IBM Rational DOORS 9.4 here. Learn more about Rational DOORS here.
Stay tuned for a detailed post on What’s new in DOORS 9.4…..
DOORS Next Generation Beta
IBM DOORS Next Generation aims to be the next-generation requirements management solution for complex software and systems engineering environments. Today we are releasing the latest version, Beta 3, of DOORS Next Generation; it is available for download from jazz.net. DOORS Next Generation is built on the lessons learned from DOORS and extends the technologies of Rational Requirements Composer and DOORS 9. If you are wondering how DOORS Next Generation differs from DOORS, read the article on jazz.net by Richard Watson, Senior Product Manager for RM tools at IBM, comparing the two products.
With this beta version, data import/export between DOORS Next Generation and DOORS 9.x projects is possible, and bi-directional linking between the two products is supported. Development of DOORS Next Generation is kept transparent, following the Jazz vision, and all development can be tracked in the Rational DOORS Next Generation section of jazz.net.
Our intended direction for developing the DOORS family is to allow a smooth transition between DOORS and DOORS Next Generation. For each DOORS license entitlement with active Subscription and Support, a customer will be able to use either DOORS V9 or next-generation capabilities. A commercial version of DOORS Next Generation is expected in the last quarter of this year.
There have been interesting debates on how, and indeed whether, requirements should be managed in agile projects. Some believe requirements management is meaningless; others believe it remains a critical success factor irrespective of the methodology one follows. Here we bring you an interview with Mary Gorman, an Agile Business Analysis Expert at EBG Consulting, in which she argues that business analysis is essential for agile success.
Mary Gorman, CBAP, CSM, is VP of quality and delivery at EBG Consulting, whose experts help deliver high-value products that delight customers. Mary works with global clients, speaks at industry conferences, and writes on requirements topics for the business analysis community. In addition to serving on the IIBA® Business Analysis Body of Knowledge® Committee for four years, Mary helped create the first certification exam for the Certified Business Analysis Professional™ (CBAP®).
We've heard comments like "you can throw away all that requirements and analysis stuff now that we're going Agile." Have you come across this, and what's your view?
It's a common misconception. In fact, requirements drive agile teams! At EBG Consulting we find when agile teams collaboratively analyze requirements, they can speed development and delivery of high value products. The ability to be focused, nimble and disciplined about your requirements is essential for successful agile delivery.
So how does business analysis change as you adopt Agile practices?
You plan and analyze regularly to support a steady flow of product delivery, a hallmark of successful agile teams. On agile projects, we plan to re-plan. A plan represents your allocation of requirements—really options for satisfying product needs—to delivery cycles. Rather than trying to acquire all the possible requirements upfront at the start and create one big plan, you plan continually, using feedback from prior deliveries to adjust your plan. This in turn means you are continually analyzing requirements to discover high value options for the next delivery. Planning and analysis are interdependent and synergistic. See this article for more details.
Agile analysis is done just-in-time—you want your requirements to be “fresh”. You adjust the precision and granularity of requirements taking a just-enough approach. You make use of good analysis tools and techniques. For example, you might sketch a context diagram to quickly visualize the interfaces needed for a release, a minimum marketable feature, a use case or a story. Or organically explore requirements using a data model or state diagram. For more ideas to tune analysis for agile, visit Agile Analysis Challenges
What about the role of the business analyst in Agile?
In our forthcoming book, my co-author, Ellen Gottesdiener (EBG’s founder and president), and I write about a product partnership that collaborates to discover and deliver valued products. The partners include diverse perspectives from the business, customer and technology communities. We have found this partnership is critical for agile product success. (The book’s title is Discover to Deliver: Agile Product Planning and Analysis. Read more about it here.
Often business analysts ask us where they ‘fit’ in this partnership. Our response is to ask, “What are your skills?” An agile team needs strong skills in analysis, modeling, elicitation, facilitation, risk analysis, prioritization, strategic thinking, verification and validation along with a sound understanding of the product’s domain. The person who possesses a combination of such skills will be a valuable and valued team member.
Who should attend the webcast on How Business Analysis is Essential to Agile Success and what will they gain from it?
The webinar’s content has broad applicability. It may benefit anyone involved with planning, analysis, valuation, validation and verification; teams and organizations transitioning to agile; product champions and product owners responsible for deciding which product options to deliver; and anyone on an agile team who recognizes that user stories, user story maps and personas are often not enough to communicate product needs. The webinar provides ideas for holistically exploring and evaluating product options within a framework the agile team can use to reach a shared understanding of high-value product needs.
Mary will be joining us on Wednesday, June 27, 2012 for a webinar in which she will expand on the discussion above to build a case that business analysis is your key to
Discovering product options
Collaborating to create Agile plans
Conversing daily about what to deliver
Adapting your product with each delivery cycle to respond to changes in business needs
A replay will also be posted in case you miss the webinar on Wednesday. We believe this interview gives an insight into why business analysis is critical to agile projects. For a more interactive discussion, join us in the webinar.
What do you think about requirements in agile? What level of requirements management/analysis do you follow in your agile projects?
I'm writing from Innovate 2012 in Orlando, Florida where thousands are attending sessions and sharing thoughts about software development and systems engineering. One topic that keeps coming up is that of traceability. On Sunday at VoiCE (Voice of the Customer Event), we had some great discussions with clients in the industrial sector building complex and embedded systems such as planes, cars and medical devices about traceability scenarios they have. There was a lively discussion around how much traceability is enough. One client, who is working in aerospace, needs to comply with DO-178B, and requires traceability all the way from a high level customer requirement through to individual lines of code. Others asked 'do you really need that fine grained traceability?' and 'won't that be very difficult to manage?' Another described that they have 26 teams and 16 applications to manage, and in the past had many (I think I heard 50!) locations where requirements were stored, usually in spreadsheets, making traceability very difficult. Now with the 'right schema' in place and using IBM Rational Requirements Composer, they have a solution that makes traceability much easier, and an environment that is manageable for the long term as it scales. Having the right schema - the information model of artifacts and what relationships they have was stressed as a vital ingredient in any recipe for successful traceability.
In a breakout session yesterday, data was shared that on a deep space exploration mission project, there are over 80,000 items in the requirements database (IBM Rational DOORS) and over 40,000 links - mind blowing complexity of data and relationships, and that's on one of many projects they have running today.
The right culture, process and tools for your application/system/product/service, organization and industry are necessary to keep traceability, not only across requirements but also into designs, work items, tests and so on, from spiraling into an uncontrollable, unusable spaghetti of artifacts and links.
So for you and your projects, how much traceability is enough, how are you managing it and what would you like to see in the future to make the creation, maintenance and most importantly utilization of traceability easier to do and more effective?
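To illustrate what fine-grained traceability "all the way to lines of code" means mechanically, here is a tiny sketch with hypothetical artifact names (not any IBM tool's data model): once the schema records which downstream artifacts satisfy each item, a simple walk over the link graph answers both impact questions ("what does this requirement touch?") and coverage questions ("which items have no downstream trace?").

```python
from collections import deque

# Hypothetical link table: artifact -> artifacts one level downstream of it.
links = {
    "CUST-1": ["SYS-10", "SYS-11"],
    "SYS-10": ["SW-100"],
    "SYS-11": ["SW-101"],
    "SW-100": ["src/control.c:42"],   # traced down to a line of code
    "SW-101": [],                     # gap: no downstream coverage
}

def downstream(artifact):
    """All artifacts transitively traced from `artifact` (breadth-first)."""
    seen, queue = set(), deque([artifact])
    while queue:
        for nxt in links.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(downstream("CUST-1")))
# ['SW-100', 'SW-101', 'SYS-10', 'SYS-11', 'src/control.c:42']
```

The scale the clients described (tens of thousands of items and links) is exactly why a deliberate schema matters: without agreed artifact types and link meanings, a traversal like this returns noise instead of answers.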
We have 16 sessions lined up in this track focusing on software delivery, revolving around eliciting, defining, elaborating, understanding, organizing, reviewing, communicating and tracking business, user, and software requirements. We have multiple customer sessions; I am sure you will find the sessions below interesting:
Case Study: Moving from Organized Chaos to Standard Process and Tooling – Disney’s Experience in deploying IBM Rational Tools
DHL Aligning Business and IT with IBM Rational Requirements Composer
Iterative Requirements Analysis: Implementing Lean and Agile Principles for Software Requirements Analysis
Like previous editions of Innovate, this year we have keynotes where IBM shares its road map, strategies and vision for requirements definition and management tools. There will also be plenty of opportunities to meet senior product management professionals and developers of IBM Rational's Requirements Management tools in sessions like 'Ask the Experts: IBM Rational Requirements Composer and IBM Rational RequisitePro', 'What's New with IBM® Rational® Requirements Composer?' and other presentations.
Some of the other notable presentations focus on trending topics and best practices in the requirements management domain, such as conceptual frameworks for visual definition in the requirements life cycle, the Requirements Engineering Maturity Model (REMM) and many more. There are multiple workshops on defining and managing requirements with IBM Rational tools, IBM Collaborative Lifecycle Management, best practices in using Requirements Composer, and Jazz primers. Don't forget to try out the Open Labs and solution pedestals at Innovate!
And finally we have a competition (Who Is the Best IBM Rational Requirements Composer User?) in which Rational Requirements Composer experts put their skills to the test in a variety of tool challenges to prove who is the best requirements tool champion. In this special event, participants have a chance to compete with IBM experts and tool developers to prove their expertise by solving common and difficult requirements management problems!
At Innovate 2012 in Orlando, June 3-7, there will be two requirements management (RM) tracks – one focused on RM for IT application development, and product-wise primarily on RequisitePro and Rational Requirements Composer; and another focused on RM for systems engineering (SE), and product-wise on DOORS. This blog post is focused on the RM for SE track, but look out for another on the IT-focused track.
I think you’ll find that the RM for SE track has some really strong content this year. Out of 16 sessions, 10 will feature customer speakers, including:
How IBM Rational DOORS Helps Jet Propulsion Laboratory Get to Mars and Beyond: Best Practices in Metrics, Verification, and Traceability
Using IBM Rational DOORS to Support Systems Engineering and Release Management across Multiple Programs at Trane, a heating, ventilation & air conditioning systems manufacturer
A CareFusion Case Study of Integrating IBM Rational DOORS and HP Quality Center for Use in an FDA Environment
Integrating IBM Rational DOORS with IBM Rational Team Concert: Lessons Learned at
You’ll also be able to meet product management and senior development staff and ask them questions in our ‘Ask The Experts (for DOORS version 7.x, 8.x and 9.x users)’, and you’ll hear about IBM’s strategy and roadmap in ‘What's Now and Next in Requirements Management for Systems Engineering’, including the latest release and plans for the DOORS
If you’ve been following our RM developments recently you’ll be aware of the DOORS Next Generation project on Jazz.net, and you’ll hear about that during the Now and Next session. If you want to dive deeper into what’s planned, be sure to go to the session ‘Deep Dive Investigation and Feedback about IBM Rational DOORS Next-Generation Beta’ and visit the DOORS Next Generation demo pedestal in the Innovation Lab area of the Solution Center.
And if you’ve ever attended before, you’ll know that a popular feature is the DOORS DXL Script Exchange competition, where you can demonstrate your prowess in DXL and win a small prize. For more details about this year’s competition (with a twist!) please
Scripts are due on or before May 25th.
On top of all this fantastic content (and this is just for one track – there are over 400 sessions across the whole conference!), Innovate is a great opportunity to network with other systems engineers and software developers, share war stories, tips and tricks, and maybe a drink or two.
Many articles, papers and books have talked about the impact of fixing a defect at later stages of the development life cycle. Research has also proved that the majority of software defects found in a project are related to requirements, and their number continues to be higher than those arising from design or coding issues. According to studies, approximately 60%-70% of IT project failures result from poor requirements gathering, analysis, and management*. The most successful products and applications have been built with a thorough understanding of what they are intended for. Understanding and managing requirements is by far the most important aspect of the development life cycle, irrespective of the technology, development process, industry or purpose of the software or system under development. The beauty of effective requirements management is that it gives every stakeholder in the team complete insight into how the development of a system is progressing, yet at a level of abstraction neutral to technology, platform or perspective. It is thus the cornerstone of software and systems development!
In today’s world, developing smarter products with the shortest possible time to market has become the norm for success. Often the pressure to get started and produce a prototype, or the product itself, is high, and requirements are given the least importance. But if we check the statistics, we can find many examples of hefty costs and disasters caused by hasty development or by insufficient effort to understand the real requirements. Read a compilation by Computer World UK here. These two videos, from IBM and IAG Consulting respectively, talk about why you should take requirements seriously.
If you watched the videos above, it is clear that the process by which requirements are managed, and the collaboration and communication between stakeholders, are equally important for effective requirements management. Many mistakenly believe that requirements management takes place during the definition stages of a project and is then complete. In reality, requirements exist in some form at virtually every stage of development.
The three main factors to consider when defining requirements management are:
Requirements definition. The requirements should not only be concise and unambiguous, they should also be testable.
Requirements classification. To manage project development effectively, each requirement statement must be classified appropriately for the application to aid in effective decision making and resource allocation.
Requirements traceability. Stakeholders of a project must be able to track requirements through the stages of development.
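As a small illustration of those three factors, here is a hedged sketch of a requirement record (names and values are invented for the example): the statement and acceptance criterion cover definition and testability, a classification field supports decision making and resource allocation, and trace links support traceability.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    statement: str          # definition: concise and unambiguous
    acceptance: str         # what makes the requirement testable
    classification: str     # e.g. "functional", "performance", "safety"
    traces_to: list = field(default_factory=list)  # downstream tests/work items

# Hypothetical example record
r = Requirement(
    rid="REQ-42",
    statement="The pump shall stop within 2 seconds of an occlusion alarm.",
    acceptance="Verified by timing test TC-17 on the reference hardware.",
    classification="safety",
    traces_to=["TC-17"],
)
print(r.classification, r.traces_to)  # safety ['TC-17']
```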
Stay tuned for more posts on requirements management, analysis and other topics...
Have suggestions/opinions/comments/topics? Send them to us!
* Source: The Meta Group market research firm, a division of Gartner, March 2003
We have been discussing requirements management practices and IBM Rational solutions for Requirements Definition and Management at the Rational RDM Blog. We have decided to move to our own new home here on developerWorks and will continue our discussions here going forward.
Make sure you change your bookmarks and follow this blog to keep abreast of requirements management principles, advances in the domain and solutions from IBM Rational. We hope to continue having a fruitful discussion here and create a mutual learning platform for all of us! We will continue to discuss the contemporary world of requirements management from both software and systems perspectives.
Continue reading the blog for more details and articles...If you have any suggestions for topics, provide your comments here.