- Requirements definition. Requirements should be not only concise and unambiguous but also testable.
- Requirements classification. To manage project development effectively, each requirement must be classified appropriately for the application, aiding effective decision making and resource allocation.
- Requirements traceability. Stakeholders of a project must be able to track requirements through every stage of development.
Requirements Management Blog
Posted by VijaySankar
Does the above graph ring a bell?
Many articles, papers and books have discussed the impact of fixing a defect in the later stages of the development life cycle. Research has also shown that the majority of software defects found in a project are related to requirements, and that their number remains higher than that of defects arising from design or coding issues. According to studies, approximately 60%-70% of IT project failures result from poor requirements gathering, analysis, and management*. The most successful products and applications have been built with a thorough understanding of what they are intended for. Understanding and managing requirements is by far the most important aspect of the development life cycle, irrespective of the technology, development process, industry or purpose of the software or system under development. The beauty of effective requirements management is that it gives every stakeholder in the team complete insight into how the development of a system is progressing, yet at a level of abstraction neutral to technology, platform or perspective. It is thus the cornerstone of software and systems development!
In today's world, developing smarter products in the shortest time to market has become the norm for success. The pressure to get started and produce a prototype, or the product itself, is often high, and requirements are given the least importance. But if we check the record, we can see many examples of hefty costs and outright disasters caused by hasty development or insufficient effort to understand the real requirements. Read a compilation by Computer World UK here. These two videos, from IBM and IAG Consulting respectively, talk about why you should take requirements seriously.
If you watched the videos above, it is clear that the process by which requirements are managed, and the collaboration and communication between stakeholders, are equally important for effective requirements management. Many people also mistakenly believe that requirements management takes place during the definition stages of a project and is then complete. In reality, requirements exist in some form at virtually every stage of development.
The three main factors to consider when defining requirements management are requirements definition, requirements classification and requirements traceability, as listed at the top of this post.
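As a rough illustration (not any particular tool's data model), the three factors map naturally onto a small requirement record: a testable statement, a classification, and trace links. The classification scheme and identifiers below are invented for the sketch:

```python
from dataclasses import dataclass, field

# Hypothetical classification scheme; real projects define their own taxonomy.
CLASSIFICATIONS = {"functional", "performance", "safety", "usability"}

@dataclass
class Requirement:
    req_id: str                 # unique identifier, e.g. "REQ-001"
    text: str                   # concise, unambiguous statement
    classification: str         # aids decision making and resource allocation
    acceptance_criteria: list = field(default_factory=list)  # makes it testable
    traces_to: list = field(default_factory=list)  # downstream artifact ids

    def is_testable(self) -> bool:
        # A requirement without acceptance criteria cannot be verified.
        return len(self.acceptance_criteria) > 0

    def is_classified(self) -> bool:
        return self.classification in CLASSIFICATIONS

req = Requirement(
    req_id="REQ-001",
    text="The system shall log every failed login attempt.",
    classification="functional",
    acceptance_criteria=["A failed login produces exactly one audit record."],
    traces_to=["DES-12", "TC-104"],
)
assert req.is_testable() and req.is_classified()
```

Even this toy record makes the three concerns checkable: a statement without acceptance criteria fails the testability check, an unknown classification fails the classification check, and the trace links give stakeholders a thread to follow into design and test.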
Stay tuned for more posts on requirements management, analysis and other topics...
Have suggestions, opinions, comments or topics? Send them to us!
* Source: The Meta Group market research firm, a division of Gartner, March 2003
Posted by AndyGurd
I'm writing from Innovate 2012 in Orlando, Florida, where thousands are attending sessions and sharing thoughts about software development and systems engineering. One topic that keeps coming up is traceability. On Sunday at VoiCE (Voice of the Customer Event), we had some great discussions about traceability scenarios with clients in the industrial sector building complex and embedded systems such as planes, cars and medical devices. There was a lively discussion around how much traceability is enough. One client, working in aerospace, needs to comply with DO-178B and requires traceability all the way from a high-level customer requirement through to individual lines of code. Others asked 'do you really need that fine-grained traceability?' and 'won't that be very difficult to manage?' Another client described having 26 teams and 16 applications to manage, and in the past many (I think I heard 50!) locations where requirements were stored, usually in spreadsheets, making traceability very difficult. Now, with the 'right schema' in place and using IBM Rational Requirements Composer, they have a solution that makes traceability much easier and an environment that remains manageable for the long term as it scales. Having the right schema (the information model of the artifacts and the relationships between them) was stressed as a vital ingredient in any recipe for successful traceability.
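A 'right schema' of this kind can be sketched as a small table of allowed link types that gets checked before any trace link is created. The artifact and link-type names below are illustrative, not the actual Requirements Composer schema:

```python
# Which artifact types may link to which, and with what link type.
SCHEMA = {
    ("customer_requirement", "system_requirement"): "decomposes_to",
    ("system_requirement", "design_element"): "satisfied_by",
    ("system_requirement", "test_case"): "validated_by",
    ("design_element", "source_file"): "implemented_by",  # DO-178B-style depth
}

def link(artifacts, links, src, dst):
    """Create a trace link only if the schema allows it."""
    key = (artifacts[src], artifacts[dst])
    if key not in SCHEMA:
        raise ValueError(f"schema forbids {artifacts[src]} -> {artifacts[dst]}")
    links.append((src, SCHEMA[key], dst))

artifacts = {"CR-1": "customer_requirement",
             "SR-9": "system_requirement",
             "TC-4": "test_case"}
links = []
link(artifacts, links, "CR-1", "SR-9")   # allowed: decomposes_to
link(artifacts, links, "SR-9", "TC-4")   # allowed: validated_by
# link(artifacts, links, "CR-1", "TC-4") # would raise: not in the schema
```

The point of the schema is exactly what the client described: it stops the link space from degenerating into arbitrary connections between 50 spreadsheets, so the model stays manageable as it scales.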
In a breakout session yesterday, data was shared that on one deep space exploration mission project there are over 80,000 items in the requirements database (IBM Rational DOORS) and over 40,000 links - mind-blowing complexity of data and relationships, and that's just one of many projects they have running today.
The right culture, process and tools for your application/system/product/service, organization and industry are necessary to keep traceability - across not only requirements, but also designs, work items, tests and so on - from spiraling into an uncontrollable, unusable spaghetti of artifacts and links.
So for you and your projects, how much traceability is enough, how are you managing it and what would you like to see in the future to make the creation, maintenance and most importantly utilization of traceability easier to do and more effective?
Posted by AndyGurd
I was lucky enough last week to travel to the INCOSE (International Council on Systems Engineering) International Symposium 2012 near Rome, Italy - an excellent opportunity to meet the systems engineering community and hear about their interests and concerns. We had lots of traffic to the very stylish IBM booth, where we talked about the IBM Rational solutions for systems engineering and the latest from IBM Research on tool interoperability and design optimization & trade-off. I'd like to claim the traffic was due to my presence, but in fact there was lots of excitement and interest in the must-have giveaway of the conference, the IBM Limited Edition of the Systems Engineering for Dummies book
(if you weren’t there and don’t have a copy, you can download a PDF version).
Being at the INCOSE event reminded me of the very active and interesting discussion I recently provoked on the INCOSE LinkedIn group by posting the link to my previous blog post 'Traceability – How Much is Enough?'. It's a great read, with some very provocative statements: some argue that traceability is of no use at all and is the root cause of failure on projects that overrun and overspend, while others say it's absolutely vital for safety-critical systems or contract-driven projects. In the end I think some consensus was reached between these two camps: 'just enough' traceability to keep a project on track, provide customer/market need context to engineers, facilitate impact analysis, and (if needed) meet industry standards and regulations, is sufficient. Any more is excessive and wasteful, and likely to bog down progress towards delivering innovative products and systems.
During a quiet time at the IBM booth, I also had chance to chat with my colleague Brian Nolan (marketing manager for aerospace & defense industry at IBM Rational) about effective traceability, since Brian is very interested in this topic and has presented on a Dr Dobbs webcast on ‘3 Ways to Improve Traceability and Impact Analysis’. Brian believes in what I would describe as ‘traceability by design’, meaning that traceability is automatically established while you decompose your system design (for example, use case to use case realization to sequence diagram and so on). This discussion also reminded me of what another colleague Greg Gorman (program director for IBM systems and software engineering solutions and the INCOSE Corporate Advisory Board member from IBM) described several years ago as ‘link while you think’, meaning traceability is created by the tools, while you are performing requirements decomposition, design and development, rather than as an overhead activity afterwards.
I think we've now moved some way beyond 'link while you think'. An information model with 'just enough' traceability for your project needs is still essential to stop traceability spiraling out of control. But new approaches such as Linked Lifecycle Data from the OSLC (Open Services for Lifecycle Collaboration) community, together with tools that recognize implicit traceability, provide new ways to visualize lifecycle traceability and perform effective impact analysis. With these, we can make traceability work for us, helping engineering become more agile while staying within cost and schedule and producing innovative, higher quality products and systems.
Posted by VijaySankar
The importance of communication and collaboration in developing and managing good requirements was discussed in our earlier post on How to enable effective requirements communication and collaboration. In this guest blog post, Melissa Robinson, a Senior Technical Specialist at IBM, writes about how Rational DOORS addresses this aspect with Discussions.
Melissa started her career at Telelogic, enabling Product Management with technical support around requirements management. Melissa spent 3 years at Telelogic supporting clients getting started with requirements management. After IBM acquired Telelogic in 2008, Melissa transitioned roles to support clients with Enterprise Architecture initiatives. She received the Carnegie Mellon certification in Enterprise Architecture in 2008 and is TOGAF certified. Melissa now supports clients getting started with evaluating and implementing both requirements management and enterprise architecture solutions.
Why did we make this decision? Who made this decision? Who approved this requirement?
These are some of the questions that effective collaboration messaging in DOORS can help answer. Collaboration messaging is now enabled in DOORS and DOORS Web Access (DWA) with the addition of DOORS Discussions. Discussions let users contribute comments on requirement objects or requirement modules - even on baselined or read-only requirements - and so offer a way of having a conversation about requirements. This really breaks the communication barrier by making it easy for anyone to comment on, or start a discussion about, any requirement. Discussions can be created in DOORS or DWA and viewed in both, and both the DWA Editor and DWA Reviewer roles can contribute. Because comments are captured, you can later review this ancillary information about your requirements, giving everyone a fuller understanding of them.
Here is a simple scenario for using DOORS Discussions. A DWA Reviewer user creates a Discussion on a requirement. A DOORS user then reviews this comment and contributes a comment on the requirement. The DWA user reviews the latest comment and closes the Discussion.
A DOORS user, Susan, reviews the current Discussion created by a DWA Reviewer user, Kavita. Susan can open the requirements module with a pre-created Discussion view to review the Discussions. Below, Susan reviews the Discussion on Requirement AMR-STK-66.
Susan can contribute a comment to the open Discussion.
Kavita reviews the new Discussion comment in DWA. Notice that Kavita is a Reviewer in DWA. As a Reviewer, she can create and add comments to Discussions. Kavita can also close Discussions that she started. Later, Kavita can contribute another comment to the open Discussion.
As the person who first opened the Discussion, Kavita can close it. Later, in DOORS, Susan can review the latest status of the Discussion using the Discussion Thread view. With the Database Manager role, Susan can choose to re-open the closed Discussion at any time.
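The role rules in this walkthrough can be sketched as a tiny model. This is an illustration of the workflow only, not the actual DOORS API; the class and method names are invented:

```python
# Illustrative model of the Discussion workflow described above.
class Discussion:
    def __init__(self, requirement_id, creator, comment):
        self.requirement_id = requirement_id
        self.creator = creator
        self.comments = [(creator, comment)]
        self.open = True

    def add_comment(self, user, text):
        # Any Editor/Reviewer may comment, but only while the thread is open.
        if not self.open:
            raise RuntimeError("discussion is closed")
        self.comments.append((user, text))

    def close(self, user, is_db_manager=False):
        # Only the creator, or a Database Manager, may close the thread.
        if user != self.creator and not is_db_manager:
            raise PermissionError(f"{user} cannot close this discussion")
        self.open = False

    def reopen(self, user, is_db_manager=False):
        if user != self.creator and not is_db_manager:
            raise PermissionError(f"{user} cannot reopen this discussion")
        self.open = True

# The scenario: DWA Reviewer Kavita opens, DOORS user Susan replies, Kavita closes.
d = Discussion("AMR-STK-66", "Kavita", "Why was this threshold chosen?")
d.add_comment("Susan", "It matches the safety margin agreed in review.")
d.close("Kavita")
assert not d.open
```

The useful property, as in the real feature, is that the requirement itself is never modified: the conversation is captured alongside it, so "who decided this, and why?" can be answered later.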
Discussions open up the communication thread between several different types of DOORS users. Discussions allow requirements reviewers to exchange views and comments about the content of a requirements module or the content of a requirement object in a module.
We hope this post gave you a sneak preview of how DOORS Discussions help various stakeholders collaborate and communicate effectively during requirements management. Feel free to contact melissarobinson[at]us.ibm.com if you have any queries about the topic. Melissa will discuss the topic in detail in an upcoming webcast on October 5, 2012. Don't miss the opportunity to watch the action live.
Register now @ http://bit.ly/DOORS_Discussions
Date: Oct 5, 2012
Time: 12 Noon EDT
Posted by AndyGurd
You've bought the plot of land for your dream home. You have your list of requirements - 4 bedrooms, 3 bathrooms, spacious kitchen, 2 living rooms, 2 garages, landscaped gardens, etc. Would you be happy to simply hand that list to the builders and let them start work? Unlikely, I think. Typically, you call in an architect, who can take your quantitative requirements and qualitative desires and produce a blueprint, the architectural design that incorporates your wishes where feasible and adds creative flourish based on the architect's knowledge of house design. The blueprint enables you and the builders to have a much clearer picture of the desired end result than that original list of requirements. And it affords you the opportunity to influence the architecture, and for the builders to question and look at feasibility & cost options, before the foundations are dug and the first bricks are laid.
The same applies in product development. Systems engineers who are responsible for the holistic product specification and design don't just use textual requirements lists to capture the problem domain and describe the proposed solution. They analyze the requirements, identifying integrated scenarios, and often depict those using modeling languages such as UML or SysML. These modeled scenarios are easier to discuss and review with all stakeholders, and as the systems engineer evolves the proposed architecture (also in the same modeling language) they can run the scenarios against the architecture in model simulations to find inconsistencies or gaps in the requirements and flaws in the design, long before any software is coded, circuit boards are soldered or metal is welded.
So what value do our textual requirements lists have - should we throw them away in favor of models? Well, not everything can be expressed in a model, and not everyone involved in a development effort may be using models. Going back to the house-building analogy, there are contracts and numerous standards and regulations to be adhered to, as well as details that would simply make the blueprints unreadable. The various contractors involved in the building process (and I know from recent experience that sub-contracting is the name of the game in house building these days!) need to ensure that they can meet the contractual and regulatory demands while delivering against the architecture. Again, this is the same in product development, except that in many cases, particularly for safety-critical systems, traceability and demonstration of conformance to requirements and compliance with standards & regulations are demanded. This requires the ability to integrate requirements and modeling workflows, easily link requirements and design elements, and report on that linked information.
The need for this capability, and solutions for it, are nothing new. Integrations between requirements management and modeling tools have existed for many years (I think I started using such an integration in the early 90's, and I'm sure they preceded that time). But I know from first-hand experience of using, and indeed writing, such integrations that they've not always been optimal in the way integration is performed and in the workflow that is supported. Typically it has meant synchronizing (i.e. copying) data between tools in order to create the traceability links in one of them. This brings up all sorts of issues like 'which tool is the master?', 'am I looking at the latest data?', 'what happens when information is deleted?', etc.
With Open Services for Lifecycle Collaboration (OSLC) we now have a much better way to link data across product development and operations tools, even when the tools may be from different vendors, open source or in-house. OSLC has learnt from the principles of the World Wide Web and enables tool data to be shared and linked where it resides (called a 'Linked Data' approach). OSLC provides a common vocabulary for 'resources' in particular domains - i.e. what a requirement, test case, design element, change request, work item, etc. looks like - so that regardless of tool, technology or vendor, tools implementing OSLC specifications can share and link data.
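A minimal sketch of this Linked Data idea, with made-up URIs and property names rather than the real OSLC vocabularies: each tool keeps its own artifacts, and a cross-tool link is just a URI reference resolved on demand, so nothing is ever copied between repositories.

```python
# Each tool owns its artifacts at stable URIs (all URIs here are illustrative).
requirements_tool = {
    "https://rm.example.com/req/42": {
        "type": "Requirement",
        "title": "Log failed logins",
    },
}
test_tool = {
    "https://qm.example.com/tc/7": {
        "type": "TestCase",
        "title": "Audit record on failed login",
        # The link is a URI reference, not a copy of the requirement.
        "validatesRequirement": "https://rm.example.com/req/42",
    },
}

def resolve(uri):
    # In practice this would be an HTTP GET against the tool that owns the URI.
    for store in (requirements_tool, test_tool):
        if uri in store:
            return store[uri]
    raise LookupError(uri)

tc = test_tool["https://qm.example.com/tc/7"]
req = resolve(tc["validatesRequirement"])
assert req["title"] == "Log failed logins"  # always the owner's current data
```

Because the link points at the owning tool's live resource, the 'which tool is the master?' and 'am I looking at the latest data?' questions from the synchronization approach simply do not arise.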
With Rational DOORS 9.4 and Rational Rhapsody 8.0 with Design Manager 4.0, IBM is using OSLC to provide a simplified workflow for linking requirements analysis and design. On September 20, Paul Urban (if you've been wondering about this blog post's title, now you know the Paul I'm speaking of), Market Manager for IBM Rational Rhapsody, presented this simplified workflow and its benefits on an IEEE Spectrum webcast sponsored by IBM. You can watch and listen to the replay at your leisure here. I hope you enjoy it - please let Paul and me know what you think by leaving feedback on this blog post.
Opening DOORS in London: What's new and coming up for integrated requirements management for complex systems from IBM
Posted by AndyGurd
Last week I was really fortunate to spend a couple of days in London presenting to and talking to clients, business partners and industry analysts. It's always so good to hear what's really going on out there and to get many different perspectives on what's important today and for the future. The first day was at IBM's Innovate UK 2012 event where I was fortunate to be asked to present on all the really exciting new stuff we've done in the past year to help organizations building today's and the next generation of smarter products and systems, with particular focus on providing solutions for systems engineers and embedded software developers. You can catch the absolute latest news on our recent launch webpages. That session included a whistle-stop tour of the developments in requirements management for complex systems with Rational DOORS 9.4 and our plans for DOORS Next Generation. Whistle-stop because we also had so much news to get through in architecture & design, planning, change & configuration management and quality management, as well as industry specific solutions for A&D, automotive, medical devices and electronic design. And because on the following day at IBM's Southbank facility we had a whole day dedicated to topics related to DOORS.
At the DOORS customer day we had attendees from across several industry sectors including transportation, aerospace & defense, banking & mail services. The day kicked off with Morgan Brown presenting the latest on IBM's requirements management and DOORS strategy. Morgan told us how the DOORS 9.x series is and will continue to be developed and enhanced to meet the needs of the large install base, in parallel with the introduction of DOORS Next Generation (DOORS NG). DOORS NG is planned to take the best paradigms for managing structured requirements from DOORS 9.x and marry those with the requirements management and team collaboration capabilities that have been developed on the Jazz collaborative lifecycle management platform over the last 4 or so years (and are in use in the form of Rational Requirements Composer). The development of DOORS NG is out in the open on jazz.net where milestone builds can be downloaded, discussions held, defects/enhancements raised and plans viewed. DOORS NG has gone through four beta releases and is expected to be released in late November. Morgan explained that in its first release, DOORS NG is not intended to replace the DOORS 9.x product line, but it is expected that existing DOORS customers will try out DOORS NG on pilot and new projects, and will use the interoperability capabilities of the ReqIF data exchange and cross-tool traceability linking to exchange and/or link data between DOORS 9.x and DOORS NG. DOORS NG will also appeal to those looking for a requirements management tool that is on an integrated platform with design, test management and task/change management capabilities. Morgan reminded the audience of an IBM statement released earlier this year that existing DOORS customers with active support & subscription would be entitled to use both DOORS 9 and next generation capabilities when they become available. 
This was well received by the attendees since it means that they can try out DOORS NG when it ships without the need for an additional purchase.
Of course, a day of technology insights never passes without some piece of tech throwing an unexpected spanner in the works. This time it was the projector and the next presenter's Apple Mac that refused to talk to each other, so instead of flowing into a demo of DOORS NG, next up was Neal Middlemore to tell us about the improved integration of requirements and quality management with DOORS 9.4 and Rational Quality Manager (RQM) 4.0. This release was a significant enhancement that brings the integration in line with IBM's strategy to support OSLC - Open Services for Lifecycle Collaboration. OSLC is a new approach to tool integration that is open and vendor-neutral. What's really different about OSLC is that data no longer needs to be copied or synchronized between tools in order to create cross-tool or cross-discipline visibility or relationships. So now quality professionals working in RQM can see requirements in DOORS and create links between test cases (and now, because some organizations require it, test steps) and the requirements they are validating; and requirements professionals in DOORS can see linked test case information, including test results - all without either group needing to leave the comfort of their familiar tool, and without data being copied between the two tools. Neal demonstrated the value of the integration to requirements & quality professionals and showed how RQM can be used to manage manual testing or hook up with a number of IBM and partner solutions for various forms of test automation. You can also see a demo of the DOORS - RQM integration on YouTube.
So, technical issue solved, it was back to Jon Walton to give a demo of DOORS Next Generation using the Beta 4 release. Jon spent most of his time in the web client, highlighting the support for key DOORS paradigms such as hierarchical structured requirements documents, and showing off the plethora of new capabilities provided by the Jazz platform, such as database-wide requirement reuse, a graphical traceability view, requirements definition techniques (use case diagrams, storyboards), cross-discipline dashboards (containing requirements project info mashed up with info from design, quality and task management) and task management. Jon also showed the desktop client of DOORS NG, which looks very familiar to existing DOORS users, with some twists (reuse of requirements across documents, for one) - the desktop client will primarily be for users who need to do extensive editing of large requirements documents. If you're currently using DOORS 9.x, this YouTube video gives a quick preview of DOORS NG and how it's both similar to and different from DOORS 9.x. Watch this space for more on DOORS NG later this month.
Back to the earlier lifecycle integration theme started by Neal: next to present was Steve Rooks, on how to use DOORS with IBM's solution for model-based systems engineering and model-driven embedded software development, Rational Rhapsody, to link requirements and design activities. Rational Rhapsody enables elaboration of requirements and construction of systems and software architectures using SysML and UML. Rhapsody Design Manager provides an additional level of design collaboration capabilities. Models can be published to and/or stored and managed in a central repository, making them more easily accessible to a wider set of stakeholders so that designs can be better communicated and understood by everyone involved in specifying, designing, building and validating a product or system. Rhapsody Design Manager uses OSLC to facilitate linking of design elements to other lifecycle artifacts - requirements, test cases, work items, etc. As with the DOORS-RQM scenario described above, a systems engineer or software architect working in Rhapsody can see requirements in DOORS and easily create links between requirements and design model elements. Requirements and requirement links can even be included in model diagrams. And of course a DOORS user can see links to design elements without leaving DOORS, or can navigate into Rhapsody Design Manager to participate in design reviews. You can read more about linking requirements and design and the DOORS-Rhapsody Design Manager integration in my recent post 'The House That Paul Built', which talks about a recent webcast on the topic.
After lunch, an IBM business partner Kovair was invited to present on how their Kovair Omnibus solution provides bridges, synchronization and workflow support across multiple tools from multiple vendors. It's a common situation to find yourself trying to enact processes and workflows when you have a diverse set of tools. Kovair talked about their support for OSLC to be able to widen the number of tools they can help link together, but also highlighted scenarios where you would still want to copy or transform data between tools - it's not a choice of Link or Sync, it's Link and Sync as appropriate.
The next session was presented by Paul Fechtelkotter, market manager for energy & utilities at IBM Rational. Paul gave a really interesting presentation on the challenges of complex systems development for nuclear power plants and how the nuclear industry is now adopting systems engineering best practices starting with requirements management to enable them to get better change management, traceability, impact analysis and compliance support. You can learn more about how IBM Rational is helping the nuclear industry on our dedicated web page.
Unfortunately I had to leave after Paul's session and didn't catch the remainder of the afternoon, but as you can see it was a day packed full of information. I hope you find my summary and links for more information useful. If you have any questions or comments on any of the topics I've covered here or indeed anything on IBM's requirements management strategy, Rational DOORS and the lifecycle integrations, please don't be afraid to ask by using the blog comments facility.
Posted by VijaySankar
Neal Middlemore has over 14 years of experience in requirements management, encompassing the associated disciplines of requirements change management and validation/verification. Neal comes from an avionics systems engineering background and has been working with the DOORS product for over 10 years.
One of the most fundamental benefits that businesses want from requirements management tools is consistent traceability. Whether it is an IT system being developed or an aircraft carrier, the levels of complexity involved mean that traceability across multiple levels of requirements, from stakeholder requests to detailed implementation, is not simple to maintain manually.
Further hurdles are put in our way by the need to comply with legislation: many different industries these days have requirements placed upon them by government and international standards bodies, along with internal corporate standards.
To prove compliance with these legislative rules, projects must not only show that requirements are being managed but also provide the how and where of their management. Which features relate to which stakeholder request? What aspects of the solution satisfy safety regulations? Has the realization of the requirement been tested, by whom, and on what platform?
To answer these questions it is necessary to use the traceability capabilities provided by modern tools. But it's not enough to let a tool decide how things relate to each other, nor to let individual users decide these relationships. Traceability needs to be considered in the larger context of how you report on it to answer the very questions that led you to consider traceability in the first place. It's more than 'does object A relate to object B?'.
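Once links are captured according to an agreed structure, the compliance questions above reduce to traversals of the trace graph. A small illustrative sketch, with invented artifact ids and relation names:

```python
# Trace links: (source, relation, target). All data here is illustrative.
links = [
    ("STK-1", "refined_by", "SYS-10"),    # stakeholder request -> system req
    ("SYS-10", "satisfied_by", "FEAT-3"), # system req -> feature
    ("SYS-10", "validated_by", "TC-7"),   # system req -> test case
]

def downstream(links, start, relation=None):
    """All artifacts reachable from `start`, optionally via one relation only."""
    found, frontier = set(), [start]
    while frontier:
        node = frontier.pop()
        for src, rel, dst in links:
            if src == node and (relation is None or rel == relation):
                if dst not in found:
                    found.add(dst)
                    frontier.append(dst)
    return found

# Which features relate to stakeholder request STK-1?
assert "FEAT-3" in downstream(links, "STK-1")
# Has SYS-10 been tested, and by which test cases?
assert downstream(links, "SYS-10", relation="validated_by") == {"TC-7"}
```

The reporting point is exactly this: the questions are graph queries over the information architecture, which is why the architecture (which relations exist, between which artifact types) has to be agreed before the links are created.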
An initiative to define the information architecture of your governance solution will help define your process artifacts and their relationships to one another; traceability then becomes an asset rather than an overhead. A good way to begin such an initiative is a workshop attended by all project stakeholders, i.e. those with a vested interest in ensuring project success. These attendees will most likely have a good understanding of the process and what the project needs to deliver. Undoubtedly each will also bring a different perspective and understanding of what information is required.
The test manager may want to see traceability from individual test artifacts through to the requirements being tested, or to determine whether regulatory compliance has been achieved and verification methods have been agreed.
A subject matter expert (SME) may want to see the design in the context of the system level requirements and how it relates to stakeholder requirements.
Everybody wants to see something applicable to their job role, often even things outside their typical discipline domain. By asking these questions and documenting the answers, you can start to put together an information architecture that makes sense for your project. Many information architectures are likely to exist: not every project is the same, and each will have a different set of required attributes, views and reports of project information, and an agreed structure of inter-artifact relationships.
Modern tools can often create templates for this kind of information, allowing additional artifact containers to be deployed as and when required. Templates can even enforce traceability to ensure integrity across the project. Above all, it is vital that the needs of the project are documented and communicated, alongside the information architecture, to all project members.