Requirements Management Blog
AndyGurd 270001QKDH Tags:  requirements-management innovate traceability rational-doors rational-requirements-com... rational ibm 4 Comments 9,259 Views
I'm writing from Innovate 2012 in Orlando, Florida, where thousands are attending sessions and sharing thoughts about software development and systems engineering. One topic that keeps coming up is traceability. On Sunday at VoiCE (Voice of the Customer Event), we had some great discussions about traceability scenarios with clients in the industrial sector who build complex and embedded systems such as planes, cars and medical devices. There was a lively discussion around how much traceability is enough. One client working in aerospace needs to comply with DO-178B, which requires traceability all the way from a high-level customer requirement through to individual lines of code. Others asked, 'Do you really need that fine-grained traceability?' and 'Won't that be very difficult to manage?' Another described having 26 teams and 16 applications to manage; in the past they had many (I think I heard 50!) locations where requirements were stored, usually in spreadsheets, making traceability very difficult. Now, with the 'right schema' in place and using IBM Rational Requirements Composer, they have a solution that makes traceability much easier and an environment that remains manageable over the long term as it scales. Having the right schema - the information model of artifacts and the relationships between them - was stressed as a vital ingredient in any recipe for successful traceability.
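The 'right schema' idea can be made concrete: an information model is essentially a statement of which artifact types exist and which link types are allowed between them, and it can be enforced mechanically. Here is a minimal sketch of that check; the artifact and link-type names are illustrative, not any particular client's schema:

```python
# A minimal sketch of a traceability information model: which artifact
# types exist and which link types are permitted between them.
# Artifact and link-type names are illustrative only.

ALLOWED_LINKS = {
    # (source type, link type) -> set of permitted target types
    ("SystemRequirement", "satisfies"): {"CustomerRequirement"},
    ("DesignElement", "implements"): {"SystemRequirement"},
    ("TestCase", "verifies"): {"SystemRequirement"},
}

def check_link(source_type, link_type, target_type):
    """Return True if the proposed link conforms to the information model."""
    allowed = ALLOWED_LINKS.get((source_type, link_type), set())
    return target_type in allowed

# A test case may verify a system requirement, but not a design element:
assert check_link("TestCase", "verifies", "SystemRequirement")
assert not check_link("TestCase", "verifies", "DesignElement")
```

Agreeing on a table like this up front is what keeps a growing database from degenerating into arbitrary links between arbitrary artifacts.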
In a breakout session yesterday, data was shared that on a deep space exploration mission project, there are over 80,000 items in the requirements database (IBM Rational DOORS) and over 40,000 links - mind blowing complexity of data and relationships, and that's on one of many projects they have running today.
The right culture, process and tools for your application/system/product/service, organization and industry are necessary to prevent traceability - across not only requirements, but also designs, work items, tests and so on - from spiraling into an uncontrollable, unusable spaghetti of artifacts and links.
So for you and your projects, how much traceability is enough, how are you managing it and what would you like to see in the future to make the creation, maintenance and most importantly utilization of traceability easier to do and more effective?
Saurabh.Tyagi 270005CY2K Tags:  rm doors next solutions management requirements rational generation ibm 5,356 Views
The IBM Rational DOORS family of solutions offers best practices in requirements management and traceability, saving organizations time and money through improved collaboration with stakeholders that helps eliminate inaccurate, incomplete, and omitted requirements.
DOORS Next Generation is a requirements management application for optimizing requirements communication, collaboration and verification throughout your organization and supply chain. This scalable solution can help you meet business goals by managing project scope and cost. Rational DOORS lets you capture, trace, analyze and manage changes to information while maintaining compliance with regulations and standards.
Top reasons your organization needs DOORS Next Generation
Key Features:
Centralized location: Managing requirements in a centralized location improves team collaboration and provides access to full editing, configuration, analysis and reporting capabilities through a desktop client. It also supports the Requirements Interchange Format, enabling suppliers and development partners to contribute requirements documents, sections or attributes that can be traced back to central requirements. It records and displays requirements text, graphics, tables, requirements attributes, change bars, traceability links and more.
Link requirements: Provides traceability by linking requirements to design items, test plans, test cases and other requirements. Users can concurrently edit separate product and system requirement documents and link entries between documents. Requirement entries can also be linked to models, text specifications, code files, test procedures and documents created with other applications.
Scalability: Addresses changing requirements management needs. Offers an explorer-like hierarchy with multiple levels of folders and projects for simple navigation, no matter how large the database grows.
Change management: Provides integrations to help manage changes to requirements with either a simple, pre-defined change proposal system or a more thorough, customizable change control workflow. It integrates with Rational change management software for requirements change control and workflow management. It also integrates with other Rational solutions, including IBM Rational Quality Manager, IBM Rational Rhapsody and IBM Rational Focal Point, and with tools such as HP Quality Center, giving visibility of requirements for creating test cases for traceability and for reporting on requirements coverage by test cases. Integration with Microsoft Team Foundation Server (TFS) enables Microsoft Visual Studio development teams to create and maintain traceability between requirements in Rational DOORS and TFS work items in Visual Studio.
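The payoff of the linking capability above is impact analysis: when a requirement changes, trace links let you find every downstream design item, test case or other requirement that may be affected. A minimal sketch of that walk, using illustrative artifact IDs rather than any real project data:

```python
# Sketch of impact analysis over trace links: given a changed artifact,
# find every downstream artifact reachable via outgoing links.
# All artifact IDs and the link data are illustrative.

from collections import deque

links = {
    "UR-1": ["SR-1", "SR-2"],    # user requirement decomposed into system reqs
    "SR-1": ["DES-4", "TC-10"],  # system req linked to a design item and a test
    "SR-2": ["TC-11"],
}

def impacted(artifact, links):
    """Breadth-first walk of outgoing links from a changed artifact."""
    seen, queue = set(), deque([artifact])
    while queue:
        current = queue.popleft()
        for target in links.get(current, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

# Changing UR-1 potentially affects everything beneath it:
assert impacted("UR-1", links) == {"SR-1", "SR-2", "DES-4", "TC-10", "TC-11"}
```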
If your organization wants to replace outdated and expensive legacy tools, and needs better control over multiple versions of documents, IBM appreciates the opportunity to discuss requirements with you. If you'd like to see a live demo, please click dngdemo.
Would you like to start benefiting from IBM Rational DOORS Next Generation? Start your free trial today: Free Trial
VijaySankar 270000E5JQ Tags:  requirements-management rational ibm doors ibmrational 9,000 Views
We had a fantastic DOORS customer webcast panel on September 26, with experts from across industries talking about their experience using IBM Rational DOORS for requirements management. If we had to pick one success metric for the webcast, it was that we overshot the designated hour by half an hour because some wonderful discussions were going on. Our panelists have graciously agreed to respond to the questions offline, and we are publishing the answers here.
If you missed this golden opportunity and are wondering whether you can get another chance - YES! Or if you would like to listen to it again, we have posted the replay, and you can listen to it anytime. Register here
We’re software only, so no HW/SW interfaces. However, I strongly support the use of interface specifications between software systems and have used an interface module to sit in between systems that need to interact. The trick is to stay on the requirements side of the requirements/design line. The cost of maintenance on such a model is high – probably too heavy for our agile process to maintain. However, should we need to support integration with an external application (one that our teams are not developing), our API requirements will undergo close scrutiny and possibly be carved out into a separate module so that we can use trace links to do impact analysis on external “clients.”
VijaySankar 270000E5JQ Tags:  rational ibm doors doors-next-generation innovate2013 ibmrational 6,380 Views
Here is coverage of Requirements Management for Systems Engineering track keynote based on the presentation that Bill Shaw (Systems Program Director) and Richard Watson (Senior Product Manager for Requirements Management tools) delivered at Innovate 2013 today.
For those new to the space, IBM Rational DOORS is a widely recognized product in the requirements management area, and here is how we see the role of each of our products -
Rational DOORS is the trusted, de facto standard requirements management tool for applying systematic engineering methodologies to build complex and embedded systems.
Rational DOORS Web Access, an add-on for DOORS, gives globally distributed stakeholders visibility into requirements and traceability relationships (managed in Rational DOORS), with the ability to communicate via online requirements discussions. Using just a Web browser, DOORS Web Access provides access to view and discuss requirements - with no additional software installed on your desktop.
And finally the latest addition to the family, Rational DOORS Next Generation is the next generation requirements management solution built on the IBM Rational Jazz platform.
With such a plethora of offerings, we believe we have the right requirements solution for you. As we mentioned earlier, the introduction of DOORS Next Generation DOES NOT mean we are moving away from DOORS. We are continuing our investment in DOORS and will continue to release better and improved versions in the future. We believe DOORS Next Generation takes the requirements management capabilities we offer to the next level, especially with the foundation of an open, collaborative platform. Designed from the ground up to accommodate an ever-growing and complex ecosystem, a greater need for collaboration, and usability for a broader community of stakeholders, DOORS NG plans to extend its capabilities for requirements change management and Product Line Engineering. Packaging DOORS NG with DOORS helps our customers try both products without purchasing two licenses.
New in Rational DOORS
We have now included four more pre-configured templates from Systems and Software Engineering to help our customers kick-start their projects.
We are continuing our investment in removing the requirement to install the Rational Publishing Engine (RPE) client for report generation. In DOORS 9.5.1, we have included support for parameterized RPE templates. Advanced styling and configuration options are also now included.
We have been making some significant improvements on the Open Services for Lifecycle Collaboration (OSLC) front, including an enhanced Rational DOORS-Design Manager integration. For this integration, we use link discovery rather than back-linking, which makes for a cleaner integration: links are stored only within the creating application, and discovery is done in the background in real time. This investment also helps improve our 3rd party integrations.
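The difference between back-linking and link discovery is worth spelling out. With back-linking, creating a link also writes a reverse link into the target artifact, so two applications must modify each other's data. With discovery, each application stores only the links it created, and incoming links are found by querying. A toy sketch of the discovery side (the stores and IDs are illustrative, not the actual OSLC mechanics):

```python
# Sketch of link discovery versus back-linking. Each application stores
# only the links it created; incoming links to an artifact are found by
# querying the other stores, not by writing a reverse link into the
# target. Store contents and artifact IDs are illustrative.

requirement_store = {"REQ-7": {"links": [("refines", "REQ-1")]}}
design_store = {"BLK-3": {"links": [("satisfies", "REQ-7")]}}

def discover_incoming(artifact_id, *stores):
    """Query every store for links that point at artifact_id."""
    incoming = []
    for store in stores:
        for source_id, data in store.items():
            for link_type, target in data["links"]:
                if target == artifact_id:
                    incoming.append((source_id, link_type))
    return incoming

# REQ-7 holds no stored back-link, yet its incoming link is discoverable:
assert discover_incoming("REQ-7", requirement_store, design_store) == \
    [("BLK-3", "satisfies")]
```

The practical benefit is that neither tool needs write access to the other's data just to keep link navigation symmetric.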
Starting with this version of DOORS, we will be using ETL (Extract, Transform and Load) to integrate with Rational Insight. Since this is the method used for integrating RRC and DOORS NG with Insight, the metrics capabilities remain the same across the products. This enables specific metrics defined in DOORS to be reused in DOORS Next Generation: you can deploy Insight over DOORS 9.x data and, while piloting DOORS NG, all the metrics data will be available in it automatically. Check this article for more details - Improve the value of your CLM reports by using metrics.
Based on feedback from customers, we have made a good number of usability enhancements in 9.5.1. Some of them include:
Link preview with OSLC-style rich hover on links for easier traceability navigation
Better navigation into baselines and improved baseline management
Improved support for DOORS table formatting
We have also made some significant improvements to DOORS Web Access. Some of them include
Simplified configuration and deployment
Improvements to Database Explorer
Support for the DOORS project view in the Database Explorer
New in Rational DOORS Next Generation
We have been continuing to make improvements to the product since its first release in November 2012. Our priorities are to focus on product quality and usability and, from a long-term perspective, on requirements configuration management. Some of the major updates to DOORS Next Generation are:
Note: Roadmaps and strategies mentioned in this post are subject to change; please stay in touch with your IBM representatives to understand the latest roadmaps.
VijaySankar 270000E5JQ Tags:  innovate2013 ibmrational ibm-champion guest rational ibm interview 7,510 Views
Alex Ivanov is a Senior Software Engineer II with Honors at Raytheon Integrated Defense Systems. Alex has more than 10 years of experience as a Requirements (DOORS) Database Manager supporting a large-scale distributed requirements database in the aerospace and defense industry, specializing in writing reusable DXL, training, user support, and consulting with programs to ensure they get the most out of their use of IBM Rational DOORS. Alex is an IBM Certified Deployment Professional - DOORS v9 and has been recognized as a three-time IBM Champion (2011, 2012, 2013). In 2011 Alex was elected President of the New England Rational User Group.
1. How does it feel to be a returning IBM Champion?
The practical application of traceability Part 2: What’s really going on when you plan V&V against a requirement?
VijaySankar 270000E5JQ Tags:  ibm requirements-engineering ibmrational rational requirements traceability requirements-traceability guest 8,692 Views
In this guest blog post, Requirements Engineering Expert, Jeremy Dick continues with his discussion on practical applications of traceability. Read the first part here -
Inspired by recent experience in a large systems engineering project, Part 1 of this essay covered the practice of decomposing requirements, which brings about one of the most important traceability relationships in requirements engineering. Part 2 here covers the next most important relationship: that between requirements and validation and verification activities. Part 3 will continue the discussion of V&V, and how it itself gives rise to further requirements.
Verification & Validation (V&V)
I don’t care enough about the difference between validation and verification to want to enter into the divisive debate about it here. I am just going to say V&V and be done with it!
Kinds of V&V activity
There are many kinds of V&V activity, and organisations have varied ways of classifying them. In the project I am working on, the classifications are Analysis, Analogy, Inspection, Review, Test and Demonstration.
By their very nature, these types of activity tend to occur at different times of the life-cycle. Analysis, for instance, tends to occur early to predict properties of the proposed design and verify it against requirements. By contrast, demonstration tends to occur late as part of the acceptance tests.
Typically, a whole series of activities will be planned against a single requirement, some early, some late, allowing confidence to accumulate over the life-cycle of the project.
Requests for evidence
Despite the variety of kinds of activity, there is one thing they all have in common: they are requests for evidence of some kind or other. Indeed, I would favour calling V&V activities exactly that: “requests for evidence”.
Intention versus Fulfilment
Those activities that are carried out early in the development process provide evidence that the intended design will meet the requirements – they address design intention. Those activities applied late in the development process collect evidence that what has been built meets the requirements – they address design fulfilment.
Once the need for V&V activities has been established, this will often give rise to new requirements, either on the design of the product itself, or requirements for the construction of test artefacts, such as models and test equipment. (We never build just the product; there are always other things that need designing and building that surround the system for various purposes.)
The management of requirements arising from V&V will be the topic of Part 3.
Requirements decomposition and V&V planning
When planning V&V activities against a parent requirement, you need to take into account the V&V that will be carried out on its child requirements, and their child requirements, and so on.
Take, for instance, the following example where a user requirement is decomposed into a number of system requirements:
The only V&V activity planned against the user requirement is a commissioning test, which will occur late in the life-cycle. However, further V&V activities are defined against the child system requirements. Some of these are design inspections that occur very early, and some are system tests that occur relatively late, but still before commissioning.
There is, of course, a sense in which all these V&V activities provide evidence for the satisfaction of the user requirement, but some of the activities fit more directly against the system requirements. So when planning V&V activities, you need to ask the question: what activities can only be carried out against the parent requirement, and which can be delegated to child requirements? – because those that can be delegated are likely to provide evidence earlier in the life-cycle. And you always want that, if you can get it.
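The delegation question above can be framed computationally: for each requirement, what is the earliest life-cycle phase at which some evidence becomes available, counting V&V planned against its child requirements as well? A small sketch under assumed phase names and an illustrative decomposition (the tea-urn-style example data here is hypothetical):

```python
# Sketch of V&V delegation: evidence for a parent requirement includes
# V&V planned on its children, and delegated activities typically run
# earlier in the life-cycle. Phase ordering and all data are illustrative.

PHASE_ORDER = {"inspection": 1, "system test": 2, "commissioning": 3}

children = {"UR-1": ["SR-1", "SR-2"], "SR-1": [], "SR-2": []}
planned_vv = {
    "UR-1": ["commissioning"],              # only late V&V planned directly
    "SR-1": ["inspection", "system test"],  # early and mid-cycle evidence
    "SR-2": ["system test"],
}

def earliest_evidence(req):
    """Earliest phase producing evidence for req, including delegated V&V."""
    phases = list(planned_vv.get(req, []))
    for child in children.get(req, []):
        phases.append(earliest_evidence(child))
    return min(phases, key=PHASE_ORDER.__getitem__)

# Delegation pulls the first evidence for UR-1 forward from commissioning:
assert earliest_evidence("UR-1") == "inspection"
```

This is exactly the "and you always want that" point: evidence delegated downward arrives earlier, so confidence accumulates sooner.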
Granularity of V&V results
In the above example, there is one V&V activity that is linked to multiple requirements. In general, the relationship between requirements and V&V activities will be many-to-many.
However, this presents an issue when it comes to collating results of V&V against requirements. The System Test defined above may show positive results for filling, boiling and dispensing, but fail on the time taken to recover (cool down). So it has passed on all requirements except one. In terms of granularity of information, we need to record the result of the V&V activity against each linked requirement.
How is it best to do that? The only place to do that in the information model of the example is on the “verifies” links; there is a link for every requirement-V&V pair.
Another way is shown in the next example:
Here we have separated out the success criteria for each requirement for each test by adding subsidiary objects under the V&V activities (for instance, using the DOORS object hierarchy). Each success criterion has exactly one link to a requirement; a link from a criterion is implicitly a link from the V&V activity. (This link could be made explicit by retaining a link from the activity as well – not shown in the diagram.)
Now we have objects rather than links against which to record the results of the V&V activity (using an attribute of that object). This has the added advantage that it encourages a discipline of identifying precisely what the success criterion is for each requirement against each V&V activity. In addition, the V&V Activity and its list of success criteria can be used as a description/checklist for each particular test.
As results come in, the success/failure status on the success criteria can be rolled up through the “verifies” links to the associated requirements, and then on up through the “satisfies” links to the parent requirements. Both these relationships allow results to be summarised through the eyes of the requirements at every level.
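The roll-up described above is mechanical once the links are in place: criterion results flow over "verifies" links to requirements, then over "satisfies" links to parents, with a requirement passing only if nothing beneath it has failed. A sketch with illustrative IDs, handling one level of decomposition for brevity:

```python
# Sketch of rolling up V&V results. Success-criterion results propagate
# over "verifies" links to requirements, then over "satisfies" links to
# parent requirements; any failure below marks the requirement failed.
# All identifiers are illustrative; one decomposition level for brevity.

criterion_results = {"SC-1": "pass", "SC-2": "pass", "SC-3": "fail"}
verifies = {"SC-1": "SR-1", "SC-2": "SR-2", "SC-3": "SR-2"}  # criterion -> req
satisfies = {"SR-1": "UR-1", "SR-2": "UR-1"}                 # child -> parent

def roll_up(criterion_results, verifies, satisfies):
    status = {}
    # "verifies" links: a requirement fails if any linked criterion failed
    for crit, req in verifies.items():
        if criterion_results[crit] == "fail":
            status[req] = "fail"
        else:
            status.setdefault(req, "pass")
    # "satisfies" links: a parent inherits any failure from its children
    for child, parent in satisfies.items():
        if status.get(child) == "fail":
            status[parent] = "fail"
        else:
            status.setdefault(parent, "pass")
    return status

# SR-2 fails on its cool-down criterion, so UR-1 cannot yet pass:
assert roll_up(criterion_results, verifies, satisfies) == \
    {"SR-1": "pass", "SR-2": "fail", "UR-1": "fail"}
```

With success criteria modelled as objects, each result is recorded once and every requirement-level summary falls out of the link structure.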
V&V planning steps
These are the process steps we teach for planning V&V against requirements. They are numbered so as to continue from the process steps named in Part 1:
So this is what we now teach those engaged in planning and tracing V&V against requirements, in conjunction with requirements decomposition. It is wrong to assume that people will somehow automatically know how to do this kind of thing. By taking this approach, the V&V plan is well organized, defined at the most appropriate layers, with success criteria defined, and ready for the collection and roll-up of results.
Read the first part here - The practical applications of traceability Part 1: What’s really going on when you decompose a requirement?
About the author - Jeremy Dick works as Principal Analyst for Integrate Systems Engineering Ltd in a consultancy, research and thought leadership capacity. He has extensive experience in implementing practical requirements processes in significant organizations, including tool customization, training and mentoring. At Integrate, he has been developing the concept of Evidence-based Development, an extension of his previous work on "rich traceability". Prior to this appointment, he worked for 9 years in the UK Professional Services group at Telelogic (now part of IBM Rational), both as an international ambassador for Telelogic in the field of requirements management and as a high-level consultant for Telelogic customers wishing to implement requirements management processes. During this time, he developed considerable expertise in customizing DOORS using DXL to support advanced engineering processes. His roles in Telelogic included a position in the DOORS product division to assist in the transfer of field knowledge to the product team. Co-author of a book entitled "Requirements Engineering" that has recently reached its 3rd edition, he is recognized internationally for his work on traceability. Jeremy can be reached at jeremy.dick[at]integrate.biz
VijaySankar 270000E5JQ Tags:  guest traceability clm ibm collaboration requirements-management requirements_management requirements-engineering analysis agile rational-doors doors requirements-traceability interview requirements communication rational innovate2013 requirements-analysis alm business-analysis ibm-champion 8,109 Views
Today we have with us Mia McCroskey at Emerging Health Montefiore Information Technology who was recognized as IBM Champion this year. She shares with us her thoughts about the requirements management domain.
Welcome to the IBM family! It's a great pleasure to have you with us as an IBM Champion. Congratulations! How do you feel?
I am honored to be recognized in this way not just this year but for the past several years that I have been asked to present my team's stories at Innovate. It may seem like our little team's requirements management needs are nothing like those of customers with huge DOORS ecosystems. But really we are an R&D site for the evolution of requirements management techniques and strategies. If a member of my team has an interesting idea for how to capture, structure, track, or trace requirements, we can try it without getting high level approvals or disrupting the work of hundreds of people. I take tremendous satisfaction from sharing our successes in a way that may help others improve their best practices.
Can you tell us something about what you do at Emerging Health, Montefiore Information Technology?
Emerging Health is primarily an IT delivery organization supporting healthcare delivery in the Bronx. Montefiore Medical Center is our parent company. My team, Product Development, is a software development shop tucked away in a corner doing very different work from most everyone else. Our application, Clinical Looking Glass, is a browser-based clinical intelligence tool that gives clinicians access to the enormous wealth of patient data gathered by all the other systems. Our end users can get an answer to a question like "are my clinic's diabetic patients getting the level of follow-up care required by our funding sources?" in a few minutes. In most healthcare environments, getting the data that you need to answer this question takes weeks.
My title is Manager, Product Development Lifecycle because I have my fingers on just about every stage of that lifecycle. Specifically, I lead our Requirements Management, Quality Assurance, Education, Support, and Implementation teams. I also manage outsourced development work, manage client relationships, and do hands on end-user training and support. We are an extremely team-oriented organization, with two formal development scrum teams and two teams that are at various stages of adopting an agile process. I'm deeply involved in that right now: it's challenging to apply Scrum to a training and support team, and to a team of data analysts and engineers.
Having said all that, my roots are in requirements. On a team of "happy path" stakeholders who get very excited by ideas for new functionality, I love to dig in and find the challenges that nobody wants to think about -- before we start coding. Sometimes I feel like a real buzz kill!
What are your thoughts on managing requirements effectively?
Agile models have forced us to completely reorganize our requirements elicitation, analysis, and management processes. But one thing that has not changed is my belief in a comprehensive, functionally organized requirements model.
But when I say "comprehensive," I don't mean laboriously detailed. The art of requirements analysis and management is in knowing -- or guessing right -- what details are going to be important later. By later I don't mean to the coders and testers in this iteration, I mean when we want to revise or augment the feature in six months. The requirements analyst has to be deeply plugged in to the business goals and vision in order to predict the future and capture the least amount, but most important information about what the team is doing.
A few years ago I spent many months constructing an "as built" specification for a market data system at the New York Stock Exchange. I had the original ten-year-old spec and a few dozen incremental release documents. We were getting ready to refactor the system and nobody knew every business rule and function. I vowed that no system I worked on would ever lack a spec that described everything it currently did. Incremental requirements specs that aren't integrated into the overall system are defects waiting to be discovered once the coders get ahold of them.
Having argued that I must add that a requirements model in document form is pretty nearly impossible to maintain in the way I describe. You've got to employ a database tool that supports the granularity of each requirement and allows you to describe each one through attributes. Then you can use filters and queries and views to present an infinite number of customized specifications -- all of the requirements implemented in a specific release, or all of the requirements related to a specific functional area, or the completed requirements related to a specific business objective or corporate mandate.
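The "views as queries" idea above is worth a concrete illustration: when each requirement carries attributes, a customized specification is simply a filter over those attributes. A minimal sketch, with hypothetical attribute names and values rather than any real project's data:

```python
# Sketch of attribute-driven requirement views: each requirement is a
# record with attributes, and a "specification" is just a query over
# them. All attribute names, values, and requirement text are illustrative.

requirements = [
    {"id": "R-1", "text": "Mask patient identity on request",
     "release": "4.2", "area": "security", "status": "complete"},
    {"id": "R-2", "text": "Export cohort report to CSV",
     "release": "4.2", "area": "reporting", "status": "in work"},
    {"id": "R-3", "text": "Audit all data access",
     "release": "4.1", "area": "security", "status": "complete"},
]

def view(reqs, **attrs):
    """All requirement IDs whose attributes match every given key/value."""
    return [r["id"] for r in reqs
            if all(r.get(k) == v for k, v in attrs.items())]

# "Everything in release 4.2" and "completed security requirements" are
# two different specifications generated from the same underlying model:
assert view(requirements, release="4.2") == ["R-1", "R-2"]
assert view(requirements, area="security", status="complete") == ["R-1", "R-3"]
```

This is what makes the single integrated model maintainable: the documents are disposable projections, and only the granular requirements plus their attributes are the source of truth.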
I recently spoke with a software development professional who was very proud of his organization's highly structured and detailed requirements templates that captured every detail before any work began, "to be sure we deliver what's wanted." I felt like I was talking to a Tyrannosaurus rex. We all know that the day after you baseline that 400-page spec, it's already out of date.
Agile with its short increments and "only write down what you really need to" mentality can seem seductively freeing. When our organization adopted Scrum, I stuck to my requirements model guns, and sure enough a few months later we couldn't remember decisions that we'd made a few sprints back, nor even exactly which sprint we'd done the work in. Since we weren't supposed to be doing "heavy" requirements, we'd been coached to: use the system and see what happens; or sift through dozens of completed user stories and hope the detail we wanted was actually mentioned in the acceptance criteria (which it was not because it was an in-sprint decision); or try to find relevant test cases and check the expected results. Instead, I launched DOORS, went to the functional area related to the question, and checked our documented business rule. Done.
What are some of the challenges you see in Healthcare Informatics projects?
Deriving meaningful information from the electronic medical record is essential to justifying the cost of those systems. We're piloting the use of predictive analytics -- combining statistical methods with the mass of patient data collected every day at our parent medical center -- to predict outcomes at the population level. To do it you need a very wide range of data: blood pressure, height, and weight, smoking patterns, history of heart disease, current blood sugar level, and on and on. Just bringing all this data together is the first challenge. Next is the analytic tool -- that's CLG. Finally you need big iron to process it. Most local and regional healthcare providers don't have the funding for, say, Watson. My team spent last summer optimizing our hardware and software environment, and CLG itself, to handle analysis of larger data sets faster, but within a medical-center friendly infrastructure budget.
Another area of critical concern is patient information. The need to pool patient data for direct care as well as population research is supported by legislature and funding sources. But we are bound, both legally and ethically, to protect patient identity in every circumstance.
Clinical Looking Glass has the capacity to show patient contact information to users who have been granted permission to see it -- usually clinicians who are actively providing care and need to contact the patients. While this is a critical feature of CLG, we expect to have to develop more granular levels of access as new types of clients adopt the product. For example, our Regional Health Information Organization (RHIO) client has data from twenty-two healthcare institutions. Some patients have declined to have their identity shared across the organization. We have to build the capability to mask these patients' identity even to our users who have permission to see it.