Modified on by VijaySankar
We had a fantastic DOORS customer webcast panel on September 26, with experts from across industries talking about their experience using IBM Rational DOORS for requirements management. If we had to pick one success metric for the webcast, it would be that the discussion overshot its designated hour by thirty minutes because the conversation was so lively. Our panelists have graciously agreed to answer the remaining questions offline, and we are publishing their answers here.
If you missed this golden opportunity and are wondering whether you will get another chance: YES! Or, if you would like to listen to it again, we have posted the replay. You can listen to it anytime. Register here
How do you copy objects, along with their in-links and out-links, from one module to another?
Use a DXL script that captures the link information (which can be edited if necessary), then copy the objects, and then run another DXL script to re-create the links.
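The actual scripts are in DXL, but the underlying two-pass approach (copy the objects while recording an old-to-new identifier map, then re-create the links through that map) can be sketched in Python. All names and structures below are illustrative models, not the DOORS or DXL API:

```python
# Hypothetical in-memory model of the two-pass copy. "NEW-" ids stand in
# for whatever identifiers DOORS assigns to the copied objects.

def copy_module_with_links(objects, links):
    """objects: {old_id: text}; links: [(src_id, dst_id), ...].
    Pass 1: copy objects, recording an old->new id map.
    Pass 2: re-create links using that map."""
    id_map = {}
    new_objects = {}
    for old_id, text in objects.items():
        new_id = f"NEW-{old_id}"          # stand-in for the assigned id
        id_map[old_id] = new_id
        new_objects[new_id] = text
    # Re-create only links whose endpoints were both copied
    new_links = [(id_map[s], id_map[d])
                 for s, d in links
                 if s in id_map and d in id_map]
    return new_objects, new_links
```

The point of the intermediate map is that links recorded against the old identifiers stay valid even though the copies get new ones.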
Paul Lusardi has shared a couple of samples with us. If you are interested, contact us
What factors determine using a new database vs. a new project in an existing database to differentiate/segregate work?
The biggest criterion for using the same database is linking. If there are Projects in a database that you need to link to, the new Project should be in that same database. The default should be to use the same database so that you only have to maintain one set of users, groups, standards, and so on. The possible exception is if you have two distinct sets of security measures (HIPAA, DoD, or DoE cleared data); then you will want to segregate that out to a different server.
Does anyone on the panel publish documents directly from DOORS without add-on applications?
[Patrick] I use DXL scripts which export the data to RTF with some typical publishing formatting.
[Mia] I do. Whether you can depends on the demand for documents (do you need to produce them weekly? Daily? Quarterly?) and how critical their formatting is to your organization. We have configured views in most modules for output. A simple one has just the object number and the Object Heading/Object Text. For a full spec, we include trace columns for one or more linked modules, with object identifiers and even object headings or text in those columns. In the output you get the main object from the current module, followed by all the objects it's linked to. I keep in a label that specifies "in link" or "out link."

I used to painstakingly populate the Paragraph Style attribute in order to map to MS Word, but because our need for documents is very low I have stopped enforcing this. We have a set of MS Word templates with the title page, table of contents, other front matter, and a standard set of styles. Plain vanilla DOORS maps to basic Word styles fairly well.

I pick the appropriate view and filters and select Export to Word. I select the appropriate Word template and let it rip. I then do some minimal post-processing, like getting rid of the object IDs on headings. Obviously, if you're doing very long documents this would be impractical. In my previous job we had interns create a DXL script that handled the post-processing, so you can get to something usable without a huge investment.
How do you deal with sharing your database with the government or other contractors? Do you share read-only partitions?
[Paul] Prior to coming to NuScale, I co-developed the Unified RM Database, where we allowed outside access, using access control and company VPN login credentials to enable multi-company development and reviewer teams to look at the data. The questions would be: do you want them to see work in progress, or just a snapshot of baselined work? Perhaps a separate server with a restored baseline set would be best in that case. I will defer to an expert on baseline sets, though.
How do you prevent unintended links through the default DOORS Links module?
[Patrick] We use a link schema with pairings as necessary, and then create and delete (but not purge) the DOORS Links module, with a copy in every sub-Project/Folder, and leave it as the default link module. That way, users cannot create an unintended link between any module pair.
Paul Lusardi has shared a presentation with us. If you are interested, contact us
Is there a "Copy View" script available?
Paul Lusardi has shared a presentation with us. If you are interested, contact us
Please ask about doing risk assessment in DOORS: Do any of the panelists document FMEA or risk analysis in DOORS? How do they do it?
[Paul] Yes and no. Textbook FMEA? No. Using DOORS to analyze potential risks by means of risk-informed design? Yes, this is a "developing" activity.

[Patrick] We capture the risk status in DOORS, but not the actual assessment data.
Do any of the panelists integrate DOORS with other tools like Rational Quality Manager?
[Patrick] We are looking into it, but have not actually started using it in production. We are going to deploy DOORS/CQ by year end.

[Paul] Only RPE integrations are used here.
How has your work differed because you use SCRUM vs. when you did not?
Our pre-sprint requirements work is at a higher level – that is, not down in the functional weeds. We spend more time exploring new features from the user’s perspective, and more time on storyboarding and similar activities. The features we’ve built since going to agile are much more user-friendly and have much better workflows than the older stuff, where much of the workflow and usability decisions were left to the engineers.
Detailed functional analysis of requirements happens during the sprint, with the requirements analyst on the team working closely with the engineers and testers. For very complex features, this analysis process begins before the sprint so that we have enough functional detail for the team to estimate the work. We have a small backlog grooming team, made up of members of the scrum teams, who help the requirements analyst with this.
For Mia: given your agile process, how many full-time admins are needed to maintain DOORS?
[Mia] We’re very small, so we don’t even have one full time admin. Our system engineer handles the environment, and I handle user administration. It’s a tiny part of each of our jobs. I don’t see that our approach represents a particularly different administrative need than any other. If we had a hundred users and multiple DOORS databases, then we’d dedicate administrative resources just like any other implementation.
@Mia - Are requirements for HW to SW interfaces traced through ICD(s) in DOORS?
We’re software only, so no HW/SW interfaces. However, I strongly support the use of interface specifications between software systems and have used an interface module to sit in between systems that need to interact. The trick is to stay on the requirements side of the requirements/design line. The cost of maintenance on such a model is high – probably too heavy for our agile process to maintain. However, should we need to support integration with an external application (one that our teams are not developing), our API requirements will undergo close scrutiny and possibly be carved out into a separate module so that we can use trace links to do impact analysis on external “clients.”
Does DOORS lend itself to any specific DevOps or ALM methodology, like agile?
We adopted a continuous deployment model this year and it had no impact on our use of DOORS. That is, DOORS is still a critical part of our development tooling. We were already capturing release version information for requirements in DOORS, and this information remains very useful in understanding what feature set exists in a given version of our application.
Try IBM Rational DOORS in a Sandbox
Webcast - Achieving sustainable requirements across the supply chain with IBM Rational DOORS
David Norfolk of Bloor Research, a leading analyst and systems engineering expert, recently published a white paper titled Reducing the risk of development failure with cost-effective capture and management of requirements. In this report, David delves into the relevance of requirements management as a discipline and puts forth his views on how the domain is changing with the advent of new development paradigms such as agile, mobile, and DevOps.
If enterprise architecture helps to bridge the gap between business strategy and vision and its implementation in technology, from the CEO’s point of view, Requirements Management continues to help bridge the gap between business and technology at a lower level.
The report provides valuable insights through its deep coverage of why requirements management is more relevant than ever, the issues associated with managing requirements, the challenges facing the discipline, best practices from his experience, and his thoughts on the capabilities of an ideal requirements engineering tool. Some of the topics the whitepaper discusses are:
Challenges in requirements management
Managing changing requirements
Scope and ideal capabilities of a requirements engineering tool
Real-life examples of benefits from investing in requirements
Read the whitepaper here - Reducing the risk of development failure with cost-effective capture and management of requirements
This year, Innovate - The IBM Technical Summit will be held from June 2 to 6 in Orlando, Florida. As in previous editions, we will have two tracks dedicated to requirements management at the event. One track will focus on requirements definition and management for IT/application development, and the other will focus on requirements engineering for systems and product development. Each track will have 16 sessions, starting with the individual track kickoffs. This year the conference will feature personalities like Steve Wozniak, Eric Ries, and many others.
The requirements management track will showcase more than 25 real-life case studies and best practices from across industries, ranging from banking to insurance to aerospace & defense to energy & utilities. Some of the interesting sessions to watch out for are:
Agile Requirements: Maintaining the Model Through Iterations (by Mia McCroskey, IBM Champion at Emerging Healthcare IT)
DOORS Next Generation: 1st Impressions and Key Comparisons (to IBM Rational DOORS) [by Alex Ivanov, IBM Champion at Raytheon]
Successful RRC adoption through a Community of Practice: Case Study from Blue Cross Blue Shield of North Carolina
Cerner Corporation: Migrating from Requisite Pro to RRC
Piloting Rational Requirements Composer for Enterprise-wide deployment: The Good, the Bad, and the Ugly (Fidelity)
Managing Parallel Streams of Requirements in DOORS at an Automotive OEM
A DOORS-based solution for semi-automated risk analysis within MedTech
We also have two instructor-led workshops for Rational DOORS and Rational Requirements Composer. Learn more here. Demo booths and self-paced tutorial labs for DOORS, DOORS Next Generation, and RRC will also be available during the event.
A few other attractions for anyone interested in the field of requirements management are the panel discussion, Meet the Developer sessions, and sessions from organizations like IAG Consulting, IIBA, and INCHRON.
Every year Innovate attracts 4,000+ professionals from across the world. Join us for one of the biggest technical conferences to learn, meet, and network with like-minded technical professionals. For more details about the conference, visit Innovate 2013 - The IBM Technical Summit
Here is the detailed agenda of requirements management topics at Innovate -
P.S.: Agenda subject to change
Jeremy Dick works as Principal Analyst for Integrate Systems Engineering Ltd in a consultancy, research and thought leadership capacity. He has extensive experience in implementing practical requirements processes in significant organizations, including tool customization, training and mentoring. At Integrate, he has been developing the concept of Evidence-based Development, an extension of his previous work on “rich traceability”. Prior to this appointment, he worked for 9 years in Telelogic (now part of IBM Rational) in the UK Professional Services group as both an international ambassador for Telelogic in the field of requirements management, and a high-level consultant for Telelogic customers wishing to implement requirements management processes. During this time, he developed considerable expertise in customizing DOORS using DXL to support advanced engineering processes. His roles in Telelogic included a position in the DOORS product division to assist in the transfer of field knowledge to the product team. Co-author of a book entitled “Requirements Engineering” that has recently reached its 3rd edition, he is recognized internationally for his work on traceability. Jeremy can be reached at jeremy.dick[at]integrate.biz
It is a bit of a shock to find myself well into the fourth year on the same project! The nature of my work as a consultant means that it is rare for me to stick with a project beyond the initial phases of defining a requirements management process, establishing effective tool support and training the process enactors. But this time we have been able to stick with the requirements team supporting a large project long enough to see theory put into practice, and to see what it really means to apply the tools and techniques. We have gone well beyond just training, and find ourselves mentoring nearly 300 engineers in the application of DOORS for requirements capture, development and management. This has helped us keep our feet firmly on the ground – rubber on the road – and to walk with those who actually have to do the work.
So what have we learnt?
It is one thing to teach people how to write requirements statements that are clear, unambiguous, testable and traceable; it is quite another thing to help people understand how to take a requirement and develop it. None of the engineers we met on the project had previous experience of how to take a system requirement, for instance, and systematically decompose it through the design into sub-system and component requirements. We had to adapt our training and mentoring to address this skill.
Requirements decomposition establishes one of the essential requirements traceability relationships: how each layer of requirements contributes to the satisfaction of the layer above. This is often known as the satisfaction relationship, or as refinement in SysML. It is this relationship that connects the design to the development of requirements, and that lies at the heart of the ability to perform impact analysis.
Whatever layer you are engaged in – customer, system, sub-system or component – the same basic requirements development process can be applied. These are the process steps we teach for requirements development:
1. Collect and agree your requirements.
Here you actively seek out the requirements you are expected to fulfil. In an ideal world, development will be perfectly top-down, and you can wait for the layers above to allocate perfectly expressed requirements to you. In a practical world, you will have to be more proactive, and will have to cooperate with your requirement “customers” to obtain acceptably worded requirements.
2. Design against your requirements.
This stage is the creative bit that you perhaps most enjoy doing, and where the real engineering takes place. You will imagine how to design the system to meet the requirements, and what you need your requirement “suppliers” to do to contribute to your design. In other words, if you are at the system level, you design the system into sub-systems, and work out what each sub-system must do to meet the system requirements.
3. Decompose the requirements to reflect the design.
Now you enter the decomposed requirements and trace them back to the requirements they satisfy, thus making the requirements contents and traceability reflect your design. The wording of the decomposed requirements is important: if the original requirement read “The <system> shall ...”, then it is likely that the decomposed requirements will read “The <sub-system> shall ...” At this stage you can also capture rationale for the decomposition, including references to the design documentation or models, thus tracing also to design information.
4. Allocate the decomposed requirements.
Finally, you can pass on the decomposed requirements to those areas responsible for fulfilling them. These other areas engage in the same process, and together you systematically achieve alignment of requirements through all the layers.
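As a rough illustration of step 3, here is a small Python model (not DXL; all names, identifier formats, and field names are hypothetical) of entering decomposed requirements that each carry a satisfaction trace back to the parent, with the wording shifted from the system to the sub-system:

```python
# Illustrative model of requirements decomposition with satisfaction traces.

def decompose(parent_id, sub_system, sub_texts):
    """Create sub-system requirements, each tracing back to the parent
    system requirement it helps satisfy."""
    children = []
    for i, text in enumerate(sub_texts, start=1):
        children.append({
            "id": f"{parent_id}.{i}",
            "text": f"The {sub_system} shall {text}",
            "satisfies": parent_id,       # the satisfaction trace link
        })
    return children

# Decomposing a hypothetical system requirement "SYS-12"
reqs = decompose("SYS-12", "logging sub-system",
                 ["record each transaction", "timestamp each record"])
```

Because each child records what it satisfies, impact analysis reduces to walking these trace links in either direction.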
The example below illustrates the end result of applying this process on a user requirement decomposed into system requirements. The rounded box contains the design rationale, and refers to a functional model of the product. (If you’ll forgive the shameless plug, such diagrams can be produced using a DOORS extension related to TraceLine. Ask me more if you are interested.)
The example just shown is a classic decomposition pattern. It is actually the decomposition of an overall performance into a combination of capacity and performance attributes of the product. We call this “decomposed”.
Other decomposition patterns are possible. Sometimes no decomposition is necessary, because the system requirement can be satisfied entirely by a single component or sub-system, as in the left-hand example below; or a constraint that will apply universally to all parts, as in the right-hand example. All that changes, perhaps, is the wording of the requirement to indicate the new target. We call this “direct flow”.
You will reach a point in the cascade of decomposed requirements when a requirement is satisfied entirely within the current area without the need to decompose further, as illustrated in the following. In this case, it is important to state the rationale for not flowing the requirement onwards, as otherwise it may be construed as a traceability gap. We call this “not developed further”.
I have seen a number of organisations that capture this flow-down type – Decomposed/Direct Flow/Not developed further – as an attribute of the requirement. We do this because it allows us to cross-check certain things: if you have marked a requirement as “Decomposed” but have not decomposed it, then that does indicate a design gap. However, if you mark the requirement as “Not developed further”, that gives you permission not to trace it further, i.e. not a design gap. (But you would do well to provide rationale!)
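That cross-check is easy to automate. Here is a minimal Python sketch of the rule; the attribute values match the flow-down types above, but the field names are invented for illustration (in practice this would be a DXL script over the module's link sets):

```python
# Cross-check the flow-down attribute against actual decomposition links.

def check_flow_down(requirements):
    """requirements: {req_id: {"flow_down": str, "children": [child ids]}}.
    Returns the ids flagged as potential design gaps."""
    gaps = []
    for req_id, req in requirements.items():
        if req["flow_down"] == "Decomposed" and not req["children"]:
            gaps.append(req_id)   # claims decomposition, but no trace exists
        # "Not developed further" grants permission for zero children, so it
        # is never flagged (though rationale should still be recorded).
    return gaps
```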
As requirements flow down through the layers, the complexity of the design becomes evident in the shape of the requirements graph. In general, satisfaction is a many-to-many relationship between requirements, and the figure below shows how this may be manifest. As the requirements are decomposed, they are refactored through the design.
Some patterns are questionable. Take these for instance:
Why decompose something into three requirements only to reduce them to a single one again? Or why collapse three requirements into one only to re-expand them in the next layer?
These patterns are not necessarily wrong, but they should be targeted for careful review.
So this is what we now teach those engaged in requirements decomposition, flow-down and traceability. It is wrong to assume that people will somehow automatically know how to do this kind of thing. By taking this approach, the flow-down of requirements reflects the design, and a clear satisfaction relationship is expressed in the traceability.
Read the second part here - The practical application of traceability Part 2: What’s really going on when you plan V&V against a requirement?
Requirements come in a variety of forms and levels of detail. Well-defined, well-thought-out requirements have been shown time and time again to reduce development time, increase quality, and lower the cost of a software project. Projects that lack specific requirements quickly fall into many of the typical traps of project failure.
In today's fast-paced era of agile and mobile, some teams question whether managing requirements is still relevant. Well, I say teams still need to know what they are supposed to do. The developer may have ideas, but in order to deliver something useful there must be a consensus on functionality and needs. And teams need more than just a general idea scribbled on a sticky note; they need to know exactly what the business and customer expect. Developers should not be expected to fill in the gaps with assumptions. Clear definitions of requirements actually accelerate the development process by reducing uncertainty and wasteful work.
Another point of view argues that requirements only have to be defined once, at the start of a project, and need not be worried about again during development. Sure, you could consider that approach, but then we quickly realize that change is all around us. There is no way we can ever realistically say the requirements are "done". Requirements continually evolve even as development is underway, and therefore the team needs to stay attuned and make adjustments along the way. Otherwise the risk is that the end result, although matching the original requests, fails to meet the business need upon delivery. The only way to address this effectively is to realize that requirements are an integral part of the lifecycle that continually evolves with the business, and that the delivery team needs to stay in sync to deliver the expected results.
To ensure requirements are written for the entire lifecycle, it is important to realize that business needs come in many different forms, formats, and languages. Bringing them all together in a single place, removing redundancy, and connecting interrelated content is the first step in requirements definition and management. Capturing your stakeholder and user requirements is only the beginning of realizing your project. As your project progresses and evolves, these initial requirements give rise to other requirements which elaborate, satisfy, and verify them. The result is a web of interdependent artifacts and relationships which represents the nature of the dependencies between these artifacts. Navigating and understanding this web of information is critical to the success of your project.

Each stakeholder in a project may be interested in different information (e.g., defect backlog, current tasks, or release schedule). What helps here is a customizable dashboard displaying all the information stakeholders, developers, and testers need at any time. As a new project starts and progresses, many of the requirements you define and capture will change, be added to, or sometimes be removed through the course of development. You need the capability to track changes to requirements information so that you can understand what has taken place throughout that process.
To learn about automated solutions for managing requirements, I recommend listening to this informative webcast Three Reasons to Throw Away your Requirements Documents
In summary I want to remind business analysts and the rest of the software delivery teams, that in this fast-paced delivery environment today, requirements are more important than ever. They provide the roadmap of exactly what is expected of the team by the business and customers. Everyone on the team needs to appreciate that requirements are never final and that they naturally evolve with the market and business. Teams that can capture evolving requirements and implement them in a rapid and iterative fashion will find a competitive edge in the market. Automating requirements provides the additional key benefit of traceability and collaboration to identify gaps and the impact of changes while facilitating the engagement of experts on the team to improve the quality of results.
Just like the famous (mis)quote of Mark Twain, rumors of the demise of the requirements management tool IBM Rational DOORS are not only exaggerations but in fact untrue. I’d like to set straight here some of the myths and misunderstandings that I’ve heard and seen perpetuated:
Myth: IBM is discontinuing support and development for the DOORS 9.x series
Truth: IBM continues to develop the DOORS 9.x series with the same level of development resources. We have new functionality to announce in 2013 and plan further releases in the years ahead. In addition IBM has recently invested additional development resources in creating a new Requirements Management tool called IBM Rational DOORS Next Generation (DOORS NG).
Myth: IBM is replacing the DOORS 9.x series with DOORS Next Generation (DOORS NG) and expects customers to migrate now
Truth: DOORS 9.x will be developed in parallel with DOORS NG for many years to come. DOORS NG was released for the first time in November 2012. The tool has many functions DOORS 9.5 does not have (e.g., a fully functional web client, type and data re-use for requirements and attributes, team collaboration, and task management), but it also does not yet have all of the capabilities relied on by DOORS 9.x users. We encourage users to evaluate the capabilities of DOORS NG and start projects or move projects to DOORS NG when practical and beneficial to the project.
Myth: To move to DOORS Next Generation, I’ll need to buy a whole new tool
Truth: Customers with active support & subscription on their DOORS 9.x licenses are entitled to use DOORS Next Generation - it is included as part of the DOORS 9.x package. Customers can choose to use either DOORS 9.x, DOORS Next Generation or a combination of both. There is no need for additional purchases or even for an exchange of licenses. All licenses are available to customers in the IBM license key center.
Myth: IBM offer no migration path from DOORS 9.x to DOORS NG
Truth: IBM do offer functions to transition into using DOORS NG for existing DOORS 9.x customers:
Work with DOORS 9.x and DOORS NG alongside each other - both tools support linking of information between both databases.
Work with suppliers by exchanging requirements through the standard exchange format of ReqIF. This works best between DOORS 9.x and DOORS NG but is also designed to work with other RM tools.
Where DOORS NG projects need to be initialized with data from DOORS 9.x, we offer re-factoring functions to ease the transition of data from DOORS 9.x to DOORS NG.
Myth: Everyone knows IBM’s strategy on requirements management and DOORS
Truth: IBM takes great care to regularly communicate our product strategy and roadmap at trade shows, IBM conferences and on webcasts. If in doubt, come and ask IBM! Please submit questions using the comment feature here or contact your IBM representative.
For information about the products, visit -
IBM Rational DOORS
IBM Rational DOORS Next Generation
Today we have with us Mia McCroskey of Emerging Health Montefiore Information Technology, who was recognized as an IBM Champion this year. She shares with us her thoughts about the requirements management domain.
Welcome to the IBM family! It's a great pleasure to have you with us as an IBM Champion. Congratulations! How do you feel?
I am honored to be recognized in this way not just this year but for the past several years that I have been asked to present my team's stories at Innovate. It may seem like our little team's requirements management needs are nothing like those of customers with huge DOORS ecosystems. But really we are an R&D site for the evolution of requirements management techniques and strategies. If a member of my team has an interesting idea for how to capture, structure, track, or trace requirements, we can try it without getting high level approvals or disrupting the work of hundreds of people. I take tremendous satisfaction from sharing our successes in a way that may help others improve their best practices.
Can you tell us something about what you do at Emerging Health, Montefiore Information Technology?
Emerging Health is primarily an IT delivery organization supporting healthcare delivery in the Bronx. Montefiore Medical Center is our parent company. My team, Product Development, is a software development shop tucked away in a corner doing very different work from most everyone else. Our application, Clinical Looking Glass, is a browser-based clinical intelligence tool that gives clinicians access to the enormous wealth of patient data gathered by all the other systems. Our end users can get an answer to a question like "are my clinic's diabetic patients getting the level of follow-up care required by our funding sources?" in a few minutes. In most healthcare environments, getting the data that you need to answer this question takes weeks.
My title is Manager, Product Development Lifecycle because I have my fingers on just about every stage of that lifecycle. Specifically, I lead our Requirements Management, Quality Assurance, Education, Support, and Implementation teams. I also manage outsourced development work, manage client relationships, and do hands on end-user training and support. We are an extremely team-oriented organization, with two formal development scrum teams and two teams that are at various stages of adopting an agile process. I'm deeply involved in that right now: it's challenging to apply Scrum to a training and support team, and to a team of data analysts and engineers.
Having said all that, my roots are in requirements. On a team of "happy path" stakeholders who get very excited by ideas for new functionality, I love to dig in and find the challenges that nobody wants to think about -- before we start coding. Sometimes I feel like a real buzz kill!
What are your thoughts on managing requirements effectively?
Agile models have forced us to completely reorganize our requirements elicitation, analysis, and management processes. But one thing that has not changed is my belief in a comprehensive, functionally organized requirements model.
But when I say "comprehensive," I don't mean laboriously detailed. The art of requirements analysis and management is in knowing -- or guessing right -- what details are going to be important later. By later I don't mean to the coders and testers in this iteration, I mean when we want to revise or augment the feature in six months. The requirements analyst has to be deeply plugged in to the business goals and vision in order to predict the future and capture the least amount, but most important information about what the team is doing.
A few years ago I spent many months constructing an "as built" specification for a market data system at the New York Stock Exchange. I had the original ten-year-old spec and a few dozen incremental release documents. We were getting ready to refactor the system and nobody knew every business rule and function. I vowed that no system I worked on would ever lack a spec that described everything it currently did. Incremental requirements specs that aren't integrated into the overall system are defects waiting to be discovered once the coders get ahold of them.
Having argued that I must add that a requirements model in document form is pretty nearly impossible to maintain in the way I describe. You've got to employ a database tool that supports the granularity of each requirement and allows you to describe each one through attributes. Then you can use filters and queries and views to present an infinite number of customized specifications -- all of the requirements implemented in a specific release, or all of the requirements related to a specific functional area, or the completed requirements related to a specific business objective or corporate mandate.
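As a toy illustration of that idea, here is a Python sketch treating each requirement as a record of attributes and each customized specification as a query over them. The field names and values are invented for illustration; in DOORS this role is played by attributes, filters, and saved views:

```python
# Requirements as attributed records; specifications as queries over them.

requirements = [
    {"id": "R1", "area": "reporting", "release": "4.2", "status": "complete"},
    {"id": "R2", "area": "security",  "release": "4.2", "status": "draft"},
    {"id": "R3", "area": "reporting", "release": "4.1", "status": "complete"},
]

def view(reqs, **criteria):
    """Return the subset matching every attribute criterion: one
    'customized specification' per query."""
    return [r for r in reqs
            if all(r.get(k) == v for k, v in criteria.items())]

# All requirements implemented in a specific release
release_4_2 = view(requirements, release="4.2")
# All completed requirements in a specific functional area
completed_reporting = view(requirements, area="reporting", status="complete")
```

The same record set answers every query, which is exactly what a document-form specification cannot do without duplication.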
What are your thoughts on the role of requirements management in agile projects?
I recently spoke with a software development professional who was very proud of his organization's highly structured and detailed requirements templates that captured every detail before any work began "to be sure we deliver what's wanted." I felt like I was talking to tyrannosaurus rex. We all know that the day after you baseline that 400-page spec it's already out of date.
Agile with its short increments and "only write down what you really need to" mentality can seem seductively freeing. When our organization adopted Scrum, I stuck to my requirements model guns, and sure enough a few months later we couldn't remember decisions that we'd made a few sprints back, nor even exactly which sprint we'd done the work in. Since we weren't supposed to be doing "heavy" requirements, we'd been coached to: use the system and see what happens; or sift through dozens of completed user stories and hope the detail we wanted was actually mentioned in the acceptance criteria (which it was not because it was an in-sprint decision); or try to find relevant test cases and check the expected results. Instead, I launched DOORS, went to the functional area related to the question, and checked our documented business rule. Done.
What are some of the challenges you see in Healthcare Informatics projects?
Deriving meaningful information from the electronic medical record is essential to justifying the cost of those systems. We're piloting the use of predictive analytics -- combining statistical methods with the mass of patient data collected every day at our parent medical center -- to predict outcomes at the population level. To do it you need a very wide range of data: blood pressure, height, and weight, smoking patterns, history of heart disease, current blood sugar level, and on and on. Just bringing all this data together is the first challenge. Next is the analytic tool -- that's CLG. Finally you need big iron to process it. Most local and regional healthcare providers don't have the funding for, say, Watson. My team spent last summer optimizing our hardware and software environment, and CLG itself, to handle analysis of larger data sets faster, but within a medical-center friendly infrastructure budget.
Another area of critical concern is patient information. The need to pool patient data for direct care as well as population research is supported by legislature and funding sources. But we are bound, both legally and ethically, to protect patient identity in every circumstance.
Clinical Looking Glass has the capacity to show patient contact information to users who have been granted permission to see it -- usually clinicians who are actively providing care and need to contact the patients. While this is a critical feature of CLG, we expect to have to develop more granular levels of access as new types of clients adopt the product. For example, our Regional Health Information Organization (RHIO) client has data from twenty-two healthcare institutions. Some patients have declined to have their identity shared across the organization. We have to build the capability to mask these patients' identity even to our users who have permission to see it.
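The masking rule described above can be sketched roughly as follows. The function and field names here are hypothetical, not the actual CLG data model, but they capture the key policy: a patient's opt-out must override even a user's granted permission:

```python
def display_identity(patient, user_can_view_phi):
    """Return patient contact info only when the user is permitted AND the
    patient has not opted out of cross-organization sharing.
    (Illustrative rule only -- not the actual CLG implementation.)"""
    if patient.get("opted_out_of_sharing"):
        # Opt-out wins over any permission the user holds.
        return "<identity withheld at patient request>"
    if not user_can_view_phi:
        return "<identity masked>"
    return patient["name"]

p1 = {"name": "Jane Doe", "opted_out_of_sharing": False}
p2 = {"name": "John Roe", "opted_out_of_sharing": True}
print(display_identity(p1, user_can_view_phi=True))   # Jane Doe
print(display_identity(p2, user_can_view_phi=True))   # withheld despite permission
```

Ordering the checks this way (opt-out first, permission second) is exactly the granularity the post describes: access levels alone are not enough when individual patients can withdraw consent.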
On Thursday 14 March I presented an IBM-sponsored Dr. Dobb's webcast on the topic of ‘3 Reasons to Throw Away Your Requirements Documents’. If you didn’t catch the webcast, you can view it on-demand and download the slides. If you did attend, I hope you enjoyed it. In this blog post I want to answer some of the questions that I didn't get time to cover in the webcast, so scroll down to see if I've covered your question here. But first, for those of you who didn't make it, here's a quick recap of points I made during the webcast:
- What do I mean by ‘Throw Away Your Requirements Documents’? I’m not saying do away with your requirements or your requirements management process. What I am saying is: invest in improving your requirements management process and support those process improvements with the right tooling. In the webcast audience poll, 60% said they are using documents and/or spreadsheets for requirements, so I set out to provide the case for why they should consider moving to a requirements management tool.
- Reason #1 – Collaboration. Using documents and spreadsheets, you can get into all sorts of problems like working from the wrong version of requirements. An integrated requirements management tool (one that has a collaborative repository that has open interfaces to share requirements data with design, test and development environments) helps you ensure that all engineering / development team members and their stakeholders are working from the right version and view of requirements for their role. I shared information from a case study on MBDA Missile Systems where collaboration was a major challenge.
- Reason #2 – Traceability. Traceability helps you get context and an audit trail for decisions made, perform coverage analysis, detect gold plating and perform impact analysis (See blog posts ‘What is Traceability?’ and ‘The uses and value of traceability’ for more explanation). But using documents and spreadsheets to create and manage traceability gives it a bad name – it becomes a tedious, error-prone overhead activity. With an integrated requirements management tool traceability links can be easily created and navigated. Traceability becomes part of the process rather than a bulk catch-up exercise, and through open interfaces it becomes easy to extend traceability from requirements to artifacts in other tools such as designs, work items and test cases. And most importantly there are automated views and reports that enable you to make active use of traceability for the purposes I mentioned above. I shared information from a case study on Invensys Rail Dimetronic where traceability is essential.
- Reason #3 – Agility. As many engineering and development organizations are looking to improve time to market and reduce costs, agile approaches are becoming increasingly popular (in the webcast audience poll 56% said they were using some sort of hybrid waterfall/iterative approach and 20% were agile), and with that comes changes to the way requirements might have been traditionally defined and, most importantly, to the way that changes to requirements are managed. Change is allowed and encouraged, but you must have reviews and impact analysis to make informed decisions about whether to accept the change and implement it in the current iteration or sprint, plan it into a future iteration, or reject the change altogether. And to do that requires more than just a backlog managed in a spreadsheet or an agile planning tool. At IBM Rational we keep a prioritized backlog of user stories and epics as work items in plans managed in Rational Team Concert. User stories and epics are then decomposed and more fully described by requirements and supporting visual and textual artifacts created and managed in Rational DOORS Next Generation.
- And I started and concluded by looking at the most compelling reason for improving your requirements management process with the support of the right tooling: Return on Investment. I shared information from a case study on Emerging Health, Montefiore IT where they achieved, among other benefits, a 69% reduction in the cost of quality (test preparation, testing and rework) within 6 months of deploying an improved requirements management process supported by IBM Rational DOORS. There’s also a follow-up video to this case study that was recorded after the client had adopted agile development practices.
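The impact-analysis side of traceability described in Reason #2 comes down to walking a graph of links from a changed requirement to everything downstream of it. A hedged Python sketch, with a made-up link graph standing in for whatever a real tool stores:

```python
from collections import deque

# Hypothetical traceability links: artifact -> downstream artifacts.
# Real tools maintain these links in the repository; this dict is illustrative.
links = {
    "REQ-1": ["DES-1", "TC-1"],
    "DES-1": ["CODE-1"],
    "REQ-2": ["TC-2"],
}

def impacted(artifact, links):
    """Breadth-first traversal of downstream links: everything that may
    need review if `artifact` changes."""
    seen, queue = set(), deque([artifact])
    while queue:
        node = queue.popleft()
        for child in links.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)

print(impacted("REQ-1", links))  # → ['CODE-1', 'DES-1', 'TC-1']
```

A change to REQ-1 flags not just its design element but also the code and test case hanging off that design; doing this by hand across spreadsheets is exactly the tedious, error-prone overhead the post warns about.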
OK, now on to some of the questions that were submitted but that I didn’t have time to answer during the webcast. I’ve divided them into sections based on the type of question so you can easily scan down to the topics you’re interested in:
Alternative approaches to managing requirements
Q. What's the advantage of a requirements management tool if I could link my change management tickets to fine grained requirements (use cases, storyboards) that are maintained on a wiki or other collaboration tool (e.g. Sharepoint or IBM Connections)? It would seem that the work items would give you the needed traceability and metrics?
A. While you might be able to create the level of traceability you require, how would you report on it? Would you have to build bespoke views and reports? And would the use of a wiki for ‘fine grained requirements’ provide you with a view of requirements in context with one another, as opposed to individual wiki pages? Where would you document additional properties of requirements? Could you easily reuse requirements across projects? A requirements management tool like IBM Rational DOORS Next Generation provides built-in traceability views and reports, enables you to structure requirements in context with one another in document-like views, provides user-defined properties for recording additional information and facilitates reuse of requirements.
Requirements Management and Agile Development
Q. We are mostly using the waterfall model, but are looking into whether Agile works for our customers. I am very interested in the role of a Requirements Analyst in the Agile world. In our world, the analyst is a facilitator of requirements gathering and not a SME on the business application we are gathering requirements for.
A. That facilitation role and the analysis skills of the Requirements or Business Analyst are still essential in Agile development. A common mistake when moving to more agile approaches is appointing a ‘Product Owner’, who is a subject matter expert in the business domain you are building an application or product for, and expecting them to write the requirements or ‘user stories’. While they are experts in the business and you need that expertise on hand for Agile to work, they are not usually skilled in getting to the root goals or needs of the business problems you’re looking to address with the product or application. Without the skills of the Requirements or Business Analyst, it’s all too easy for the user stories to become about automating the way things are done today rather than addressing the real business issues in an optimal way. You can read more about the role of the analyst in Agile development in an interview with Mary Gorman of EBG Consulting.
Q. How does the IBM requirements toolset compare to more "agile" focused toolsets like Atlassian, Rally, etc.?
A. IBM is very committed to supporting agile development, particularly when agile is scaled to large, distributed development teams. At the heart of our agile development capabilities is Rational Team Concert, which supports agile planning, task management and change management. As stated in the webcast, though, we’ve found in our own continuous delivery process, and heard from clients, that a place is needed to capture more details of requirements and their associated properties, in context with one another, together with supporting artifacts like storyboards, workflow scenarios and use cases. The integration of Rational Team Concert with Rational DOORS Next Generation provides those additional capabilities while preserving traceability to user stories and epics managed in the product backlog.
IBM Requirements Management solutions
Q. It seems this discussion is on Rational DOORS. Is Rational Requisite Pro still offered? If so then how do these two products compare?
A. IBM continues to support, maintain and respond to enhancement requests for Requisite Pro, but our future direction for requirements management for IT application development lies with Rational Requirements Composer, and we provide migration support and a trade-up program. Please contact your IBM representative for more details. If you are using Requisite Pro for requirements management for complex products or embedded systems development, then you might also want to look at whether Rational DOORS or Rational DOORS Next Generation is the right move forward for your organization. But don’t worry, we’re not forcing you to migrate today.
Q. You've mentioned DOORS Next Generation in your presentation, what is that? Does it replace DOORS?
A. IBM Rational DOORS Next Generation is a requirements management application on a collaborative lifecycle management platform for systems and software engineering that provides requirements collaboration, planning, reuse and lifecycle traceability. DOORS Next Generation (DOORS NG) was introduced in 2012 to take advantage of the common, collaborative ‘Jazz’ platform shared by Rational Team Concert, Rational Quality Manager, Rational Design Manager and Rational Requirements Composer; and to extend the requirements management capabilities in Requirements Composer to meet the needs of product & systems development organizations developing complex and/or embedded systems. DOORS NG will enable IBM to introduce new capabilities faster than would have been possible with the existing DOORS product. However, DOORS NG does not replace DOORS today. We have a very large install base of DOORS users working on programs that can last for decades, and we will continue to support, maintain and enhance the DOORS 9.x series to meet the needs of those users. We encourage existing DOORS users to take a look at DOORS NG, to try it out on pilot projects and to use the interoperability capabilities to exchange and/or link data with DOORS 9.x. Existing DOORS customers with active support & subscription are entitled to use DOORS NG without an additional purchase – you can use your existing DOORS license entitlements with either DOORS or DOORS NG, or a combination of the two. But we are not telling existing DOORS users to migrate live projects to DOORS NG today. DOORS NG in its early releases is attractive to organizations that don’t currently use a requirements management tool and are looking for a web-based solution that also offers common platform integration with change management, agile planning, test management and design management capabilities. You can download a trial and follow development plans for DOORS NG on Jazz.net.
Q. How does Rational DOORS Next Generation compare to Rational Requirements Composer?
A. DOORS Next Generation is based on Rational Requirements Composer but extended to meet the requirements management needs of product & systems development organizations developing complex and/or embedded systems. In the first releases of DOORS NG, the web client is identical to Requirements Composer, but DOORS NG also features a rich client which is designed for usability of editing large requirements specifications. The strategy for the two products is that while Requirements Composer will be focused on the needs of business analysis and IT application development teams, DOORS NG will be focused on the needs of systems engineers and product & systems development teams.
I’d like to wrap up this post by thanking all of you who attended the webcast, participated in the polls, asked some great questions and completed the survey feedback. If you missed it, you can catch a replay or download the slides. I’d welcome any additional comments or questions here.
Today we have with us Theresa Kratschmer, writing about the importance of metrics in requirements management. Theresa Kratschmer is a senior software engineer who joined IBM T.J. Watson Research in 1996. There she worked on defect analysis, requirements, and Orthogonal Defect Classification deployment. In 2010, Theresa moved to sales, where she is a technical specialist focusing on the Rational Jazz products. Prior to joining IBM, Theresa developed software for real-time, graphics, and database applications in the medical electronics industry. She can be reached at theresak[at]us.ibm.com
Everybody is talking about metrics today. Numbers have always been important to builders, financial wizards, and politicians. If you are a software developer, though, you may have taken a lot of math courses but never really used that math directly. So, why the sudden interest in software and metrics, or specifically in requirements and metrics?
Metrics are important whenever you need accuracy, such as when building a house or sewing a dress. They are important when you need to work efficiently and can’t afford waste. In today’s business environment, every organization out there, whether a finance company, government agency, or healthcare organization, needs to work more effectively and produce more data, more projects, and more software with far fewer resources. Numbers, or metrics, are the way to do this. Since requirements are the foundation of software, they are one of the best places to apply metrics. By applying measurements to requirements, we get insight into the organization’s requirements activities. We find out how much progress is being made, whether we have gaps in the downstream deliverables related to requirements, and how big our project risk might be when changes occur to those requirements. Applying measurements to requirements also allows us to make continuous improvements.
So if metrics are so valuable, why aren’t more people collecting them? There are primarily three reasons:
- It often takes too long to collect the metrics. By the time you end up gathering all the information you need, it’s out of date already.
- How do you interpret the information that you collect? Some people feel that if you don’t have access to a person with a PhD, you’ll never understand the numbers.
- Even if you collect the data and interpret it correctly, is the organization actually going to use it to make improvements?
The best way to collect metrics, of course, is to do it automatically as the team goes through their daily activities. As they define and refine requirements, process requirement reviews, and perform change management and quality management activities, data can be collected automatically, processed, and made available in real time. This is exactly what an integrated solution like the Jazz platform does. It removes this obstacle and makes the collection, processing, and display of data a natural part of the project team’s daily activity.
Interpretation of Data
Understanding data and the resulting reports does take some skill, but it is not complicated. I always start by articulating exactly what information I want to view or what questions I need answered. Then I look for the data that will help me answer those questions. For example, you might want to know which high-priority requirements have no test cases associated with them. One of the capabilities in the Jazz suite that helps here is that the reports have a section called “What does this report tell me?”, so when you run a report you get some automatic interpretation built in. Of course, I always recommend people start simply. Your skills will grow as you get more comfortable understanding and interpreting the data. Jazz also allows you to set up filters that help you answer specific questions at the click of a button. This makes analysis automatic and very easy.
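The example question above ("which high-priority requirements have no test cases?") becomes a one-line query once requirements carry attributes and trace links. An illustrative Python sketch with invented data, standing in for what a tool's filter does for you:

```python
# Hypothetical requirements with a priority attribute and trace links to tests.
requirements = [
    {"id": "R1", "priority": "high", "tests": ["TC-1"]},
    {"id": "R2", "priority": "high", "tests": []},
    {"id": "R3", "priority": "low",  "tests": []},
]

# The metric: high-priority requirements with no associated test cases.
untested_high = [r["id"] for r in requirements
                 if r["priority"] == "high" and not r["tests"]]
print(untested_high)  # → ['R2']
```

R2 is the gap worth acting on; R3 is also untested but low priority, which is why combining the attribute with the trace link matters rather than counting missing tests alone.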
Improvement Actions for Continuous Improvement
Identifying and implementing actions is one of the most important aspects of ensuring continuous improvement. It’s very important when analyzing data that indicates weaknesses in your process to actually identify improvement actions. Not only do you need to identify how you will improve your project and processes you follow, but you also need to assign owners and a date for implementation to make sure you do actually make improvements.
I hope I’ve convinced you that metrics are important – well worth the time and effort it takes to collect and analyze data. By identifying and implementing improvement actions your organization can make considerable progress in working more efficiently and really achieving the ability to do more with less.
Anthony Kesterton is a Technical Consultant in the Financial Services Sector for IBM Rational in the United Kingdom. He has wide experience in IT development, including many years teaching, mentoring and using various IBM Rational tools, both as an end user and as an IBM employee. He is a regular contributor to the forums on jazz.net. He is co-author of the IBM Redbook “Building SOA Solutions Using the Rational SDP”. As an IBM Community volunteer, he works on programmes dedicated to encouraging children to take an interest in Science, Technology, Engineering and Mathematics (STEM) at school. He also regularly writes in IBM Technical Field Professionals' blog here. Anthony can be reached at akesterton[at]uk.ibm.com
One of the more interesting discussions I have had about requirements was how to indicate that a requirement must be implemented in the final system. The business wanted to make every requirement “mandatory”, which gives no indication to anyone which requirements are really important. Eventually, the business analysts compromised and had levels of “mandatoriness” (I think I might have invented a new term) ranging from “least mandatory” to “most mandatory”. The business was happy to accept this. To me, this showed the importance of gaining agreement on requirement attributes. It should not stop at attributes – attribute values, requirement types, traceability and document templates are also very important for a project. All this information should be part of a Requirements Management Plan.
A Requirements Management Plan has nothing to do with time and resources on the project but everything to do with structure of the requirements. A Requirements Management Plan is a way to capture the kind of requirements that will be used on the project, their attributes and other useful information. A typical Requirements Management Plan should contain the following:
- Types of requirements: For example, business requirements, system requirements, and performance requirements.
- Attributes: Important information associated with each requirement type, for example: priority, source of the requirement, or the stability of the requirement
- Attribute values and the meaning of these values: Each attribute should have a range of possible values, and these must be agreed upon and documented. For example, an attribute for priority may have a value from 1 to 5, where 1 means the highest priority and 5 means the lowest.
- Traceability between requirements: There is usually a hierarchy of requirement types. This hierarchy needs to be captured and explained. For example, a feature of the system may link to a software requirement, with the feature at the top of this hierarchy.
- Document templates: Many organizations have standard ways they present information in document form. They might have a Software Requirements Specification with a specific format and structure.
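One benefit of writing the plan down is that the agreed attribute values can be checked mechanically. A rough Python sketch, with the plan encoding invented purely for illustration:

```python
# Hypothetical Requirements Management Plan encoded as allowed attribute values.
plan = {
    "priority": {"1", "2", "3", "4", "5"},   # 1 = highest, 5 = lowest
    "stability": {"stable", "volatile"},
    "source": None,                           # free text: any value accepted
}

def validate(req_attrs, plan):
    """Return a list of violations of the plan for one requirement's attributes."""
    errors = []
    for attr, value in req_attrs.items():
        if attr not in plan:
            errors.append(f"unknown attribute: {attr}")
        elif plan[attr] is not None and value not in plan[attr]:
            errors.append(f"{attr}={value!r} not in agreed values")
    return errors

print(validate({"priority": "7", "source": "workshop"}, plan))
```

A check like this catches exactly the drift the plan is meant to prevent: someone inventing "priority 7" or a new attribute that was never agreed by the project.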
Having the discussion about this plan and getting agreement on its content can be enlightening. It can uncover what kind of information is important to the project, how the project plans to manage the requirements (via the attributes), and most importantly, what kinds of requirements were overlooked. While the plan is being discussed, be prepared for heated debates about potentially every aspect of the plan.
Spend time creating a Requirements Management Plan for your project as soon as possible. Be prepared for changes to that plan over time as the project works out the really useful attributes or even requirement types. Most importantly, get agreement on the Requirements Management Plan – it really does help a project build requirements that clarify rather than obscure the intention of a project.