On Thursday 14 March I presented an IBM-sponsored Dr. Dobb's webcast on the topic of ‘3 Reasons to Throw Away Your Requirements Documents’. If you didn’t catch the webcast, you can view it on-demand and download the slides. If you did attend, I hope you enjoyed it. In this blog post I want to answer some of the questions that I didn’t get time to cover in the webcast, so scroll down to see if I’ve covered your question here. But first, for those of you who didn’t make it, here’s a quick recap of the points I made during the webcast:
- What do I mean by ‘Throw Away Your Requirements Documents’? I’m not saying do away with your requirements or your requirements management process. What I am saying is: invest in improving your requirements management process and support those process improvements with the right tooling. In the webcast audience poll, 60% said they are using documents and/or spreadsheets for requirements, so I set out to make the case for why they should consider moving to a requirements management tool.
- Reason #1 – Collaboration. Using documents and spreadsheets, you can get into all sorts of problems like working from the wrong version of requirements. An integrated requirements management tool (one that has a collaborative repository that has open interfaces to share requirements data with design, test and development environments) helps you ensure that all engineering / development team members and their stakeholders are working from the right version and view of requirements for their role. I shared information from a case study on MBDA Missile Systems where collaboration was a major challenge.
- Reason #2 – Traceability. Traceability helps you get context and an audit trail for decisions made, perform coverage analysis, detect gold plating and perform impact analysis (see blog posts ‘What is Traceability?’ and ‘The uses and value of traceability’ for more explanation). But using documents and spreadsheets to create and manage traceability gives it a bad name – it becomes a tedious, error-prone overhead activity. With an integrated requirements management tool, traceability links can be easily created and navigated. Traceability becomes part of the process rather than a bulk catch-up exercise, and through open interfaces it becomes easy to extend traceability from requirements to artifacts in other tools such as designs, work items and test cases. And most importantly there are automated views and reports that enable you to make active use of traceability for the purposes I mentioned above. I shared information from a case study on Invensys Rail Dimetronic where traceability is essential.
- Reason #3 – Agility. As many engineering and development organizations look to improve time to market and reduce costs, agile approaches are becoming increasingly popular (in the webcast audience poll, 56% said they were using some sort of hybrid waterfall/iterative approach and 20% were agile). With that comes change to the way requirements have traditionally been defined and, most importantly, to the way that changes to requirements are managed. Change is allowed and encouraged, but you must have reviews and impact analysis to make informed decisions about whether to accept a change and implement it in the current iteration or sprint, plan it into a future iteration, or reject it altogether. Doing that requires more than just a backlog managed in a spreadsheet or an agile planning tool. At IBM Rational we keep a prioritized backlog of user stories and epics as work items in plans managed in Rational Team Concert. User stories and epics are then decomposed and more fully described by requirements and supporting visual and textual artifacts created and managed in Rational DOORS Next Generation.
- And I started and concluded by looking at the most compelling reason for improving your requirements management process with the support of the right tooling: return on investment. I shared information from a case study on Emerging Health, Montefiore IT, where they achieved, among other benefits, a 69% reduction in the cost of quality (test preparation, testing and rework) within 6 months of deploying an improved requirements management process supported by IBM Rational DOORS. There’s also a follow-up video to this case study that was recorded after the client had adopted agile development practices.
Ok, now onto some of the questions that were submitted but that I didn’t have time to answer during the webcast. I’ve divided them into sections based on the type of question so you can easily scan down to the topics you’re interested in:
Alternative approaches to managing requirements
Q. What's the advantage of a requirements management tool if I could link my change management tickets to fine grained requirements (use cases, storyboards) that are maintained on a wiki or other collaboration tool (e.g. Sharepoint or IBM Connections)? It would seem that the work items would give you the needed traceability and metrics?
A. While you might be able to create the level of traceability you require, how would you report on it? Would you have to build bespoke views and reports? And would the use of a wiki for ‘fine grained requirements’ provide you with a view of requirements in context with one another, as opposed to individual wiki pages? Where would you document additional properties of requirements? Could you easily reuse requirements across projects? A requirements management tool like IBM Rational DOORS Next Generation provides built-in traceability views and reports, enables you to structure requirements in context with one another in document-like views, provides user-defined properties for recording additional information and facilitates reuse of requirements.
Requirements Management and Agile Development
Q. We are mostly using the waterfall model, but are looking into whether Agile works for our customers. I am very interested in the role of a Requirements Analyst in the Agile world. In our world, the analyst is a facilitator of requirements gathering and not an SME on the business application we are gathering requirements for.
A. That facilitation role and the analysis skills of the Requirements or Business Analyst are still essential in Agile development. A common mistake when moving to more agile approaches is appointing a ‘Product Owner’, who is a subject matter expert in the business domain you are building an application or product for, and expecting them to write the requirements or ‘user stories’. While they are experts in the business, and you need that expertise on hand for Agile to work, they are not usually skilled in getting to the root goals or needs of the business problems you’re looking to address with the product or application. Without the skills of the Requirements or Business Analyst, it’s all too easy for the user stories to become about automating the way things are done today rather than addressing the real business issues in an optimal way. You can read more about the role of the analyst in Agile development in an interview with Mary Gorman of EBG Consulting.
Q. How does the IBM requirements toolset compare to more "agile" focused toolsets like Atlassian, Rally, etc.?
A. IBM is very committed to supporting agile development, particularly when agile is scaled to large, distributed development teams. At the heart of our agile development capabilities is Rational Team Concert, which supports agile planning, task management and change management. As stated in the webcast, though, we’ve found in our own continuous delivery process, and heard from clients, that a place is needed to capture more details of requirements and their associated properties, in context with one another, together with supporting artifacts like storyboards, workflow scenarios and use cases. The integration of Rational Team Concert with Rational DOORS Next Generation provides those additional capabilities while preserving traceability to user stories and epics managed in the product backlog.
IBM Requirements Management solutions
Q. It seems this discussion is on Rational DOORS. Is Rational RequisitePro still offered? If so, how do these two products compare?
A. IBM continues to support, maintain and respond to enhancement requests for RequisitePro, but our future direction for requirements management for IT application development lies with Rational Requirements Composer, and we provide migration support and a trade-up program. Please contact your IBM representative for more details. If you are using RequisitePro for requirements management for complex products or embedded systems development, then you might also want to look at whether Rational DOORS or Rational DOORS Next Generation is the right move forward for your organization. But don’t worry, we’re not forcing you to migrate today.
Q. You've mentioned DOORS Next Generation in your presentation, what is that? Does it replace DOORS?
A. IBM Rational DOORS Next Generation is a requirements management application on a collaborative lifecycle management platform for systems and software engineering that provides requirements collaboration, planning, reuse and lifecycle traceability. DOORS Next Generation (DOORS NG) was introduced in 2012 to take advantage of the common, collaborative ‘Jazz’ platform shared by Rational Team Concert, Rational Quality Manager, Rational Design Manager and Rational Requirements Composer, and to extend the requirements management capabilities in Requirements Composer to meet the needs of product and systems development organizations building complex and/or embedded systems. DOORS NG will enable IBM to introduce new capabilities faster than would have been possible with the existing DOORS product. However, DOORS NG does not replace DOORS today. We have a very large install base of DOORS users working on programs that can last tens of years, and we will continue to support, maintain and enhance the DOORS 9.x series to meet the needs of those users. We encourage existing DOORS users to take a look at DOORS NG, to try it out on pilot projects and to use the interoperability capabilities to exchange and/or link data with DOORS 9.x. Existing DOORS customers with active support & subscription are entitled to use DOORS NG without an additional purchase – you can use your existing DOORS license entitlements with either DOORS or DOORS NG or a combination of the two. But we are not telling existing DOORS users to migrate live projects to DOORS NG today. DOORS NG in its early releases is attractive to organizations who don’t currently use a requirements management tool and are looking for a web-based solution that also offers common-platform integration with change management, agile planning, test management and design management capabilities. You can download a trial and follow development plans for DOORS NG on Jazz.net.
Q. How does Rational DOORS Next Generation compare to Rational Requirements Composer?
A. DOORS Next Generation is based on Rational Requirements Composer but extended to meet the requirements management needs of product and systems development organizations building complex and/or embedded systems. In the first releases of DOORS NG, the web client is identical to Requirements Composer, but DOORS NG also features a rich client designed for efficient editing of large requirements specifications. The strategy for the two products is that while Requirements Composer will focus on the needs of business analysis and IT application development teams, DOORS NG will focus on the needs of systems engineers and product & systems development teams.
I’d like to wrap up this post by thanking all of you who attended the webcast, participated in the polls, asked some great questions and completed the survey feedback. If you missed it, you can catch a replay or download the slides. I’d welcome any additional comments or questions here.
Today we have with us Theresa Kratschmer, writing about the importance of metrics in requirements management. Theresa Kratschmer is a senior software engineer who joined IBM T.J. Watson Research in 1996. There she worked on defect analysis, requirements, and Orthogonal Defect Classification deployment. In 2010, Theresa moved to sales, where she is a technical specialist focusing on the Rational Jazz products. Prior to joining IBM, Theresa developed software for real-time, graphics, and database applications in the medical electronics industry. She can be reached at theresak[at]us.ibm.com
Everybody is talking about metrics today. Numbers have always been important to builders, financial wizards, and politicians. If you are a software developer, though, you may have taken a lot of math courses but never really used that math directly. So why the sudden interest in software and metrics, or specifically in requirements and metrics?
Metrics are important whenever you need accuracy, such as when building a house or sewing a dress. They are important when you need to work efficiently and can’t afford waste. In today’s business environment, every organization out there, whether a finance company, government agency, or healthcare organization, needs to work more effectively and produce more data, more projects, more software with far fewer resources. Numbers, or metrics, are the way to do this. Since requirements are the foundation of software, they are one of the best places to apply metrics. By applying measurements to requirements, we get insight into the organization’s requirements activities. We find out how much progress is being made, whether we have gaps in the downstream deliverables related to requirements, and how big our project risk might be when those requirements change. Applying measurements to requirements also allows us to make continuous improvements.
So if metrics are so valuable, why aren’t more people collecting them? There are primarily three reasons:
- It often takes too long to collect the metrics. By the time you end up gathering all the information you need, it’s out of date already.
- How do you interpret the information that you collect? Some people feel that if you don’t have access to a person with a PhD, you’ll never understand the numbers.
- Even if you collect the data and you correctly interpret it, is the organization actually going to do anything with that data to make improvements?
The best way to collect metrics, of course, is to do it automatically as the team goes through their daily activities. As they define and refine requirements, process requirement reviews, and perform change management and quality management activities, data can be collected automatically, processed, and made available in real time. This is exactly what an integrated solution like the Jazz platform does. It removes this obstacle and makes the collection, processing and display of data a natural part of the project team’s daily activity.
Interpretation of Data
Understanding data and the resulting reports does take some skill, but it is not complicated. I always start by articulating exactly what information I want to view or what questions I need answered. Then I look for the data that will help me answer those questions. For example, you might want to know which high priority requirements have no test cases associated with them. One of the capabilities of the Jazz suite that helps here is that the reports have a section called “What does this report tell me?”. When you run a report, you actually have some automatic interpretation built in. Of course, I always recommend people start simply. Your skills will grow as you get more comfortable understanding and interpreting the data. Jazz also allows you to set up filters which help you answer specific questions at the click of a button. This makes analysis automatic and very, very easy.
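To make the “high priority requirements with no test cases” question concrete, here is a minimal sketch in Python. It is purely illustrative – the data model and names are hypothetical, not the Jazz reporting API:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    id: str
    title: str
    priority: int                         # 1 = highest, 5 = lowest
    test_case_ids: list = field(default_factory=list)

def untested_high_priority(requirements, priority_threshold=2):
    """Return requirements at or above the priority threshold with no test coverage."""
    return [r for r in requirements
            if r.priority <= priority_threshold and not r.test_case_ids]

reqs = [
    Requirement("REQ-1", "0-60mph in under 8 seconds", 1, ["TC-12"]),
    Requirement("REQ-2", "Engine power of at least 200bhp", 1),
    Requirement("REQ-3", "Cup holder fits large cups", 4),
]
for r in untested_high_priority(reqs):
    print(f"{r.id}: '{r.title}' has no test cases")   # flags REQ-2
```

An integrated repository effectively runs this kind of query for you continuously, which is what makes the resulting reports trustworthy and current.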
Improvement Actions for Continuous Improvement
Identifying and implementing actions is one of the most important aspects of ensuring continuous improvement. When your analysis of the data indicates weaknesses in your process, it’s very important to actually identify improvement actions. Not only do you need to identify how you will improve your project and the processes you follow, but you also need to assign owners and a date for implementation to make sure you actually make those improvements.
I hope I’ve convinced you that metrics are important – well worth the time and effort it takes to collect and analyze data. By identifying and implementing improvement actions your organization can make considerable progress in working more efficiently and really achieving the ability to do more with less.
Anthony Kesterton is a Technical Consultant in the Financial Services Sector for IBM Rational in the United Kingdom. He has wide experience in IT development, including many years teaching, mentoring and using various IBM Rational tools, both as an end-user and as an IBM employee. He is a regular contributor to the forums on jazz.net. He is co-author of the IBM Redbook “Building SOA Solutions Using the Rational SDP”. As an IBM Community volunteer, he works on programmes dedicated to encouraging children to take an interest in Science, Technology, Engineering and Mathematics (STEM) at school. He also regularly writes in the IBM Technical Field Professionals' blog here. Anthony can be reached at akesterton[at]uk.ibm.com
One of the more interesting discussions I have had about requirements was how to indicate that a requirement must be implemented in the final system. The business wanted to make every requirement “mandatory”, which gives no indication to anyone which requirements are really important. Eventually, the business analysts compromised and had levels of “mandatoriness” (I think I might have invented a new term) ranging from “least mandatory” to “most mandatory”. The business was happy to accept this. To me, this showed the importance of gaining agreement on the requirement attributes. It should not stop at attributes – attribute values, requirement types, traceability and document templates are also very important for a project. All this information should be part of a Requirements Management Plan.
A Requirements Management Plan has nothing to do with time and resources on the project but everything to do with the structure of the requirements. A Requirements Management Plan is a way to capture the kinds of requirements that will be used on the project, their attributes and other useful information. A typical Requirements Management Plan should contain the following (a small sketch of the plan expressed as data follows the list):
- Types of requirements: For example, business requirements, system requirements, and performance requirements.
- Attributes: Important information associated with each requirement type, for example: priority, source of the requirement, or the stability of the requirement.
- Attribute values and the meaning of these values: Each attribute should have a range of possible values, and these must be agreed upon and documented. For example, an attribute for priority may have a value from 1 to 5, where 1 means the highest priority and 5 means the lowest.
- Traceability between requirements: There is usually a hierarchy of requirement types. This hierarchy needs to be captured and explained. For example, a feature of the system may link to a software requirement, with the feature at the top of this hierarchy.
- Document templates: Many organizations have standard ways they present information in document form. They might have a Software Requirements Specification with a specific format and structure.
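As a hypothetical illustration (the types, attributes and values below are examples, not taken from any tool), the agreed parts of such a plan could even be captured as data so that a script or tool can enforce them:

```python
RM_PLAN = {
    "requirement_types": ["business", "system", "performance"],
    "attributes": {
        "priority":  {"values": [1, 2, 3, 4, 5],
                      "meaning": "1 = highest priority, 5 = lowest"},
        "source":    {"values": None,   # free text
                      "meaning": "origin of the requirement"},
        "stability": {"values": ["stable", "volatile"],
                      "meaning": "how likely the requirement is to change"},
    },
    # Agreed traceability hierarchy: business requirements sit at the top.
    "trace_rules": [("business", "system"), ("system", "performance")],
}

def validate_attribute(name, value):
    """Reject attribute values that fall outside the agreed range."""
    spec = RM_PLAN["attributes"].get(name)
    if spec is None:
        raise ValueError(f"attribute '{name}' is not defined in the plan")
    if spec["values"] is not None and value not in spec["values"]:
        raise ValueError(f"{value!r} is not an agreed value for '{name}'")

validate_attribute("priority", 3)        # fine
try:
    validate_attribute("priority", 7)    # outside the agreed 1-5 range
except ValueError as err:
    print(err)
```

The point is not the code itself but that the plan becomes something checkable, so disagreements about attribute values surface early rather than in a review six months in.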
Having the discussion about this plan, and getting agreement on its content, can be enlightening. It can uncover what kind of information is important to the project, how the project plans to manage the requirements (via the attributes), and most importantly, what kinds of requirements were overlooked. While the plan is being discussed, be prepared for heated debates about potentially every aspect of the plan.
Spend time creating a Requirements Management Plan for your project as soon as possible. Be prepared for changes to that plan over time as the project works out the really useful attributes or even requirement types. Most importantly, get agreement on the Requirements Management Plan – it really does help a project build requirements that clarify rather than obscure the intention of a project.
Note: This is the sixth post in our series of Managing Your Requirements 101. Read the first five posts here:
- What is requirements management and why is it important?
- How to write good requirements and types of requirements
- Why baseline your requirements?
- What is traceability?
- The uses and value of traceability
Requirements elicitation is the process of discovering and clarifying the needs, capabilities, conditions, and constraints that a project must satisfy to deliver a solution or product that meets the client or market needs. Requirements elicitation is by far the most significant activity undertaken by a business analyst or requirements engineer. In a traditional consulting or development engagement, elicitation begins early in the cycle, while determining the scope and objectives of the project, and generally extends into the analysis and design phases. We will delve into how this differs in agile in a later post. In either case, elicitation is an iterative and ongoing process of clarifying and refining requirements and identifying constraints and new changes. How much elicitation is needed depends on where you are in the engagement lifecycle: it happens at various levels, early on to draw out the initial requirements and later on to refine them into specifics.
So what are the sources of requirements? The main sources are the stakeholders themselves. End users and domain SMEs are also prominent sources for clarifying requirements. Other sources include regulatory requirements, existing documents, past experience and business cases.
Essentially, requirements elicitation starts with a method adoption workshop. A Method Adoption Workshop (MAW) helps in determining the key requirements activities, templates and outputs. The workshop essentially acts as a discussion venue to determine the optimal activities for the project. Various methods can be used for these workshops. Most of them prescribe how to collect and specify requirements and, in some cases, how to manage them as well. Some of the most used methods are the SE&A Method and the Custom Development Method, or proprietary ones like SAP's. These workshops also help in putting effective project change management processes in place; effective scope management is the combination of requirements management and change management. The MAW is used to tailor and adopt the project management and technical methods for the product and determine the optimal activities needed. Post MAW, stakeholders are identified and various elicitation techniques are used to determine the requirements. Techniques can vary according to the group of stakeholders. A consolidated business requirements document is created for client review.
Understanding the project environment is key to determining the requirements, and it can act as the starting point of elicitation. The INCOSE Systems Engineering Handbook provides a classification of the various sources of requirements: the external environment (regulations, laws, culture, and competition), the enterprise environment (internal policies, technology), the project environment (budget, tools, project management) and support functions in the organization. Before looking into the various elicitation techniques, some of the factors to consider are the triple constraint (cost, schedule and scope); stakeholder influence; stakeholder access and willingness to participate; contractual deliverables; and organizational experience in similar projects.
Broadly, we can categorize elicitation techniques into structured, analytical and interactive techniques. I believe the genesis of structured techniques is in marketing; they generally include interviews, focus group discussions, surveys and workshops. Analytical techniques include interface analysis, domain experience, and market research. Interactive techniques include brainstorming sessions, prototyping, observation and reverse engineering. Many textbooks cover these techniques and when to use them; Unearthing Business Requirements: Elicitation Tools and Techniques by Kathleen B. Hass and Rosemary Hossenlopp treats this topic in particular depth. We will look into some of the techniques in detail in a later post and also touch on when to use which technique.
Some useful resources
Form feeds function: The role of storyboards in requirements elicitation
Requirements Elicitation Introduction, Nancy R. Mead, Software Engineering Institute
Note: This is the fifth post in our series of Managing Your Requirements 101. Read the first four posts here:
Part 4 of this series ‘What is Traceability’ looked at the definition of requirements traceability and different types of traceability relationships. In this part, let’s look at what traceability can be used for and where it delivers value to application, system or product development.
Why is traceability necessary or important? Isn’t traceability just an overhead, an onerous documentation task that’s only done in industries where it’s mandated? In my view, it’s true that requirements traceability practices originated in industries like aerospace & defense, where one use of traceability is to show that contractual requirements have been addressed, but it has so much more value to bring when used effectively and you have the right tools to maintain it and report on it. The following lists some of the ways that traceability delivers value:
Context: If you can trace back from a design or test to a user requirement, you then have the reason for the existence of that design or test, and through the information in the user requirement (and any related requirements you can trace to) you have more supporting context to help create the optimal design to meet the requirement or the most effective test to verify that the requirement is met.
Audit trail & compliance: Taking the example of a new person joining a project, traceability can help them navigate the project and see why particular requirements, designs, tests, etc. exist. This is also of utmost value and importance when you need to demonstrate compliance with a regulation or standard to an auditor – the traceability trail can help you quickly show you are addressing the regulation or standard.
Coverage: How do you know whether you’ve covered all the user requirements in the derived systems requirements, designs, tests, etc.? Traceability can help you here – you can see gaps indicated by where a higher level artifact doesn’t have any relationships to lower level artifacts. Of course even if a relationship exists you still need to follow it and examine the lower level artifact to ensure it does what the traceability relationship says it should, but at least you had a signpost to direct you to the right place to look.
Gold plating: As well as highlighting coverage gaps, traceability can help you investigate possible ‘gold plating’ or over-engineering. If you have a lower level element without any relationships to a higher level then you can ask the question – why does this exist if it’s not apparently satisfying a requirement? There might be a perfectly valid reason but this gives the opportunity to identify and eliminate anything that could bring unnecessary additional time and cost to the project.
Impact analysis: I think this is the most valuable reason for traceability. If you have traceability relationships in place, you can follow those relationships when, say, a user requirement changes, to identify all the related system/sub-system/component requirements, design elements, tests, work items, etc. that are potentially impacted by the change (see the sketch below). This enables you to fully scope out the impact of the change before it’s made (if you decide to proceed), giving you far more control over the cost and time impact of change requests. This can of course also work in reverse – if a design change is necessary, say because the original design proves infeasible, you can more easily see what impact that has, if any, on your ability to still meet the requirements. Both of these scenarios have great project management benefits – you can have informed discussions in the development team and with your customer/stakeholders about whether to make the change.
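Here is a minimal sketch of the idea, assuming traceability links are stored as simple (source, target) pairs; real tools store far richer link data, but impact analysis reduces to a downstream traversal of this graph, and coverage gaps fall out of the same structure:

```python
from collections import defaultdict

links = [  # (higher-level artifact, derived artifact)
    ("UR-1", "SYS-1"), ("UR-1", "SYS-2"),
    ("SYS-1", "DES-1"), ("SYS-1", "TC-1"), ("SYS-2", "TC-2"),
]
downstream = defaultdict(list)
for src, dst in links:
    downstream[src].append(dst)

def impacted_by(artifact_id):
    """Everything that may need rework if `artifact_id` changes."""
    impacted, stack = set(), [artifact_id]
    while stack:
        for child in downstream[stack.pop()]:
            if child not in impacted:
                impacted.add(child)
                stack.append(child)
    return impacted

print(impacted_by("UR-1"))   # {'SYS-1', 'SYS-2', 'DES-1', 'TC-1', 'TC-2'}

# The same graph also exposes coverage gaps (and, in reverse, gold plating):
requirements = ["UR-1", "UR-2"]
print([r for r in requirements if not downstream[r]])   # ['UR-2'] has no coverage
```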
What’s that I hear again – what about agile? Aren’t traceability and the benefits it’s proposed to have only necessary/of value in waterfall development? Well, take another look at that list of benefit scenarios – don’t you always want to be able to do these things, regardless of development methodology? If you’re in a fast changing, evolving project don’t you need to be more informed, have the right information at your fingertips, in order that you can respond quickly but effectively? I argue that if you have the right tools in place to create and utilize traceability, that it is even more essential if you’re adopting agile practices.
Speaking of the right tools, tools can automate the various types of traceability reporting and analysis I described above. For impact analysis, tools can quickly display all of the related artifacts connected to a particular requirement. More than that, they can detect changes in a requirement that has links and automatically mark those links as suspect. This is very effective where you have different people working on different parts of the application/system and you need to be aware of changes in other areas that might impact your work. A suspect link indicates that a change has been made to the artifact at either end of the link and that you should check that the link still holds true – i.e. if the user requirement has changed, does the linked system requirement still satisfy it? This proactive impact notification mechanism helps to avoid inconsistencies across your specifications.
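As a rough sketch of how suspect detection can work (an illustration of the concept, not how any particular tool implements it), a link becomes suspect when either endpoint changes after the link was last reviewed:

```python
from dataclasses import dataclass

@dataclass
class Artifact:
    id: str
    last_modified: int        # e.g. a revision number

@dataclass
class Link:
    source: Artifact
    target: Artifact
    last_reviewed: int        # revision at which the link was last confirmed

    @property
    def suspect(self):
        # Suspect if either endpoint changed after the last review of the link.
        return max(self.source.last_modified,
                   self.target.last_modified) > self.last_reviewed

user_req = Artifact("UR-1", last_modified=3)
sys_req = Artifact("SYS-1", last_modified=5)
link = Link(user_req, sys_req, last_reviewed=4)
print(link.suspect)           # True: SYS-1 changed after the link was reviewed
```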
So you have my views on the uses and value of traceability, but what do you think? Please use the comment function to leave feedback and additional ideas.
I’ll leave you with a couple of links that have additional discussion of the value of traceability and in particular how much traceability is enough:
This is the fifth part of our six-part blog post series on the basics of requirements management. Read the remaining parts here -
1. What is requirements management and why is it important?
2. How to write good requirements and types of requirements
3. Why baseline your requirements?
4. What is Traceability?
5. The uses and value of traceability
6. Revisiting Requirements Elicitation
CLM 4.0.1 has enhanced its enterprise development support with server rename and clustering capabilities. CLM 4.0.1 is now supported on Mac OS X, and Safari and Chrome are now supported browsers. Continuing to improve flexibility in licensing, we introduced two new varieties – CLM Contributor and Practitioner licenses. In 4.0.1, we have also extended the workgroup licenses to RRC & RQM. For a detailed review, contact us or your respective account rep.
The biggest addition to RRC 4.0.1 is the inclusion of modules. So what are modules?
Modules help you organize requirements into ordered lists and use hierarchy to group them. This is in addition to the collections we have today. Modules help Business Analysts increase control over requirements by group and give teams additional meaning, awareness and understanding. Reporting of requirements can also be improved significantly by generating documents with the templates provided. Here is a detailed video explaining modules in Requirements Composer.
The ability to compare collections is also now included in RRC 4.0.1. This helps business analysts discover differences in project information more quickly, learn what has changed and communicate it across the team.
Locking control in a multi-user environment is improved with an automatic edit lock option. The main uses of this feature are to lock a requirement artifact while you are making changes, or to apply permanent locks for security/regulatory purposes. Here is a video showcasing the feature in detail.
We introduced significant changes based on ReqIF in RRC 4.0. In this release, further improvements have been made to the ability to import requirements into different projects. This enables, for example, data transfer between DOORS & RRC, or taking data offline to work on. We intend to improve the options available for offline data usage in future releases.
Another significant improvement in RRC 4.0.1 is the RRC–HP Quality Center integration. You can now preview QC tests directly in RRC. This helps you use the requirements definition and management capability of RRC to manage project requirements that are being tested using HP QC. The following screenshot shows a customer using Requirements Composer to view various requirements collections. The user rests his pointer (“hovers”) over one of those collections, and a pop-up window appears showing a preview of what’s contained in a test plan that’s being managed by HP Quality Center.
RRC 4.0.1 enables bi-directional connectivity from requirements to models/elements using Rational Software Architect (RSA), Rhapsody, and Design Manager. This seamless integration helps you use requirements and models together to design and document the project needs.
To read more about enhancements in RRC 4.0.1, visit jazz.net.
To know more about Rational Requirements Composer, visit here.
Note: This is the fourth post in our series of Managing Your Requirements 101. Read the first three posts here:
What is traceability? Or more specifically, what is requirements traceability? Well, rather than repeat what is already a good collection of definitions, I’ll refer you to http://en.wikipedia.org/wiki/Requirements_traceability. From there I’d summarize three elements to requirements traceability:
- Following the life of a requirement – from idea to implementation.
- How requirements impact each other, and how requirements impact other development lifecycle artifacts (such as designs, tests, tasks, source code, hardware specs, etc.) and vice versa.
- The decomposition of requirements – from high level user/customer/market needs to system, sub-system, software or hardware component requirements; and transformation into design specifications and the implementation realization of the requirement.
Traceability in this context is about relationships between requirements at the same or different levels of detail, and between requirements and other lifecycle artifacts as listed above. It also extends to relationships beyond those directly involving requirements – e.g. the relationship of a defect report to a test case – and this broader scope is referred to as ‘lifecycle traceability’. Traceability relationships can be of multiple types, for example:
- Satisfaction: a system requirement (or more likely a number of system requirements) ‘satisfies’ a user requirement, e.g. system requirement ‘The engine shall have at least 200bhp’ satisfies user requirement ‘The car shall be capable of accelerating from 0-60mph in under 8 seconds’.
- Verification: a test case ‘verifies’ a requirement, e.g. test case ‘0-60mph acceleration test’ (consisting of a number of test steps) verifies user requirement ‘The car shall be capable of accelerating from 0-60mph in under 8 seconds’.
- Dependency (often used where interfaces are concerned): a requirement ‘depends’ on another requirement, e.g. requirement ‘the power socket shall take 3 pins’ depends on requirement ‘the plug shall have 3 pins’.
Basic traceability establishes a relationship or link between one or more elements. Typed traceability adds the relationship type with its associated semantics (examples above). Rich traceability (ref: Requirements Engineering, Hull, Jackson & Dick, Springer, 2004) adds additional information to the traceability relationship, such as the rationale explaining why a group of system requirements satisfies a particular user requirement; or, since you often can’t be 100% certain about specification or design decisions, any assumptions you made in deriving a set of system requirements from a user requirement. The rich traceability approach is particularly valuable in heavily regulated industries and safety-critical systems, where audit trails of decisions made are vitally important to provide assurance and reduce risks.
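A simple data model makes the distinction between basic, typed and rich traceability concrete. This sketch is illustrative; the field names are not taken from any tool’s schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TraceLink:
    source_id: str                      # basic traceability: just two endpoints
    target_id: str
    link_type: Optional[str] = None     # typed: 'satisfies', 'verifies', 'depends on'
    rationale: Optional[str] = None     # rich: why the link holds
    assumptions: List[str] = field(default_factory=list)  # rich: stated assumptions

link = TraceLink(
    source_id="SYS-1",
    target_id="UR-1",
    link_type="satisfies",
    rationale="200bhp gives 0-60mph in under 8 seconds at the target kerb weight",
    assumptions=["kerb weight stays under 1400kg"],
)
print(link)
```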
Once traceability has been established there are multiple ways in which it can be viewed and reported on. Perhaps the oldest and most commonly recognized method is the traceability matrix where you can see the intersection between two sets of requirements and a check or cross shows where a link exists. This method doesn’t scale particularly well since the matrix could become very large. It’s also sometimes used for creating the links, but it’s not ideal for that either since you can typically only see a small amount of information on the requirements.
Another way to see traceability is to pick a starting point, e.g. the user requirements and display the related systems requirements alongside the user requirement they are linked to, in a traceability column. You can typically choose how much detail of the linked requirement is displayed, and you can even make it recursive, going down as many levels of requirements as you need/is practical to manage in a single view.
Graphical displays are great for getting a bigger-picture view of traceability rather than immediately focusing in on the details of a particular relationship. You can explore the traceability tree, zooming in/out or collapsing/expanding parts of the tree, or changing the focus (starting point) of the tree.
But what about in agile development, I hear you cry? Well that could be another topic in its own right – watch this space - but relationships still exist between typical artifacts created in agile approaches (such as between product features and user stories), and I argue that as long as traceability is created ‘as you go’ and automated by tools as much as is practical, that it’s even more essential to stay informed when changes are happening rapidly and ensure you are looking at the correct versions of related artifacts.
In a follow-on post in this Requirements 101 series, I’ll take a look at what traceability can be used for – highlighting where its application can bring significant value to your projects. But for now I’ll leave you with a few resources below that I’d recommend you take a look at, and ask you to let me know if you think this post was useful (or not!) and provide any feedback or additional information using the comment function.
Note: This is the third post in our series of Managing Your Requirements 101. Read the first two posts here -
Projects usually start with unclear requirements and expectations. The lack of baselined requirements can result in chaos, with lots of requirements changes leading to requirements and scope creep. Baselines can also help in acceptance testing and prototyping efforts, and they are especially valuable in fixed-price contracts.
A baseline is all about getting to a common base agreement between stakeholders. It essentially involves setting the right expectations, including responsibilities, risks, assumptions, deliverables and approaches. Once an agreement is reached, it can be put in source control to manage the baseline going forward.
Why bother baselining requirements? As mentioned in earlier posts, requirements are the foundation stones of a project, and unless we know what we are creating, how do we know what changes to make in due course? Starting a project without a proper analysis of requirements is a recipe for disaster – it’s like building a house without a blueprint. When it comes to software projects, the lack of baselines can incentivize clients to make endless changes while the project is in progress, resulting in requirements and scope creep. Requirements must be initially baselined and put under change control in the Statement of Work (SOW) so that the project can be planned, estimated and executed. In a requirements management tool like Rational DOORS or Rational Requirements Composer, a requirements project baseline captures the entire project at a specific moment in time, including folder structures and artifacts. Baselines also play a significant role in enabling traceability: they provide the foundation linkages for establishing the traceability matrix later in the project.
What should be included in a baseline? Though the contents of a baseline can vary, it essentially comprises the functional and non-functional requirements taken into consideration for a release or an iteration. It may also contain other aspects like sub-system and hardware dependencies. It is also important to note that requirements baselines evolve over time: the Business Analyst or Project Manager concerned makes the call on creating new baselines as requirements change or new requirements pop up. As mentioned above, a requirements baseline essentially captures the entire state of a project at a given point in time, typically including the vision/scope document, glossary of terms and use cases (stories). Setting these boundaries is the starting point for avoiding requirements creep.
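Conceptually, a baseline is an immutable, timestamped snapshot that later edits cannot alter. Here is a minimal sketch, assuming requirements are held as simple dictionaries (real tools also baseline folder structures and supporting artifacts):

```python
import copy
import datetime

class Project:
    def __init__(self):
        self.requirements = {}    # id -> dict of requirement attributes
        self.baselines = []

    def create_baseline(self, name):
        # A deep copy: later edits to live requirements can't alter the baseline.
        self.baselines.append({
            "name": name,
            "created": datetime.datetime.now(datetime.timezone.utc),
            "requirements": copy.deepcopy(self.requirements),
        })

project = Project()
project.requirements["REQ-1"] = {"text": "0-60mph in under 8s", "priority": 1}
project.create_baseline("Iteration 1 scope agreed")
project.requirements["REQ-1"]["priority"] = 2   # a later change
print(project.baselines[0]["requirements"]["REQ-1"]["priority"])   # still 1
```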
When is the ideal time for baselining? Baselines drive formal change control. A project manager is always trying to address the triple constraint – scope, time and cost (as discussed by Kathy Schwalbe). Baselines help in managing the scope constraint, freeing focus for the other aspects. Baselines also pave the way for setting schedules. Karl E. Wiegers, in his book More About Software Requirements, provides an exhaustive list of factors to consider before defining a requirements baseline.
What do you think about baselines?
This is the third part of our six-part blog post series on the basics of requirements management. Read the remaining parts here -
1. What is requirements management and why is it important?
2. How to write good requirements and types of requirements
3. Why baseline your requirements?
4. What is Traceability?
5. The uses and value of traceability
6. Revisiting Requirements Elicitation
This is the second post in our series – Managing Your Requirements 101 – A Refresher. Read the first part here
Requirements are the starting point for everything – project scoping, cost estimation, scheduling, coding, testing, and release management. If the requirements captured don’t serve the purpose they are supposed to, there is hardly any benefit in spending time and money on them. I remember, at Innovate 2012 in Orlando, Arnold Flores from Raytheon speaking about five common traps that result in ineffective and non-verifiable requirements: not understanding requirements, not having continuous dialogue with stakeholders, not getting consensus, not involving other disciplines early, and not limiting scope and requirements creep. Thus good quality requirements should:
- Establish a common understanding between the project sponsor, customers, developers and other stakeholders
- Improve customer confidence in the products to be delivered
- Provide a roadmap to development
So how do you get the right requirements and make sure they are quality ones? Some of the official channels through which a Business Analyst gets requirements from a client are interaction with end users or sponsors, business cases, requests for proposals, regulations and so on. In a seminal article titled Writing good requirements is a lot like writing good code, Jim Heumann, a Requirements Management evangelist, talks in detail about how to write good quality requirements. A lot of articles are available in the public domain that talk about how to write good requirements. Essentially, the core principles revolve around a requirement being Simple, Verifiable, Necessary, Achievable, and Traceable. We will discuss some of the techniques used to capture requirements later. Provided below are some resources that will help you write good requirements.
A Business Analyst’s soft skills are equally important to succeed in this endeavor. I am sure you have seen one of those pictures showing to what extent understanding a customer’s requirements can go wrong. Some of the common mistakes a business analyst makes during requirements gathering and analysis are incorrect assumptions, not using the correct level of abstraction, contradictions and inconsistency between requirements, and finally over-specifying requirements down to a spec level. Even using simple language, avoiding generic phrases and using correct grammar comes in handy while writing good requirements.
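Some of these language checks can even be automated. Here is a toy ‘requirements lint’ in that spirit; the phrase list is illustrative, not taken from any standard:

```python
import re

# Illustrative list of phrases that make a requirement hard to verify.
VAGUE_PHRASES = ["user-friendly", "fast", "flexible", "as appropriate",
                 "easy to use", "etc"]

def vague_phrases(requirement_text):
    """Return the vague phrases found in a requirement's text."""
    return [p for p in VAGUE_PHRASES
            if re.search(r"\b" + re.escape(p) + r"\b",
                         requirement_text, re.IGNORECASE)]

print(vague_phrases("The UI shall be user-friendly and fast."))
# ['user-friendly', 'fast'] -> not verifiable as written
```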
In our earlier post, we defined a requirement as a condition or capability needed by a user to solve a problem or achieve an objective. Often the client provides a high level requirement in the form of a need. These needs, expectations and concepts must be identified, analyzed and elaborated into a set of business requirements. Key requirements in this set should be traced back to the business case relating to the need and the client's vision.
At a broad level, requirements can be divided into functional and non-functional requirements. Functional requirements provide the high level description of how a system or product should function from the end user's perspective. They provide the essential details of the system for both business and technical stakeholders, and expectations are expressed and managed through them. Some of the key aspects functional requirements try to address are for whom the product is built, how it is expected to be used, what the interactions are, and any guidelines to be followed. Non-functional requirements mainly represent qualities (expectations and characteristics of the system) and constraints (for example, governmental regulations).
When it comes to requirements levels, we can start with the Requirements Pyramid (from Requirements Management Using IBM Rational RequisitePro by Peter Zielczynski). It essentially starts from the stakeholder needs at the top and goes down to the test cases that verify the implemented requirements at the bottom.
A requirements plan captures all this information. For a template, click here.
This is the second part of our six-part blog post series on the basics of requirements management. Read the remaining parts here -
1. What is requirements management and why is it important?
2. How to write good requirements and types of requirements
3. Why baseline your requirements?
4. What is Traceability?
5. The uses and value of traceability
6. Revisiting Requirements Elicitation
The world of requirements management has developed significantly in the last decade or so and has increasingly become one of the cornerstones of successful software and systems engineering projects. We have been discussing various aspects of the domain from a best practices perspective, and how tools can help you manage your requirements efficiently and effectively.
Starting today, we will discuss various aspects of the requirements management discipline at a bird's-eye level. These posts are meant to be introductory in nature and also to serve as refreshers for those who are already in the field. The domain and its best practices have developed to such a level of sophistication that it is difficult to cover everything in a set of blog posts. However, we intend these posts to be a quick reference and a starting point for you to think seriously about the domain.
Have you heard about Gaudí's unfinished cathedral or the Ariane 5 explosion? The former is a hundred-year project still in progress, which couldn't be finished because of unclear and changing requirements, and the latter resulted in an over $7 billion loss when the rocket exploded on its first voyage due to a software error, specifically a floating-point error. The importance of requirements management can be established from three perspectives – project overshoot, and thus missing the market opportunity, due to unclear and changing requirements; project failures due to unmet or misunderstood requirements; and finally the cost burden of errors and missed requirements found late in the development cycle.
In a classic IEEE Spectrum article, Robert N. Charette writes about Why Software Fails. Among the top reasons for the failure of software projects are poor definition of requirements, poor management of risk, communication failure among stakeholders and the increasing complexity of projects. In IBM GBS, ineffective requirements management is one of the top five reasons for troubled projects. Many research firms (Standish Group's CHAOS report, Gartner, CMU-SEI) and academics (A. Davis, Robert B. Grady, Steve Easterbrook) have studied and quantified the failure rates of software projects (for example, in the IEEE article above, Charette notes that 40-50% of software development time is spent on rework, and the cost of fixing a bug in the field can be as high as 100 times the cost of fixing it at the development stage). In all of them, the primary reasons for failures or overshoots are ineffective management of requirements.
So what exactly is requirements management?
Before moving to requirements management, let's understand what a requirement is. A requirement can be anything from an abstract need to a well drilled-down implementation detail of a system; essentially it can be considered the detailed view of a need under consideration. The IEEE Standard Glossary of Software Engineering Terminology defines a requirement as a condition or capability needed by a user to solve a problem or achieve an objective; or a condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed documents; or a documented representation of a condition or capability as in the former two. Thus what a requirement represents depends on whom we are talking to – it could be a need for a client, a business requirement for customers, a system requirement for vendors, or a specification for a developer and tester. We will come to the different types of requirements later. Requirements management can be considered the management of requirements from the point when a customer provides the needs or a product development process is started. It includes managing the definition, elaboration and change of requirements throughout the development cycle. Peter Zielczynski, a requirements management expert, defines the following major steps in requirements management (Requirements Management Using IBM® Rational® RequisitePro®, Peter Zielczynski):
- Establishing a requirements management plan
- Developing the Vision document
- Creating use cases
- Creating test cases from use cases
- Creating test cases from the supplementary specification
Zave (Classification of Research Efforts in Requirements Engineering, ACM Computing Surveys, 1997) defines Requirements Engineering as “the branch of software engineering concerned with the real-world goals for, functions of, and constraints on software systems. It is also concerned with the relationship of these factors to precise specifications of software behavior, and to their evolution over time and across software families.” While in practical terms this could be considered the same as requirements management, we can say that requirements engineering addresses the various aspects of requirements development, while requirements management is the set of processes in systems and software engineering that interfaces with requirements engineering. We will delve into more detail in another post when we consider the V&V (Verification & Validation) model.
This is the first part of our six-part blog post series on the basics of requirements management. Read the remaining parts here -
1. What is requirements management and why is it important?
2. How to write good requirements and types of requirements
3. Why baseline your requirements?
4. What is Traceability?
5. The uses and value of traceability
6. Revisiting Requirements Elicitation
As another year comes to an end, we wish you all a happy and prosperous new year!
2012 was an eventful year, with major releases for Requirements Composer. We also announced the next generation requirements management solution for complex & embedded systems from IBM – Rational DOORS Next Generation.
We moved our blog to the developerWorks platform, and we believe our readers have found it useful. Please share your comments and suggestions for the blog content, and any specific topics in requirements management that you want us to focus on in 2013.
Last but not least, as we mentioned in the last post, the call for speakers is now open for Innovate 2013 – The IBM Technical Summit. Submit your abstracts before January 14, 2013 to stand a chance of presenting at a conference with 4000+ attendees, and earn a free conference pass! Happy New Year!
Innovate 2013 – The IBM Technical Summit is here. The 2013 event promises to be even more exciting, with top-notch keynotes, over 450 breakout sessions, labs, certifications and our biggest exhibit hall ever. As in previous years, Requirements Management is one of the key areas of interest at Innovate, attracting speakers and attendees from across the globe representing a wide range of industries. In 2012, we had two tracks for Requirements Management with sixteen sessions each: one track focusing on IT and another on Systems Engineering. We had 14 real-life case studies, 2 panel discussions and 4 instructor-led sessions.
Managing requirements has always been a cornerstone of both software and systems development. The importance of the discipline continues to grow and is expected to take a leading role in the coming years. This is an opportunity to showcase your thoughts on the discipline, and on how requirements management tools like DOORS or Requirements Composer can aid in effectively managing requirements for project success. Here are some of the topics from last year, plus an expected list of topics:
- Requirements Management in Agile Projects
- Requirements Management for Mobile Development
- Managing requirements in developing Safety Critical Systems
- Developing and managing requirements specifications for contract agreement
- Requirements Driven Development: Understanding requirements and work items
- Requirements engineering and supporting layered requirements and models
- Delivering a specification perfect requirements set (document generation)
- Requirements Reuse: Methods and best practice
- Requirements management for complex systems and teams
- Using traceability to expose gaps/change to other requirements and across the lifecycle
- Requirements engineering for projects with complex systems and software
- Requirements definition and management case studies
- Requirements definition and management across the software lifecycle
- Elicitation techniques for requirements and use cases
- Agile software development and requirements modeling
- Requirements management for outsourced projects
- Defining and managing requirements across geographically distributed teams
- Metrics and analysis used in requirements management
- Integrating requirements with project and portfolio management
- Implications of regulatory compliance on the requirements management process
- Business specification-centric approaches
- Best practices in aligning business goals and IT
- Value-based requirements engineering
- Business modeling in requirements definition
- Requirements prioritization best practices and choosing your methodology
- Incorporating industry standards as reusable requirements
- Effective reporting using requirements and CLM information
- DOORS, Requirements Composer and other Rational products best practices
- Requirements engineering and product lifecycle management
Some session topics from Innovate 2012
- Iterative Requirements Analysis: Implementing Lean and Agile principles for Software Requirements Analysis (Nationwide Case Study)
- Visual definition in the requirements lifecycle: a conceptual framework
- How IBM Rational DOORS Helps JPL Get to Mars and Beyond: Best Practices in Metrics, Verification and Traceability
- Integrating IBM Rational DOORS with IBM Rational Team Concert – Lessons Learned at Raytheon
- Integrating Requirements and Models with IBM Rational DOORS and IBM Rational Rhapsody: Lessons Learned at Lockheed Martin MS2
- Writing Verifiable Requirements Is Not Easy
Share your experience, thoughts and best practices on requirements at an event attended by industry experts and IBM core development teams. Here are the top three reasons why you should submit your paper for Innovate 2013:
- Explore new areas – a free conference pass opening the doors to 450+ sessions, labs and demo booths
- Network with experts and peers – over 4000 professionals expected to attend the event
- Sharpen your technical know-how – learn from product and domain experts and from IBM core developers
As we reach the close of one year and move into a new one, it's often time to take stock of what we've been doing and to make plans for the New Year. We look at what we've been doing right and what we could change and improve. So, since this is a requirements management blog, I thought it would be worth posing the question of whether requirements management (in the domain of systems and product development - my focus area) is still relevant as we move into 2013, and giving my opinion on it.
I'm not really sure when requirements management as a formally recognized discipline can be said to have come into being, but I do believe that it really started to take shape in the early 90s, primarily based on work coming out of the aerospace industry. That's when commercial specialized tools for requirements management, such as IBM Rational DOORS (then known simply as DOORS, from a company called QSS), first emerged. In 2005, I was leading a team working on a campaign to promote the value of requirements management to a wider audience than a core set of requirements specialists. We declared 2005 the 'Year of Requirements Management' because of the discipline's increased recognition and the emergence of greater tool capabilities for making requirements more easily accessible to a wider set of stakeholders.
So as we move towards 2013, is requirements management still as relevant? Do we still have further to go in becoming more effective at it? A recent Aberdeen Group report, 'Managing Systems Design Complexity: 3 Tips to Save Time' by Michelle Boucher, presents a survey of the effectiveness of the systems engineering capabilities of system and product development organizations. Two of its three key recommendations relate directly to requirements management, in the areas of visual requirements definition and requirements traceability. In the other recommendation, on improving change management across engineering disciplines, Michelle says that impact analysis is core to such improvement, and that is enabled by requirements management and traceability. From the study, a clear link can be shown between more effective requirements management and traceability and business benefits such as reduced cycle times, improved quality and increased product revenues. I also recently heard from another analyst that one of the key challenges they hear from product development organizations is getting a better handle on the interrelationships between requirements across engineering disciplines, so they can respond more effectively to changes.
So my answer to the question I posed - is requirements management still relevant? - is a resounding YES! We've made significant progress, but the complexity of the systems we build has also increased, and we need to keep pace with changes in practices and technologies. So I expect effective requirements management to remain a cornerstone of successful product development, and practices and supporting tooling to continue to evolve.
But what do you think? Will requirements management be as important in the future? How will it, and how should it, change?
Last week the UK chapter of INCOSE (International Council on Systems Engineering) held their annual systems engineering conference on the Warwick University campus. I'd like to share some of what I heard during the conference, both on systems engineering in general, and more specifically on requirements management practices in the systems engineering domain.
One of the keynote speakers was Dr Sandy Wilson, President & Managing Director, General Dynamics UK. Dr Wilson spoke about the key challenges in the defense industry - the rate of change in threats and technology and the need to lower costs. He challenged the V model - said it's a nice diagram but its linearity is an issue - the world is not linear or rigid but the SE V diagram is. He spoke about the need for the defense industry to become more agile, but said that today change is cumbersome due to contractual issues and governance constraints. There are two main types of defense procurement done in the UK - the longer term needs are met by EPs (Equipment Programmes) and the urgent tactical needs by UORs (Urgent Operational Requirements). The former is bogged down in top level scrutiny and check boxes; the latter is helped by the top level sense of urgency and support. An example of a UOR was the decision to implement the multinational no-fly zone over Libya. Dr Wilson proposed that all defense projects should become more like UORs - more agile. He said that "an 80% solution delivered 1 year earlier is better than 90% delivered 4 years late". I heard that delivering incremental capability needs asset management and tracking, configuration management and a more agile approach to systems engineering - valuing "Product over Process". As well as changes in the way companies deliver capabilities, a change is needed in the way customers (governments) do their acquisition and contracts in order to enable more agility.
Dr Jeremy Dick of Integrate Systems Engineering, and co-author of the book 'Requirements Engineering', presented a case study from the aerospace industry on developing the assurance case for a (safety) critical system in parallel with requirements analysis, design, and verification & validation, using an extension of his technique for documenting the rationale for traceability relationships, known as 'rich traceability'. In addition to developing a requirements 'flow-down' (through levels of requirements to design), the 'evidence' supporting the flow-down is documented. In the early stages, the evidence can be how you expect the lower level requirements or design elements to satisfy the higher level, together with your evidence to suggest that your argument is sound. In parallel, your verification & validation strategies should evolve, including an argument and supporting evidence for how the test(s) will prove the requirement(s) is/are met. Jeremy was asked how the textual requirements, arguments and evidence would fit with an MBSE (Model-Based Systems Engineering) approach. Jeremy answered that he favours (and in fact came up with the concept of - ref: "The Systems Engineering Sandwich: Combining Requirements, Models and Design", Jeremy Dick, Jonathon Chard, INCOSE International Symposium, Toulouse, July 2004) the sandwich model - interleaved layers of requirements and modeling used to decompose a system specification and design (you can read more on that concept in the post 'Food for thought: The Systems Engineering Club Sandwich').
Chris Rolison, CEO of Comply Serve, continued the theme of progressive assurance with a focus on the rail industry. Chris highlighted the complexity challenges in major rail infrastructure projects, and the issues presented by paper-based systems, silos in organization structures, and the supply chain. Chris said that "up to 80% of the engineering requirements can change during design & build" - not because the customer changes their mind but because of all the external factors involved in building a rail system. Chris went on to describe a more collaborative, requirements-driven design approach where systems engineering principles are applied, supported by a collaborative platform (ComplyPro, which is based on IBM Rational DOORS).
Alastair Mavin of Rolls Royce 'lent' us his EARS (Easy Approach to Requirements Syntax) approach (link is to an IEEE publication - sign in required), an application of a template with an underlying rule set on how to describe requirements using natural language, but in a more structured, consistent way. To give a flavour of EARS (my illustration, not Alastair's): the event-driven pattern takes the form 'When <trigger>, the <system name> shall <system response>', and the state-driven pattern 'While <state>, the <system name> shall <system response>'. Alastair described the latest version of the template, EARS+ (or as he nicknamed it, 'Big EARS'!), and the benefits of the approach - simplicity and structure combined.
I could go on for pages about all of the great content shared at this excellent event but I'll leave it there with the main requirements related topics, except to quote from the keynote speaker on day 2: "The core of Systems Engineering is defining requirements and delivering against them". I'd put it this way - you can't have successful systems engineering without effective requirements management.
Neal Middlemore has over 14 years of experience in requirements management, encompassing the associated disciplines of requirements change management and validation/verification. Neal comes from an avionics systems engineering background and has been working with the DOORS product for over 10 years.
One of the most fundamental benefits that businesses want to get from requirements management tools is consistent traceability. Whether it is an IT system or an aircraft carrier being developed, the levels of complexity involved mean that traceability across multiple levels of requirements, from stakeholder requests to detailed implementation, is not simple to maintain manually.
Further hurdles are put in our way by the need to comply with legislation: many different industries these days have requirements placed upon them by government and international standards bodies, along with internal corporate standards.
To prove compliance with these legislative rules, projects must not only prove that requirements are being managed but also provide the how and where of their management. Which features relate to which stakeholder requests? What aspects of the solution satisfy safety regulations? Has the realization of the requirement been tested, and by whom, on what platform?
To answer these questions it is necessary to utilize the traceability capabilities provided by modern tools. But it's not enough to let a tool decide how things relate to each other, nor to let individual users decide these relationships. Traceability needs to be considered in the larger context of how you report on it to answer the very questions that led you to consider traceability in the first place. It's more than 'does object A relate to object B?'.
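To make that concrete, here is a minimal sketch in Python (my illustration over a hypothetical link graph - not the API of DOORS or any other tool) of the kind of questions a traceability report should answer:

```python
# A minimal sketch of traceability as a reportable graph (hypothetical data).
from collections import defaultdict

# Hypothetical artifacts and typed links between them
links = [
    ("STK-1", "satisfied_by", "SYS-10"),
    ("STK-2", "satisfied_by", "SYS-11"),
    ("SYS-10", "verified_by", "TC-100"),
]

index = defaultdict(list)
for source, link_type, target in links:
    index[(source, link_type)].append(target)

def uncovered(artifacts, link_type):
    """Return artifacts with no outgoing link of the given type - a coverage gap."""
    return [a for a in artifacts if not index[(a, link_type)]]

stakeholder_reqs = ["STK-1", "STK-2"]
system_reqs = ["SYS-10", "SYS-11"]

print(uncovered(stakeholder_reqs, "satisfied_by"))  # [] - all flowed down
print(uncovered(system_reqs, "verified_by"))        # ['SYS-11'] - a testing gap
```

The value comes from querying and reporting across the whole graph of agreed relationship types, not from inspecting any single link.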
An initiative to define an information architecture for your governance solution will assist in defining your process artifacts and their relationships to one another. Traceability then becomes an asset rather than an overhead. A good way to begin such an initiative is to hold a workshop attended by all project stakeholders, i.e. those who have a vested interest in ensuring project success. Most likely these attendees will have a good understanding of the process and what the project needs to deliver. Undoubtedly each will also bring differing perspectives and understandings of what information is required.
The test manager may want to see traceability from individual test artifacts through to the requirements being tested, or to determine whether regulatory compliance has been achieved and verification methods have been agreed.
A subject matter expert (SME) may want to see the design in the context of the system level requirements and how it relates to stakeholder requirements.
Everybody wants to see something applicable to their job role, often even wishing to see things outside their typical discipline domain. By asking the questions and documenting the answers you can start to put together an information architecture that makes sense for your project. It's likely that many information architectures will exist: not every project is the same, and each will have a differing set of required attributes, views and reports of project information, and an agreed structure of inter-artifact relationships.
Modern tools can often create templates for this kind of information, allowing the deployment of additional artifact containers as and when required. These can even enforce traceability to ensure integrity across the project. Above all, it is vital that the needs of the project are documented and communicated, alongside the information architecture, to all project members.
Last week I was really fortunate to spend a couple of days in London presenting to and talking with clients, business partners and industry analysts. It's always so good to hear what's really going on out there and to get many different perspectives on what's important today and for the future. The first day was at IBM's Innovate UK 2012 event, where I was fortunate to be asked to present on all the really exciting new stuff we've done in the past year to help organizations building today's and the next generation of smarter products and systems, with particular focus on providing solutions for systems engineers and embedded software developers. You can catch the absolute latest news on our recent launch webpages. That session included a whistle-stop tour of the developments in requirements management for complex systems with Rational DOORS 9.4 and our plans for DOORS Next Generation. Whistle-stop because we also had so much news to get through in architecture & design, planning, change & configuration management and quality management, as well as industry-specific solutions for A&D, automotive, medical devices and electronic design. And because on the following day, at IBM's Southbank facility, we had a whole day dedicated to topics related to DOORS.
At the DOORS customer day we had attendees from across several industry sectors, including transportation, aerospace & defense, banking and mail services. The day kicked off with Morgan Brown presenting the latest on IBM's requirements management and DOORS strategy. Morgan told us how the DOORS 9.x series is, and will continue to be, developed and enhanced to meet the needs of the large install base, in parallel with the introduction of DOORS Next Generation (DOORS NG). DOORS NG is planned to take the best paradigms for managing structured requirements from DOORS 9.x and marry those with the requirements management and team collaboration capabilities that have been developed on the Jazz collaborative lifecycle management platform over the last 4 or so years (and are in use in the form of Rational Requirements Composer). The development of DOORS NG is out in the open on jazz.net, where milestone builds can be downloaded, discussions held, defects/enhancements raised and plans viewed. DOORS NG has gone through four beta releases and is expected to be released in late November. Morgan explained that in its first release, DOORS NG is not intended to replace the DOORS 9.x product line; rather, it is expected that existing DOORS customers will try out DOORS NG on pilot and new projects, and will use the interoperability capabilities of ReqIF data exchange and cross-tool traceability linking to exchange and/or link data between DOORS 9.x and DOORS NG. DOORS NG will also appeal to those looking for a requirements management tool on an integrated platform with design, test management and task/change management capabilities. Morgan reminded the audience of an IBM statement released earlier this year that existing DOORS customers with active support & subscription would be entitled to use both DOORS 9 and next generation capabilities when they become available. This was well received by the attendees, since it means they can try out DOORS NG when it ships without the need for an additional purchase.
Of course, a day of technology insights never goes by without some piece of tech throwing an unexpected spanner in the works. This time it was the projector and the next presenter's Apple Mac that refused to talk to each other, so instead of a flow into a demo of DOORS NG, next up was Neal Middlemore to tell us about the improved integration of requirements and quality management with DOORS 9.4 and Rational Quality Manager (RQM) 4.0. This release significantly enhances the integration, bringing it in line with IBM's strategy to support OSLC - Open Services for Lifecycle Collaboration. OSLC is a new approach to tool integration that is open and vendor neutral. What's really different about OSLC is that data no longer needs to be copied or synchronized between tools in order to create cross-tool or cross-discipline visibility or relationships. So now quality professionals working in RQM can see requirements in DOORS and create links between test cases (and now, because some organizations require it, test steps) and the requirements they are validating; and requirements professionals in DOORS can see linked test case information, including test results, without either needing to leave the comfort of their familiar tool or data being copied between the two tools. Neal demonstrated the value of the integration to requirements & quality professionals and showed how RQM can be used to manage manual testing or hook up with a number of IBM and partner solutions for various forms of test automation. You can also see a demo of the DOORS - RQM integration on YouTube.
So, technical issue solved, it was back to Jon Walton to give a demo of DOORS Next Generation using the Beta 4 release. Jon spent most of his time in the web client, highlighting the support for key DOORS paradigms such as hierarchical structured requirements documents, and showed off the plethora of new capabilities provided by the Jazz platform, such as database-wide requirement reuse, a graphical traceability view, requirements definition techniques (use case diagrams, storyboards), cross-discipline dashboards (containing requirements project info mashed up with info from design, quality and task management) and task management. Jon also showed the desktop client of DOORS NG, which looks very familiar to existing DOORS users with some twists (reuse of requirements across documents, for one) - the desktop client will primarily be for users who need to do extensive editing of large requirements documents. If you're currently using DOORS 9.x, this YouTube video gives a quick preview of DOORS NG and how it's both similar to and different from DOORS 9.x. Watch this space for more to come on DOORS NG later this month.
Back to the earlier lifecycle integration theme started by Neal: next to present was Steve Rooks, on how to use DOORS with IBM's solution for model-based systems engineering and model-driven embedded software development, Rational Rhapsody, to link requirements and design activities. Rational Rhapsody enables elaboration of requirements and construction of systems and software architectures using SysML and UML. Rhapsody Design Manager provides an additional level of design collaboration capabilities. Models can be published to and/or stored and managed in a central repository, making them more easily accessible to a wider set of stakeholders so that designs can be better communicated and understood by all those involved in specifying, designing, building and validating a product or system. Rhapsody Design Manager uses OSLC to facilitate linking of design elements to other lifecycle artifacts - requirements, test cases, work items, etc. As with the DOORS-RQM scenario described above, a systems engineer or software architect working in Rhapsody can see requirements in DOORS and easily create links between requirements and design model elements. Requirements and requirement links can even be included in model diagrams. And of course a DOORS user can see links to design elements without leaving DOORS, or can navigate into Rhapsody Design Manager to participate in design reviews. You can read more about linking requirements and design and the DOORS-Rhapsody Design Manager integration in my recent post 'The House That Paul Built', which talks about a recent webcast on the topic.
After lunch, an IBM business partner, Kovair, was invited to present on how their Kovair Omnibus solution provides bridges, synchronization and workflow support across multiple tools from multiple vendors. It's a common situation to find yourself trying to enact processes and workflows across a diverse set of tools. Kovair talked about their support for OSLC to widen the number of tools they can help link together, but also highlighted scenarios where you would still want to copy or transform data between tools - it's not a choice of Link or Sync, it's Link and Sync as appropriate.
The next session was presented by Paul Fechtelkotter, market manager for energy & utilities at IBM Rational. Paul gave a really interesting presentation on the challenges of complex systems development for nuclear power plants and how the nuclear industry is now adopting systems engineering best practices, starting with requirements management, to enable better change management, traceability, impact analysis and compliance support. You can learn more about how IBM Rational is helping the nuclear industry on our dedicated web page.
Unfortunately I had to leave after Paul's session and didn't catch the remainder of the afternoon, but as you can see it was a day packed full of information. I hope you find my summary and links for more information useful. If you have any questions or comments on any of the topics I've covered here or indeed anything on IBM's requirements management strategy, Rational DOORS and the lifecycle integrations, please don't be afraid to ask by using the blog comments facility.
Eric has worked in the software development industry for over 20 years and is co-author of UML for Database Design and UML for Mere Mortals, both published by Addison Wesley. Eric is currently responsible for capabilities marketing of Rational's application lifecycle management solutions, including Agile Software Delivery, Quality & Test Management, Requirements Management and Collaborative Lifecycle Management. He rejoined IBM in 2008 as the team leader for InfoSphere Optim Solutions and was later responsible for Information Governance Solutions. Prior to rejoining IBM, he worked for Ivar Jacobson Consulting as VP of Sales and Marketing. Before joining Ivar Jacobson, he was director of product marketing for CAST Software. In his earlier time at IBM, Eric held several roles within the Rational Software group, including program director for business, industry and technical solutions, product manager for Rational Rose and team market manager for Rational desktop products. He also spent several years with Logic Works Inc. (acquired by Platinum Technologies and CA) as product manager for ERwin.
As I think about IT today, we are seeing in some ways a rebirth of the importance of architecture and requirements. We are in an era of "ANY" - meaning that applications and data can be accessed from anywhere, by anyone, at any time.
Looking back at the applications of yesteryear (two or three years ago), we didn't expect much from web or mobile-based applications. We could view, run some reports or do some basic tasks, but to do the real work we needed to go to the fat client. Now, in today's era of any, the user interface may look different, but the capabilities had better be the same, since we expect near-full capabilities no matter our device or interface.
This puts a new-found set of requirements on applications and their development, and is making modeling and requirements (analysis and design) relevant again, but with a new twist - AGILITY. It is no longer a question of "what platform am I developing for" - the question is how quickly we can get it up and running on the latest version of Apple, Android, HTML5 and whatever other platforms our clients expect the application to run on... and it had better run on all of the latest versions, with no delays, when updated operating systems come out.
The question I often receive now is: "can I be agile and meet these needs at the same time?" The plain answer is yes, you can. However, agility doesn't mean you can ignore requirements and design. I am not talking about write-once, run-anywhere; rather, about understanding the true requirements so that the various development teams can articulate them in code, brought to life as features for the users as they expect to see them. Users are looking for the application to be specific to their hardware/OS (iPad/AppleOS, Droid/Android…), as the hardware has become the platform for not just running the application but also its expected look, feel and usability. This often means different developers for different deployment platforms, certainly at the user interface level.
Designing applications requires that we are prepared. Architectures must be solidified and communicated. Requirements must be consistent and shared. We must model architectures so that developers can build to the designs and not recreate their own, wasting time and resources, and we must share those designs across the team.
Does this get in the way of agility? NO, it will speed agility. By sharing designs and assigning tasks based on architecture needs, we can speed time to market and our ability to deliver high quality software. In the era of any, we may have multiple teams working on the same front-end capabilities for different platforms even though the back end is the same. But the more they can share, the faster they can deploy; and the better they capture the right requirements from users, the more satisfied those users will be. We see people changing their preferred platform as employers, vendors and suppliers change requirements, so we need to be prepared for the customer who is using an iPad today to be using an Android device tomorrow, with the same requirements on the application. Just look at how the world of Blackberry has evolved.
So, as you think about your next project, don’t skimp on requirements and architectures or you may be limiting your agility in the future rather than speeding your time to satisfied clients.
Nowadays, software is present everywhere, and software projects are becoming more complex in terms of scope, time and cost. With that change comes an increased potential failure rate for software projects. How can these potential failures be avoided? While a guarantee may not be possible, adequate investment can be made in managing the risk of failure. A typical textbook definition of software risk management is the identification of risks, analysis of identified risks and establishment of plans to address those risks. The ultimate goal of risk management is to avoid the occurrence of such risks. As with requirements management, risk management needs to start early in the development life cycle.
ISO/IEC 16085:2006 defines risk as a combination of the probability of an event and its consequence. What are the major sources of risk in a software project? An obvious answer today would be the prevailing uncertainty added by time and budget pressures. Inaccurate requirements capture is another important source of increased risk in the later stages of the life cycle. Boehm has done some phenomenal work on managing risks in software projects. He essentially identifies ten risk aspects - personnel shortfalls, unrealistic schedules & budgets, development risks (building the wrong functions, properties or user interfaces), adding unnecessary features, continuing requirements changes, shortfalls (in externally furnished components & performed tasks), performance shortfalls and technological strains.
So how do you best manage the risks? Boehm divides the first level of activities into assessment and control. Assessment essentially consists of identifying the risks, analyzing the identified risks and finally prioritizing them. The control aspect deals with planning, resolution of identified risks and monitoring. If you consider the Top 10 items he identified, requirements mismatches, requirements changes and architecture performance & quality are among the top. Various techniques are discussed in the risk management literature, which is beyond the scope of this blog post; they range from basic ones, like maintaining a risk register, to decision tree analysis and risk exposure profiling. Murray Cantor, a Distinguished Engineer at IBM, regularly writes about risk in his blog here.
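To make risk exposure profiling concrete, here is a minimal sketch in Python (my illustration with made-up numbers, not taken from Boehm or any standard): exposure is the probability of the risk occurring multiplied by the size of the loss if it does, and the register is worked in descending order of exposure.

```python
# Risk exposure profiling over a hypothetical risk register:
# exposure = probability of occurrence x loss if it occurs.

risk_register = [
    {"risk": "Continuing requirements changes", "probability": 0.6, "loss": 50},
    {"risk": "Personnel shortfalls",            "probability": 0.3, "loss": 80},
    {"risk": "Performance shortfalls",          "probability": 0.2, "loss": 40},
]

for entry in risk_register:
    entry["exposure"] = entry["probability"] * entry["loss"]

# Address the highest-exposure risks first
for entry in sorted(risk_register, key=lambda e: e["exposure"], reverse=True):
    print(f"{entry['risk']}: exposure = {entry['exposure']:.1f}")
```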
What are some generic strategies for managing risks? The predominant method is to buy more information; for example, early in the development cycle you could try prototyping to make sure you and your client share the same understanding. This also helps in revealing the possible root causes of risks. Other options are to avoid the risk by de-scoping requirements, to transfer it (for example, outsourcing the component to an expert vendor or a sub-contractor), to have mitigation plans or, as the last option, to accept the risk and have a Plan B. ISO 31000:2009, a relatively new standard introduced in 2009 and related to risk management, provides a generic framework for a risk management process which a team can consider implementing.
How can tools help manage the risks?
Risk includes both opportunities and threats - that is, a risk can have both a positive and a negative effect. Tools help in implementing an integrated risk process that maximizes value creation, resulting in faster time to market and improved productivity, while avoiding the threats of cost and time overruns and project closures. Tools can help significantly in two ways: conducting the qualitative and quantitative risk analysis activities, and actually implementing the outcomes for managing risks. Check this case study of Chubb Insurance, which effectively manages its risk using IBM Rational Focal Point. And finally, here is a developerWorks article on how to calculate your return on investment for software and systems.
You've bought the plot of land for your dream home. You have your list of requirements - 4 bedrooms, 3 bathrooms, spacious kitchen, 2 living rooms, 2 garages, landscaped gardens, etc. Would you be happy to simply hand that list to the builders and let them start work? Unlikely, I think. Typically, you call in an architect, who can take your quantitative requirements and qualitative desires and produce a blueprint, the architectural design that incorporates your wishes where feasible and adds creative flourish based on the architect's knowledge of house design. The blueprint enables you and the builders to have a much clearer picture of the desired end result than that original list of requirements. And it affords you the opportunity to influence the architecture, and for the builders to question and look at feasibility & cost options, before the foundations are dug and the first bricks are laid.
The same applies in product development. Systems engineers who are responsible for the holistic product specification and design don't just use textual requirements lists to capture the problem domain and describe the proposed solution. They analyze the requirements, identifying integrated scenarios, and often depict those using modeling languages such as UML or SysML. These modeled scenarios are easier to discuss and review with all stakeholders, and as the systems engineer evolves the proposed architecture (also in the same modeling language) they can run the scenarios against the architecture in model simulations to find inconsistencies or gaps in the requirements and flaws in the design, long before any software is coded, circuit boards are soldered or metal is welded.
So what value are our textual requirements lists - should we throw them away in favor of models? Well, not everything can be expressed in the model, and not everyone involved in a development effort may be using models. Going back to the house building analogy, there are contracts and numerous standards and regulations to be adhered to, and simply details that would make the blueprints unreadable. The various contractors (and I know from recent experience that sub-contracting is the name of the game in house building these days!) involved in the building process need to ensure that they can meet the contractual and regulatory demands while delivering against the architecture. Again, this is the same in product development, except that in many cases, particularly for safety-critical systems, traceability and demonstration of conformance to requirements and compliance with standards & regulations are demanded. This requires the ability to integrate requirements and modeling workflows, easily link requirements and design elements, and report on that linked information.
The need and solutions for this capability are nothing new. Integrations between requirements management and modeling tools have existed for many years (I think I started using such an integration in the early 90's and I'm sure they preceded that time). But I know from first hand experience of using and indeed writing such integrations that they've not always been optimal in the way integration is performed and in the workflow that is supported. Typically it's meant synchronizing (i.e. copying) data between tools in order to create the traceability links in one of the tools. This brings up all sorts of issues like 'which tool is the master?', 'am I looking at the latest data?', 'what happens when information is deleted?', etc.
With Open Services for Lifecycle Collaboration (OSLC) we now have a much better way to link data across product development and operations tools, even when the tools may be from different vendors, open source or in-house. OSLC has learnt from the principles of the World Wide Web and enables tool data to be shared and linked where it resides (called a 'Linked Data' approach). OSLC provides a common vocabulary for 'resources' in particular domains, i.e. what a requirement, test case, design element, change request, work item, etc. looks like, so that regardless of tool, technology or vendor, tools implementing OSLC specifications can share and link data.
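To give a flavour of what that means in practice, here is a hypothetical sketch in Python (the URL and response contents are illustrative, not any specific tool's API): an OSLC resource is just a web resource, so another tool fetches or links to it where it lives rather than copying it.

```python
# A sketch of the Linked Data idea behind OSLC: a requirement lives at a URL
# in its own tool; other tools link to that URL instead of copying the data.
import requests  # third-party HTTP library

# Hypothetical requirement resource exposed by a requirements management tool
url = "https://rm.example.com/requirements/1234"

# OSLC resources can be requested as RDF
response = requests.get(url, headers={"Accept": "application/rdf+xml"})

# The RDF describes the requirement using the common OSLC RM vocabulary,
# e.g. a dcterms:title property and typed links such as oslc_rm:validatedBy
# pointing at a test case held in another tool.
print(response.text)
```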
With Rational DOORS 9.4 and Rational Rhapsody 8.0 with Design Manager 4.0, IBM is utilizing OSLC to provide a simplified workflow for linking requirements analysis and design. On September 20, Paul Urban (if you've been wondering about this blog post's title, now you know the Paul I'm speaking of), Market Manager for IBM Rational Rhapsody, presented this simplified workflow and its benefits on an IEEE Spectrum webcast sponsored by IBM. You can watch and listen to the replay at your leisure here. I hope you enjoy it - please let Paul and me know what you think by leaving feedback on this blog post.
The importance of communication and collaboration in developing and managing good requirements was discussed in our earlier post on How to enable effective requirements communication and collaboration. In this guest blog post, Melissa Robinson, a Senior Technical Specialist at IBM, writes about how Rational DOORS addresses this aspect with Discussions. Melissa started her career at Telelogic, enabling Product Management with technical support around requirements management, and spent 3 years there supporting clients getting started with requirements management. After IBM acquired Telelogic in 2008, Melissa transitioned roles to support clients with Enterprise Architecture initiatives. She received the Carnegie Mellon certification in Enterprise Architecture in 2008 and is TOGAF certified. Melissa now supports clients getting started with evaluating and implementing both requirements management and enterprise architecture solutions.
Note: Please click on the screenshots for a better view
Why did we make this decision? Who made this decision? Who approved this requirement?
These are some of the questions we can help answer with effective collaboration messaging in DOORS. Collaboration messaging is now enabled in DOORS and DOORS Web Access (DWA) with the addition of DOORS Discussions. Discussions allow users to contribute comments on requirement objects or requirement modules; users can even add comments to baselined requirements. Discussions offer a way of having a conversation about requirements, and they really break the communication barrier by allowing users to easily make comments or start a discussion on any requirement, including read-only requirements. Discussions can be created in DOORS or DWA and viewed in both. Both the DWA Editor and DWA Reviewer roles can contribute to Discussions. Because Discussions capture comments, you can later review ancillary information about your requirements, and everyone can contribute to a full understanding of them.
Here is a simple scenario for using DOORS Discussions. A DWA Reviewer user creates a Discussion on a requirement. A DOORS user then reviews this comment and contributes a comment on the requirement. The DWA user reviews the latest comment and closes the Discussion.
A DOORS user, Susan, reviews the current Discussion created by a DWA Reviewer user, Kavita. Susan can open the requirements module with a pre-created Discussion view to review the Discussions. Below, Susan reviews the Discussion on requirement AMR-STK-66.
Susan can contribute a comment to the open Discussion.
Kavita reviews the new Discussion comment in DWA. Notice that Kavita is a Reviewer in DWA. As a Reviewer, she can create and add comments to Discussions. Kavita can also close Discussions that she started. Later, Kavita can contribute another comment to the open Discussion.
As the person who first opened the Discussion, Kavita can close it. Later, in DOORS, Susan can review the latest status of the Discussion using the Discussion Thread view. As a Database Manager, Susan can choose to re-open the closed Discussion at any time.
Discussions open up the communication thread between several different types of DOORS users. Discussions allow requirements reviewers to exchange views and comments about the content of a requirements module or the content of a requirement object in a module.
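If it helps to see the workflow boiled down, here is a conceptual sketch in Python (not DOORS or DXL code - purely an illustration of the roles and states described above):

```python
# A conceptual model of a DOORS Discussion: comments attach to a requirement,
# the creator can close the thread, and an administrator can re-open it.
from dataclasses import dataclass, field

@dataclass
class Discussion:
    requirement_id: str
    created_by: str
    comments: list = field(default_factory=list)
    status: str = "open"

    def add_comment(self, user: str, text: str) -> None:
        self.comments.append((user, text))

    def close(self, user: str, admin: bool = False) -> None:
        if user == self.created_by or admin:
            self.status = "closed"

    def reopen(self, user: str, admin: bool = False) -> None:
        if admin:  # e.g. Susan acting in the Database Manager role
            self.status = "open"

# The scenario from the post: Kavita (DWA Reviewer) starts the thread,
# Susan (DOORS user) contributes, and Kavita closes the Discussion.
d = Discussion("AMR-STK-66", created_by="Kavita")
d.add_comment("Kavita", "Why did we make this decision?")
d.add_comment("Susan", "It was agreed at the stakeholder review.")
d.close("Kavita")
print(d.status)  # closed
```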
We hope this post gave you a sneak preview of how DOORS Discussions help various stakeholders collaborate and communicate effectively during requirements management. Feel free to contact melissarobinson[at]us.ibm.com if you have any queries about the topic. Melissa will discuss the topic in detail in an upcoming webcast on October 5, 2012. Don't miss the opportunity to watch the action live.
Register now @ http://bit.ly/DOORS_Discussions