In my career I’ve been deeply involved with both modeling and requirements management disciplines and tools, so it always intrigues me when I hear debates over whether largely text-based (sometimes referred to as ‘traditional’ or ‘document-based’) or model-based approaches to defining and managing requirements are the right way to go.
We’ve all heard the argument that a picture paints a thousand words, but I’ve always vividly remembered something I heard at a conference some years ago: “I’d have taken a thousand words over this one unreadable diagram.”
My belief is that it is not an either-or decision. You need both. Models can add clarity to requirements specifications and can bring together a more holistic understanding of what’s expressed in the requirements. Models can be walked through with stakeholders and, with the right language and tools (like SysML or UML in IBM Rational Rhapsody), they can even be run to validate that what is captured in the model is correct, consistent and complete. But what if you have contractual requirements to manage, documents of regulations or standards to comply with, or complex performance or availability constraints? You don’t want to clutter your model with so much detail that it becomes unusable.
My preference is for a combination of textual requirements and models that can be described as the ‘Systems Engineering Club Sandwich’ (references 1 & 2). Textual requirements form the layers of bread (maybe a bit dry on their own) and are supplemented by models that form the layers of filling, which are richer and more expressive. Together they form a tasty combination that helps to explore and elaborate requirements, perform decomposition and allocation, and maintain traceability. I recently got together with my colleague Paul Urban to record a 30-minute webcast entitled ‘The Tasty Way to Tackle Complexity - The Systems Engineering Club Sandwich of Requirements & Models’
where we take a look at some engineering challenges, where requirements work goes wrong, how the club sandwich approach works and how to use requirements and models together effectively. So if this hors d'oeuvre has made you hungry for more, please take a look. Paul and I are really interested to hear what you think.
1. "The Systems Engineering Sandwich: Combining Requirements, Models and Design", Jeremy Dick, Jonathon Chard, INCOSE International Symposium, Toulouse, July 2004.
2. "Requirements Engineering", Hull, Jackson & Dick, Springer, 2004.
As we reach the close of a year and move into a new one, it's often time to take stock of what we've been doing and to make plans for the New Year. We look at what we've been doing right and what we could change and improve. So since this is a requirements management blog, I thought it would be worth posing the question, and giving an opinion on, whether requirements management (in the domain of systems and product development - my focus area) is still relevant today as we move into 2013.
I'm not really sure when requirements management as a formally
recognized discipline can be said to have come into being, but I do believe that
it really started to take shape in the early 90's, primarily based on work
coming out of the aerospace industry, and that's when commercial specialized
tools for requirements management, such as IBM Rational DOORS (then known
simply as DOORS from a company called QSS), first emerged. In 2005, I was leading
a team working on a campaign to promote the value of requirements management to
a wider audience than a core set of requirements specialists. We declared 2005 as the 'Year of Requirements Management' because of its increased recognition as a discipline and the emergence of greater tool capabilities for making requirements more easily accessible to a wider set of stakeholders.
So as we move towards 2013, is requirements management still
as relevant? Do we still have further to go on becoming more effective at it?
In a recent Aberdeen Group report, ‘Managing Systems Design Complexity: 3 Tips to Save Time’ by Michelle Boucher, which reports a survey of the effectiveness of the systems engineering capabilities of system and product development organizations, two
of the three key recommendations made are directly related to requirements
management in the areas of visual requirements definition and requirements
traceability. In the other recommendation on improving change management across
engineering disciplines, Michelle says that impact analysis is core to such
improvement and that it’s enabled by requirements management and traceability. From the study, a clear link can be drawn from more effective requirements management and traceability to business benefits such as reduced cycle times, improved
quality and increased product revenues. I also recently heard from another
analyst that one of the key challenges they are hearing from product
development organizations is getting a better handle on interrelationships
between requirements across engineering disciplines, so they can respond more
effectively to changes.
So my answer to the question I posed, 'Is requirements management still relevant?', is a resounding YES! We’ve made significant progress
but the complexity of the systems we build has also increased and we need to keep
pace with changes in practices and technologies, so I expect effective
requirements management to remain a cornerstone of successful product
development and for practices and supporting tooling to continue to evolve.
But what do you think? Will requirements management be as
important in the future? How will it/should it change?
I was lucky enough last week to travel to the INCOSE (International Council on Systems Engineering) International Symposium 2012 near Rome, Italy.
An excellent opportunity to meet the systems engineering community and hear
about their interests and concerns. We had lots of traffic to the very stylish
IBM booth where we talked about the IBM Rational solutions for systems
engineering and the latest from IBM Research on tool interoperability and
design optimization & trade-off. I’d like to claim the traffic was due to my presence but in fact there was lots of excitement and interest in the
must-have giveaway of the conference, the IBM Limited Edition of the Systems Engineering for Dummies book (if you weren’t there and don’t have a copy, you can download a PDF version).
Being at the INCOSE event reminded me of the very active discussion I recently provoked on the INCOSE LinkedIn group with the posting of the link to my
previous blog post ‘Traceability – How Much is Enough?’
It’s a great read, with some very provocative statements: on one side, those who say traceability isn’t useful at all and is the root cause of failure on projects that overrun and overspend; on the other, those who say it’s absolutely vital on safety-critical systems or where the project is contract-driven. In the end I
think some consensus was reached between these two camps that ‘just enough’
traceability to keep a project on track, provide customer/market need context
to engineers, facilitate impact analysis, and (if needed) to meet industry
standards and regulations, is sufficient. Any more is excessive and wasteful and likely to bog down progress towards delivering innovative products and systems.
During a quiet time at the IBM booth, I also had a chance to
chat with my colleague Brian Nolan (marketing manager for aerospace &
defense industry at IBM Rational) about effective traceability, since Brian
is very interested in this topic and has presented on a Dr. Dobb's webcast on ‘3 Ways to Improve Traceability and Impact Analysis’.
Brian believes in what I would describe as ‘traceability by design’, meaning
that traceability is automatically established while you decompose your system
design (for example, use case to use case realization to sequence diagram and
so on). This discussion also reminded me of what another colleague Greg Gorman
(program director for IBM systems and software engineering solutions and the
INCOSE Corporate Advisory Board member from IBM) described several years ago as
‘link while you think’, meaning traceability is created by the tools, while you
are performing requirements decomposition, design and development, rather than
as an overhead activity afterwards.
I think we’ve now moved some way beyond ‘link while you think’. An information model with ‘just enough’ traceability for your project needs is still essential to avoid traceability spiraling out of control. But new approaches such as Linked Lifecycle Data from the OSLC (Open Services for Lifecycle Collaboration) community, together with tools that recognize implicit traceability, provide new ways to visualize lifecycle traceability and perform effective impact analysis. We can make traceability work for us: engineering becomes more agile, while staying within cost and schedule and producing innovative, higher quality products and systems.
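To make the idea of impact analysis over traceability links more concrete, here is a minimal sketch in Python. The artifact names are invented for illustration, and no real tool stores its links this simply: the point is just that traceability forms a directed graph, and impact analysis is a traversal of everything downstream of a changed artifact.

```python
# Illustrative sketch only: traceability links as directed edges, impact
# analysis as a downstream traversal. Artifact IDs are invented examples.
from collections import defaultdict

class TraceGraph:
    def __init__(self):
        # Maps an artifact to the artifacts derived from it.
        self.links = defaultdict(set)

    def link(self, source, target):
        """Record that `target` traces back to (satisfies/refines) `source`."""
        self.links[source].add(target)

    def impact_of(self, artifact):
        """All downstream artifacts potentially affected if `artifact` changes."""
        impacted, stack = set(), [artifact]
        while stack:
            for nxt in self.links[stack.pop()]:
                if nxt not in impacted:
                    impacted.add(nxt)
                    stack.append(nxt)
        return impacted

g = TraceGraph()
g.link("REQ-1", "REQ-1.1")     # stakeholder req -> system req
g.link("REQ-1.1", "DESIGN-A")  # system req -> design element
g.link("REQ-1.1", "TEST-7")    # system req -> test case
# Everything touched by a change to REQ-1:
print(sorted(g.impact_of("REQ-1")))
```

With ‘just enough’ links recorded this way, a change to a top-level requirement surfaces every dependent design element and test case in one traversal, which is exactly the question engineers need answered before accepting a change.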
Last week the UK chapter of INCOSE (International Council on Systems Engineering) held their annual systems engineering conference on the Warwick University campus. I'd like to share some of what I heard during the conference, both on systems engineering in general, and more specifically on requirements management practices in the systems engineering domain.
One of the keynote speakers was Dr Sandy Wilson, President & Managing Director, General Dynamics UK. Dr Wilson spoke about the key challenges in the defense industry - the rate of change in threats and technology and the need to lower costs. He challenged the V model - said it's a nice diagram but its linearity is an issue - the world is not linear or rigid but the SE V diagram is. He spoke about the need for the defense industry to become more agile but that today change is cumbersome due to contractual issues and governance constraints. There are two main types of defense procurement done in the UK - the longer-term needs are met by EPs (Equipment Programmes) and the urgent tactical needs by UORs (Urgent Operational Requirements). The former is bogged down in top-level scrutiny and check boxes. The latter is helped by the top-level sense of urgency and support. An example of a UOR was the decision to implement the multinational no-fly zone over Libya. Dr Wilson proposed that all defense projects should become more like UORs - more agile. He said that "an 80% solution delivered 1 year earlier is better than 90% delivered 4 years late". I heard that delivering incremental capability needs asset management and tracking, configuration management and a more agile approach to systems engineering - valuing "Product over Process". As well as changes in the way companies deliver capabilities, a change is needed in the way customers (governments) do their acquisition and contracts in order to enable more agility.
Dr Jeremy Dick of Integrate Systems Engineering
and co-author of the book 'Requirements Engineering' presented a case study in the aerospace industry on developing the assurance case for a (safety) critical system in parallel with requirements analysis, design, verification & validation, using an extension of his technique for documenting the rationale for traceability relationships known as 'rich traceability'. In addition to developing a requirements 'flow-down' (through levels of requirements to design), the 'evidence' supporting the flow-down is documented. The evidence in the early stages can be how you expect the lower level requirements or design elements to satisfy the higher level, and your evidence to suggest that your argument is sound. In parallel your verification & validation strategies should be evolved, including an argument and supporting evidence for how the test(s) will prove the requirement(s) is/are met. Jeremy was asked how the textual requirements, arguments and evidence would fit with an MBSE (Model-Based Systems Engineering) approach. Jeremy answered that he favours (and in fact came up with the concept of - ref: "The Systems Engineering Sandwich: Combining Requirements, Models and Design", Jeremy Dick, Jonathon Chard, INCOSE International Symposium, Toulouse, July 2004) the sandwich model - interleaved layers of requirements and modeling used to decompose a system specification and design (you can read more on that concept in the post 'Food for thought: The Systems Engineering Club Sandwich').
Chris Rolison, CEO, Comply Serve
, continued the theme of progressive assurance with focus on the rail industry. Chris highlighted the complexity challenges in major rail infrastructure projects, and the issues presented by paper-based systems, silos in organization structures, and the supply chain. Chris said that "up to 80% of the engineering requirements can change during design & build" - not because the customer changes their mind but because of all the external factors involved in building a rail system. Chris went on to describe a more collaborative, requirements-driven design approach where systems engineering principles are applied, supported by a collaborative platform (ComplyPro, which is based on IBM Rational DOORS).
Alastair Mavin of Rolls-Royce 'lent' us his EARS (Easy Approach to Requirements Syntax; the link is to an IEEE publication - sign-in required), an application of a template with an underlying rule set on how to describe requirements using natural language but in a more structured, consistent way. He described the latest version of the template, EARS+ (or as he nicknamed it, 'Big EARS'!), and the benefits of the approach - simplicity and structure combined.
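For readers who haven't met EARS, the published templates include patterns such as ubiquitous ("The <system> shall <response>"), event-driven ("When <trigger>, the <system> shall <response>"), state-driven ("While <state>, ..."), unwanted behaviour ("If <condition>, then ...") and optional feature ("Where <feature>, ..."). As a rough illustration of how such a rule set can be checked mechanically, here's a small Python sketch; the regular expressions are my own simplification for demonstration, not part of the published approach.

```python
# Illustrative sketch of EARS-style pattern matching. The regexes below are
# a deliberate simplification of the published templates.
import re

EARS_PATTERNS = {
    "event-driven":       re.compile(r"^When .+, the .+ shall .+\.$"),
    "state-driven":       re.compile(r"^While .+, the .+ shall .+\.$"),
    "unwanted-behaviour": re.compile(r"^If .+, then the .+ shall .+\.$"),
    "optional-feature":   re.compile(r"^Where .+, the .+ shall .+\.$"),
    "ubiquitous":         re.compile(r"^The .+ shall .+\.$"),
}

def classify(requirement: str) -> str:
    """Return the first EARS pattern a requirement text matches, if any."""
    for name, pattern in EARS_PATTERNS.items():
        if pattern.match(requirement):
            return name
    return "non-conformant"

print(classify("When the ignition is turned on, the dashboard shall illuminate."))
# event-driven
print(classify("Make the system fast."))
# non-conformant
```

Even a crude check like this shows the appeal of the approach: requirements either fit a small set of recognizable shapes or they stand out immediately for rework.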
I could go on for pages about all of the great content shared at this excellent event but I'll leave it there with the main requirements related topics, except to quote from the keynote speaker on day 2: "The core of Systems Engineering is defining requirements and delivering against them". I'd put it this way - you can't have successful systems engineering without effective requirements management.
Last week I was really fortunate to spend a couple of days in London presenting to and talking to clients, business partners and industry analysts. It's always so good to hear what's really going on out there and to get many different perspectives on what's important today and for the future. The first day was at IBM's Innovate UK 2012 event where I was asked to present on all the really exciting new stuff we've done in the past year to help organizations building today's and the next generation of smarter products and systems, with particular focus on providing solutions for systems engineers and embedded software developers. You can catch the absolute latest news on our recent launch webpages. That session included a whistle-stop tour of the developments in requirements management for complex systems with Rational DOORS 9.4 and our plans for DOORS Next Generation. Whistle-stop because we also had so much news to get through in architecture & design, planning, change & configuration management and quality management, as well as industry-specific solutions for A&D, automotive, medical devices and electronic design. And because on the following day at IBM's Southbank facility we had a whole day dedicated to topics related to DOORS.
At the DOORS customer day we had attendees from across several industry sectors including transportation, aerospace & defense, banking & mail services. The day kicked off with Morgan Brown presenting the latest on IBM's requirements management and DOORS strategy. Morgan told us how the DOORS 9.x series is and will continue to be developed and enhanced to meet the needs of the large install base, in parallel with the introduction of DOORS Next Generation (DOORS NG). DOORS NG is planned to take the best paradigms for managing structured requirements from DOORS 9.x and marry those with the requirements management and team collaboration capabilities that have been developed on the Jazz collaborative lifecycle management platform over the last 4 or so years (and are in use in the form of Rational Requirements Composer). The development of DOORS NG is out in the open on jazz.net
where milestone builds can be downloaded, discussions held, defects/enhancements raised and plans viewed. DOORS NG has gone through four beta releases and is expected to be released in late November. Morgan explained that in its first release, DOORS NG is not intended to replace the DOORS 9.x product line, but it is expected that existing DOORS customers will try out DOORS NG on pilot and new projects, and will use the interoperability capabilities of the ReqIF data exchange and cross-tool traceability linking to exchange and/or link data between DOORS 9.x and DOORS NG. DOORS NG will also appeal to those looking for a requirements management tool that is on an integrated platform with design, test management and task/change management capabilities. Morgan reminded the audience of an IBM statement released earlier this year that existing DOORS customers with active support & subscription would be entitled to use both DOORS 9 and next generation capabilities when they become available. This was well received by the attendees since it means that they can try out DOORS NG when it ships without the need for an additional purchase.
Of course a day of technology insights never goes past without some piece of tech throwing an unexpected spanner in the works. This time it was the projector and the next presenter's Apple Mac that refused to talk to each other, so instead of flowing into a demo of DOORS NG, next up was Neal Middlemore to tell us about the improved integration of requirements and quality management with DOORS 9.4 and Rational Quality Manager
(RQM) 4.0. This release was a significant enhancement that brings the integration in line with IBM's strategy to support OSLC - Open Services for Lifecycle Collaboration
. OSLC is a new approach to tool integration that is open and vendor neutral. What's really different about OSLC is that data no longer needs to be copied or synchronized between tools in order to create cross-tool or cross-discipline visibility or relationships. So now quality professionals working in RQM can see requirements in DOORS and create links between test cases (and now because some organizations require it, test steps) and the requirements they are validating; and requirements professionals in DOORS can see linked test case information including test results, without the need for either to leave the comfort of their familiar tool or for data to be copied between the two tools. Neal demonstrated the value of the integration to requirements & quality professionals and showed how RQM can be used to manage manual testing or hook up with a number of IBM and partner solutions for various forms of test automation. You can also see a demo of the DOORS - RQM integration on YouTube
So, technical issue solved, it was back to Jon Walton to give a demo of DOORS Next Generation using the Beta 4 release. Jon spent most of his time in the web client, highlighting the support for key DOORS paradigms such as hierarchical structured requirements documents, and showed off the plethora of new capabilities provided by the Jazz platform such as database-wide requirement reuse, graphical traceability view, requirements definition techniques (use case diagrams, storyboards), cross-discipline dashboards (containing requirements project info mashed up with info from design, quality and task management) and task management. Jon also showed the desktop client of DOORS NG, which is very familiar-looking to existing DOORS users with some twists (reuse of requirements across documents, for one) - the desktop client will primarily be for users who need to do extensive editing of large requirements documents. If you're currently using DOORS 9.x, this YouTube video gives a quick preview intro
of DOORS NG and how it's both similar to and different from DOORS 9.x. Watch this space for more to come on DOORS NG later this month.
Back to the earlier lifecycle integration theme started by Neal, next to present was Steve Rooks on how to use DOORS with IBM's solution for model-based systems engineering and model-driven embedded software development, Rational Rhapsody, to link requirements and design activities. Rational Rhapsody enables elaboration of requirements and construction of systems and software architectures using SysML and UML. Rhapsody Design Manager
provides an additional level of design collaboration capabilities. Models can be published to and/or stored and managed in a central repository, making them more easily accessible to a wider set of stakeholders so that designs can be better communicated and understood by all those involved in specifying, designing, building and validating a product or system. Rhapsody Design Manager uses OSLC to facilitate linking of design elements to other lifecycle artifacts - requirements, test cases, work items, etc. As with the DOORS-RQM scenario described above, a systems engineer or software architect working in Rhapsody can see requirements in DOORS and easily create links between requirements and design model elements. Requirements and requirement links can even be included in model diagrams. And of course a DOORS user can see links to design elements without leaving DOORS, or can navigate into Rhapsody Design Manager to participate in design reviews. You can read more about linking requirements and design and the DOORS-Rhapsody Design Manager integration in my recent post 'The House That Paul Built'
that talks about a recent webcast
on the topic.
After lunch, an IBM business partner Kovair
was invited to present on how their Kovair Omnibus solution provides bridges, synchronization and workflow support across multiple tools from multiple vendors. It's a common situation to find yourself trying to enact processes and workflows when you have a diverse set of tools. Kovair talked about their support for OSLC to be able to widen the number of tools they can help link together, but also highlighted scenarios where you would still want to copy or transform data between tools - it's not a choice of Link or Sync, it's Link and Sync as appropriate.
The next session was presented by Paul Fechtelkotter, market manager for energy & utilities at IBM Rational. Paul gave a really interesting presentation on the challenges of complex systems development for nuclear power plants and how the nuclear industry is now adopting systems engineering best practices starting with requirements management to enable them to get better change management, traceability, impact analysis and compliance support. You can learn more about how IBM Rational is helping the nuclear industry on our dedicated web page
Unfortunately I had to leave after Paul's session and didn't catch the remainder of the afternoon, but as you can see it was a day packed full of information. I hope you find my summary and links for more information useful. If you have any questions or comments on any of the topics I've covered here or indeed anything on IBM's requirements management strategy, Rational DOORS and the lifecycle integrations, please don't be afraid to ask by using the blog comments facility.
You've bought the plot of land for your dream home. You have your list of requirements - 4 bedrooms, 3 bathrooms, spacious kitchen, 2 living rooms, 2 garages, landscaped gardens, etc. Would you be happy to simply hand that list to the builders and let them start work? Unlikely, I think. Typically, you call in an architect, who can take your quantitative requirements and qualitative desires and produce a blueprint, the architectural design that incorporates your wishes where feasible and adds creative flourish based on the architect's knowledge of house design. The blueprint enables you and the builders to have a much clearer picture of the desired end result than that original list of requirements. And it affords you the opportunity to influence the architecture, and for the builders to question and look at feasibility & cost options, before the foundations are dug and the first bricks are laid.
The same applies in product development. Systems engineers who are responsible for the holistic product specification and design don't just use textual requirements lists to capture the problem domain and describe the proposed solution. They analyze the requirements, identifying integrated scenarios, and often depict those using modeling languages such as UML or SysML. These modeled scenarios are easier to discuss and review with all stakeholders, and as the systems engineer evolves the proposed architecture (also in the same modeling language) they can run the scenarios against the architecture in model simulations to find inconsistencies or gaps in the requirements and flaws in the design, long before any software is coded, circuit boards are soldered or metal is welded.
So what value are our textual requirements lists - should we throw them away in favor of models? Well, not everything can be expressed in the model and not everyone involved in a development effort may be using models. Going back to the house-building analogy, there are contracts, numerous standards and regulations to be adhered to, and details that would simply make the blueprints unreadable. The various contractors (and I know from recent experience that sub-contracting is the name of the game in house building these days!) involved in the building process need to ensure that they can meet the contractual and regulatory demands while delivering against the architecture. Again this is the same in product development, except in many cases, particularly safety-critical systems, traceability and demonstration of conformance to requirements and compliance with standards & regulations are demanded. This requires the ability to integrate requirements and modeling workflows, easily link requirements and design elements, and to report on that linked information.
The need and solutions for this capability are nothing new. Integrations between requirements management and modeling tools have existed for many years (I think I started using such an integration in the early 90's and I'm sure they preceded that time). But I know from first-hand experience of using and indeed writing such integrations that they've not always been optimal in the way integration is performed and in the workflow that is supported. Typically it's meant synchronizing (i.e. copying) data between tools in order to create the traceability links in one of the tools. This brings up all sorts of issues like 'which tool is the master?', 'am I looking at the latest data?', 'what happens when information is deleted?', etc.
With Open Services for Lifecycle Collaboration (OSLC) we now have a much better way to link data across product development and operations tools, even when the tools may be from different vendors, open source or in-house. OSLC has learnt from the principles of the World Wide Web and enables tool data to be shared and linked where it resides (called a ‘Linked Data’ approach). OSLC provides a common vocabulary for ‘resources’ in particular domains, i.e. what a requirement, test case, design element, change request, work item, etc. looks like, so that regardless of tool, technology or vendor, tools implementing OSLC specifications can share and link data.
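To illustrate the 'link, don't sync' idea, here's a conceptual sketch in Python. The tool names, URIs and property names are invented stand-ins (real OSLC uses RDF resources over HTTP); the point is simply that a link is a URI into another tool's data, dereferenced on demand rather than copied or synchronized.

```python
# Conceptual sketch of linked lifecycle data: each tool serves its own
# resources at stable URIs, and a cross-tool link is just a URI. All names
# and URIs here are invented for illustration.
requirements_tool = {
    "https://rm.example.com/req/42": {
        "title": "The pump shall stop within 2 seconds of a fault.",
        "validatedBy": ["https://qm.example.com/test/7"],  # link into another tool
    },
}
quality_tool = {
    "https://qm.example.com/test/7": {
        "title": "Fault-injection shutdown test",
        "result": "passed",
    },
}

def dereference(uri, *tools):
    """Resolve a URI against whichever tool owns it (stands in for an HTTP GET)."""
    for tool in tools:
        if uri in tool:
            return tool[uri]
    raise KeyError(uri)

# A requirements user follows the link to see test status; no data is copied.
req = dereference("https://rm.example.com/req/42", requirements_tool, quality_tool)
for test_uri in req["validatedBy"]:
    print(dereference(test_uri, requirements_tool, quality_tool)["result"])
```

Because the test result lives only in the quality tool and is fetched when needed, the 'which tool is the master?' and 'am I looking at the latest data?' questions from the synchronization approach simply don't arise.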
With Rational DOORS 9.4 and Rational Rhapsody 8.0 with Design Manager 4.0, IBM is utilizing OSLC to provide a simplified workflow for linking requirements analysis and design. On September 20, Paul Urban (if you've been wondering about this blog post title, now you know the Paul I'm speaking of), Market Manager for IBM Rational Rhapsody, presented this simplified workflow and its benefits on an IEEE Spectrum webcast sponsored by IBM. You can watch and listen to the replay at your leisure here. I hope you enjoy it - please let Paul and me know what you think by leaving feedback on this blog post.