Last week I was really fortunate to spend a couple of days in London presenting to and talking with clients, business partners and industry analysts. It's always good to hear what's really going on out there and to get many different perspectives on what's important today and for the future. The first day was at IBM's Innovate UK 2012 event, where I was asked to present all the exciting new work we've done in the past year to help organizations build today's and the next generation of smarter products and systems, with a particular focus on solutions for systems engineers and embedded software developers. You can catch the absolute latest news on our recent launch webpages. That session included a whistle-stop tour of the developments in requirements management for complex systems with Rational DOORS 9.4 and our plans for DOORS Next Generation. Whistle-stop because we also had so much news to get through in architecture & design, planning, change & configuration management and quality management, as well as industry-specific solutions for A&D, automotive, medical devices and electronic design. And because on the following day, at IBM's Southbank facility, we had a whole day dedicated to topics related to DOORS.
At the DOORS customer day we had attendees from across several industry sectors including transportation, aerospace & defense, banking & mail services. The day kicked off with Morgan Brown presenting the latest on IBM's requirements management and DOORS strategy. Morgan told us how the DOORS 9.x series is and will continue to be developed and enhanced to meet the needs of the large install base, in parallel with the introduction of DOORS Next Generation (DOORS NG). DOORS NG is planned to take the best paradigms for managing structured requirements from DOORS 9.x and marry those with the requirements management and team collaboration capabilities that have been developed on the Jazz collaborative lifecycle management platform over the last 4 or so years (and are in use in the form of Rational Requirements Composer). The development of DOORS NG is out in the open on jazz.net
where milestone builds can be downloaded, discussions held, defects/enhancements raised and plans viewed. DOORS NG has gone through four beta releases and is expected to be released in late November. Morgan explained that in its first release, DOORS NG is not intended to replace the DOORS 9.x product line, but it is expected that existing DOORS customers will try out DOORS NG on pilot and new projects, and will use the interoperability capabilities of the ReqIF data exchange and cross-tool traceability linking to exchange and/or link data between DOORS 9.x and DOORS NG. DOORS NG will also appeal to those looking for a requirements management tool that is on an integrated platform with design, test management and task/change management capabilities. Morgan reminded the audience of an IBM statement released earlier this year that existing DOORS customers with active support & subscription would be entitled to use both DOORS 9 and next generation capabilities when they become available. This was well received by the attendees since it means that they can try out DOORS NG when it ships without the need for an additional purchase.
Of course a day of technology insights never passes without some piece of tech throwing an unexpected spanner in the works. This time it was the projector and the next presenter's Apple Mac that refused to talk to each other, so instead of flowing into a demo of DOORS NG, next up was Neal Middlemore to tell us about the improved integration of requirements and quality management with DOORS 9.4 and Rational Quality Manager
(RQM) 4.0. This release is a significant enhancement that brings the integration in line with IBM's strategy to support OSLC - Open Services for Lifecycle Collaboration. OSLC is a new approach to tool integration that is open and vendor neutral. What's really different about OSLC is that data no longer needs to be copied or synchronized between tools in order to create cross-tool or cross-discipline visibility or relationships. So now quality professionals working in RQM can see requirements in DOORS and create links between test cases (and, where organizations require it, individual test steps) and the requirements they validate; and requirements professionals in DOORS can see linked test case information, including test results. Neither has to leave the comfort of their familiar tool, and no data is copied between the two. Neal demonstrated the value of the integration to requirements & quality professionals and showed how RQM can be used to manage manual testing or hook up with a number of IBM and partner solutions for various forms of test automation. You can also see a demo of the DOORS - RQM integration on YouTube.
So, technical issue solved, it was back to Jon Walton to give a demo of DOORS Next Generation using the Beta 4 release. Jon spent most of his time in the web client, highlighting the support for key DOORS paradigms such as hierarchical structured requirements documents, and showed off the plethora of new capabilities provided by the Jazz platform, such as database-wide requirement reuse, a graphical traceability view, requirements definition techniques (use case diagrams, storyboards), cross-discipline dashboards (containing requirements project info mashed up with info from design, quality and task management) and task management. Jon also showed the desktop client of DOORS NG, which will look very familiar to existing DOORS users, with some twists (reuse of requirements across documents, for one); the desktop client will primarily be for users who need to do extensive editing of large requirements documents. If you're currently using DOORS 9.x, this YouTube video gives a quick preview of DOORS NG and how it's both similar to and different from DOORS 9.x. Watch this space for more to come on DOORS NG later this month.
Back to the earlier lifecycle integration theme started by Neal, next to present was Steve Rooks on how to use DOORS with IBM's solution for model-based systems engineering and model-driven embedded software development, Rational Rhapsody, to link requirements and design activities. Rational Rhapsody enables elaboration of requirements and construction of systems and software architectures using SysML and UML. Rhapsody Design Manager
provides an additional level of design collaboration capabilities. Models can be published to and/or stored and managed in a central repository, making them more easily accessible to a wider set of stakeholders so that designs can be better communicated and understood by all those involved in specifying, designing, building and validating a product or system. Rhapsody Design Manager uses OSLC to facilitate linking of design elements to other lifecycle artifacts: requirements, test cases, work items, etc. As with the DOORS-RQM scenario described above, a systems engineer or software architect working in Rhapsody can see requirements in DOORS and easily create links between requirements and design model elements. Requirements and requirement links can even be included in model diagrams. And of course a DOORS user can see links to design elements without leaving DOORS, or can navigate into Rhapsody Design Manager to participate in design reviews. You can read more about linking requirements and design and the DOORS-Rhapsody Design Manager integration in my recent post 'The House That Paul Built'
that talks about a recent webcast
on the topic.
After lunch, an IBM business partner Kovair
was invited to present on how their Kovair Omnibus solution provides bridges, synchronization and workflow support across multiple tools from multiple vendors. It's a common situation to find yourself trying to enact processes and workflows when you have a diverse set of tools. Kovair talked about their support for OSLC to be able to widen the number of tools they can help link together, but also highlighted scenarios where you would still want to copy or transform data between tools - it's not a choice of Link or Sync, it's Link and Sync as appropriate.
The next session was presented by Paul Fechtelkotter, market manager for energy & utilities at IBM Rational. Paul gave a really interesting presentation on the challenges of complex systems development for nuclear power plants and how the nuclear industry is now adopting systems engineering best practices starting with requirements management to enable them to get better change management, traceability, impact analysis and compliance support. You can learn more about how IBM Rational is helping the nuclear industry on our dedicated web page
Unfortunately I had to leave after Paul's session and didn't catch the remainder of the afternoon, but as you can see it was a day packed full of information. I hope you find my summary and links for more information useful. If you have any questions or comments on any of the topics I've covered here or indeed anything on IBM's requirements management strategy, Rational DOORS and the lifecycle integrations, please don't be afraid to ask by using the blog comments facility.
Eric has worked in the software development industry for over 20 years and is co-author of UML for Database Design and UML for Mere Mortals, both published by Addison Wesley. Eric is currently responsible for capabilities marketing of Rational’s application lifecycle management solutions including Agile Software Delivery, Quality & Test Management, Requirements Management and Collaborative Lifecycle Management. He rejoined IBM in 2008 as the team leader for InfoSphere Optim Solutions and later was responsible for Information Governance Solutions. Prior to rejoining IBM, he worked for Ivar Jacobson Consulting as VP of Sales and Marketing. Before joining Ivar Jacobson, he was director of product marketing for CAST Software. In his earlier time at IBM, Eric held several roles within the Rational Software group, including program director for business, industry and technical solutions, product manager for Rational Rose and team market manager for Rational Desktop Product. He also spent several years with Logic Works Inc. (acquired by Platinum Technologies and then CA) as product manager for ERwin.
As I think about IT today, I see in some ways a rebirth of the importance of architecture and requirements. We are in an era of “ANY” -- meaning that applications and data can be accessed from anywhere, by anyone, and at any time.
Looking back at the applications of yesteryear (two or three years ago), we didn’t expect much from web or mobile-based applications. We could view, run some reports or do some basic tasks, but to do the real work we needed to go to the fat client. Now, in today’s era of any, the user interface may look different, but the capabilities had better be the same, since we expect near-full capabilities no matter our device or interface.
This puts a newfound set of requirements on applications and their development, and is making modeling and requirements (analysis and design) relevant again, but with a new twist – AGILITY. It is no longer a question of “what platform am I developing for” – the question is how quickly we can get it up and running on the latest version of Apple, Android, HTML5 and whatever other platforms our clients expect the application to run on … and it had better run on all of the latest versions, with no delays, when updated operating systems come out.
And the question that I often receive now is “can I be agile and meet these needs at the same time?” The plain answer is yes, you can. However, agility doesn’t mean you can ignore requirements and design. I am not talking about write-once, run-anywhere; rather, about understanding the true requirements so that the various development teams can articulate them in code, brought to life as features for the users as they expect to see them. Users are looking for the application to be specific to their hardware/OS (iPad/iOS, Droid/Android…), as the hardware has become the platform for not just running the application, but for its expected look, feel and usability, too. This often means different developers for different deployment platforms, certainly at the user interface level.
Designing applications requires that we are prepared. Architectures must be solidified and communicated. Requirements must be consistent and shared. We must model architectures so that developers can build to the designs and not recreate their own, wasting time and resources, and we must share those designs across the team.
Does this get in the way of agility? No, it will speed agility. By sharing designs and assigning tasks based on architecture needs, we can speed time to market and our ability to deliver high quality software. In the era of any, we may have multiple teams working on the same front-end capabilities for different platforms even though the back-end is the same. The more they can share, the faster they can deploy; and with the right requirements from users, the more satisfied those users will be. We see people changing their desired platform as employers, vendors and suppliers change requirements, so we need to be prepared for the customer who is using an iPad today to be using an Android device tomorrow, with the same requirements on the application. Just look at how the world of Blackberry has evolved.
So, as you think about your next project, don’t skimp on requirements and architectures or you may be limiting your agility in the future rather than speeding your time to satisfied clients.
Nowadays software is present everywhere, and software projects are becoming increasingly complex in terms of scope, time and cost. With this change comes an increased potential failure rate for software projects. How can these potential failures be avoided? While there is no guarantee, adequate investment in managing the risk of failure goes a long way. A typical textbook definition of software risk management is the identification of risks, the analysis of identified risks and the establishment of plans to address those risks. The primary goal of risk management is to avoid the occurrence of such risks. As with requirements management, risk management needs to start early in the development life cycle.
ISO/IEC 16085:2006 defines risk as a combination of the probability of an event and its consequence. What are the major sources of risk in a software project? An obvious answer to that question today would be the prevailing uncertainty added by time and budget pressures. Inaccurate requirements capture is another important reason for increased risk in the later stages of the life cycle. Boehm
has done some phenomenal work on managing risks in software projects. He essentially identifies ten risk items: personnel shortfalls; unrealistic schedules and budgets; development risks (building the wrong functions, properties or user interfaces); adding unnecessary features; continuing requirements changes; shortfalls in externally furnished components and in externally performed tasks; performance shortfalls; and technological strains.
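Boehm's framework also gives a simple way to quantify and rank items like these: risk exposure, the probability of the risk occurring multiplied by the loss if it does. Here is a minimal sketch of that calculation; the risks, probabilities and losses are made-up numbers for illustration, not from any real project or from Boehm's data.

```python
# Boehm-style risk exposure: RE = P(risk occurs) * L(loss if it occurs).
# All figures below are illustrative assumptions.

def risk_exposure(probability, loss):
    """Return the risk exposure for a single risk item."""
    return probability * loss

risks = [
    ("Personnel shortfalls",             0.30, 80),   # loss in person-days
    ("Unrealistic schedule and budget",  0.50, 100),
    ("Continuing requirements changes",  0.40, 60),
    ("Gold-plating (unneeded features)", 0.20, 30),
]

# Prioritize: highest exposure first, as in a simple risk register.
ranked = sorted(risks, key=lambda r: risk_exposure(r[1], r[2]), reverse=True)

for name, p, loss in ranked:
    print(f"{name}: RE = {risk_exposure(p, loss):.1f}")
```

Ranking by exposure rather than by probability alone is the point: a moderately likely risk with a large loss can outrank a near-certain risk with a small one.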
So how do you best manage the risks?
– Boehm divides the first level of activities into assessment and control. Assessment essentially involves identifying risks, analyzing the identified risks and finally prioritizing them. The control aspect deals with planning, the resolution of identified risks and monitoring. If you consider the Top 10 items he has identified, requirements mismatches, requirements changes and architecture performance & quality are among the top. Various techniques are discussed in the risk management literature that are beyond the scope of this blog post; they range from basic ones like maintaining a risk register, to decision tree analysis, to risk exposure profiling. Murray Cantor, a Distinguished Engineer at IBM, regularly writes about risk in his blog here.
What are some of the generic strategies for managing risks? – The predominant method is to buy more information; for example, early in the development cycle you could try prototyping to make sure you and your client share the same understanding. This also helps reveal the possible root causes of risks. Other options are to avoid the risk by de-scoping requirements, to transfer it (for example, outsourcing the component to an expert vendor or a sub-contractor), to have mitigation plans, or, as the last option, to accept the risk and have a Plan B. ISO 31000:2009, a risk management standard introduced in 2009, provides a generic framework for a risk management process that a team can consider implementing.
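The "buy more information" strategy can be framed as a small decision-tree calculation: compare the expected cost of accepting the risk as-is against the cost of the prototype plus the reduced expected rework. A sketch with purely illustrative numbers (the probabilities and costs are assumptions, not from any standard or study):

```python
# Decision-tree comparison of two risk strategies: accept the risk,
# or "buy information" by building a prototype first.

def expected_cost(branch):
    """Expected cost of a branch: fixed cost + sum(p * loss) over outcomes."""
    fixed, outcomes = branch
    return fixed + sum(p * loss for p, loss in outcomes)

# Branch 1: accept. 40% chance the requirements are wrong -> 100k rework.
accept = (0, [(0.4, 100_000)])

# Branch 2: prototype first (10k), cutting the chance of rework to 10%.
prototype = (10_000, [(0.1, 100_000)])

print("accept:", expected_cost(accept))
print("prototype:", expected_cost(prototype))

best = min([("accept", accept), ("prototype", prototype)],
           key=lambda kv: expected_cost(kv[1]))[0]
print("lower expected cost:", best)
```

With these numbers the prototype halves the expected cost (20k versus 40k), which is why buying information early so often wins: the price of the information is small next to the rework it avoids.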
How can tools help manage the risks?
Risk includes both opportunities and threats - that is, a risk can have either a positive or a negative effect. Tools help in implementing an integrated risk process that maximizes value creation, resulting in faster time to market and improved productivity, while avoiding the threats of cost and time overruns and project closures. Tools can help significantly in two ways: conducting the qualitative and quantitative risk analysis activities, and actually implementing the outcomes for managing risks. Check this case study of Chubb Insurance, which effectively manages its risk using IBM Rational Focal Point. And finally, here is a developerWorks article on how to calculate your return on investment for software and systems.
You've bought the plot of land for your dream home. You have your list of requirements - 4 bedrooms, 3 bathrooms, spacious kitchen, 2 living rooms, 2 garages, landscaped gardens, etc. Would you be happy to simply hand that list to the builders and let them start work? Unlikely, I think. Typically, you call in an architect, who can take your quantitative requirements and qualitative desires and produce a blueprint, the architectural design that incorporates your wishes where feasible and adds creative flourish based on the architect's knowledge of house design. The blueprint enables you and the builders to have a much clearer picture of the desired end result than that original list of requirements. And it affords you the opportunity to influence the architecture, and for the builders to question and look at feasibility & cost options, before the foundations are dug and the first bricks are laid.
The same applies in product development. Systems engineers who are responsible for the holistic product specification and design don't just use textual requirements lists to capture the problem domain and describe the proposed solution. They analyze the requirements, identifying integrated scenarios, and often depict those using modeling languages such as UML or SysML. These modeled scenarios are easier to discuss and review with all stakeholders, and as the systems engineer evolves the proposed architecture (also in the same modeling language) they can run the scenarios against the architecture in model simulations to find inconsistencies or gaps in the requirements and flaws in the design, long before any software is coded, circuit boards are soldered or metal is welded.
So what value are our textual requirements lists - should we throw them away in favor of models? Well, not everything can be expressed in the model, and not everyone involved in a development effort may be using models. Going back to the house building analogy, there are contracts, numerous standards and regulations to be adhered to, and simply details that would make the blueprints unreadable. The various contractors (and I know from recent experience that sub-contracting is the name of the game in house building these days!) involved in the building process need to ensure that they can meet the contractual and regulatory demands while delivering against the architecture. Again this is the same in product development, except that in many cases, particularly for safety-critical systems, traceability and demonstration of conformance to requirements and compliance with standards & regulations are demanded. This requires the ability to integrate requirements and modeling workflows, easily link requirements and design elements, and report on that linked information.
The need and solutions for this capability are nothing new. Integrations between requirements management and modeling tools have existed for many years (I think I started using such an integration in the early '90s, and I'm sure they preceded that time). But I know from first-hand experience of using and indeed writing such integrations that they've not always been optimal in the way integration is performed and in the workflow that is supported. Typically it's meant synchronizing (i.e. copying) data between tools in order to create the traceability links in one of the tools. This brings up all sorts of issues like 'which tool is the master?', 'am I looking at the latest data?', 'what happens when information is deleted?', etc.
With Open Services for Lifecycle Collaboration (OSLC) we now have a much better way to link data across product development and operations tools, even when the tools may be from different vendors, open source or in-house. OSLC has learnt from the principles of the World Wide Web and enables tool data to be shared and linked where it resides (called a ‘Linked Data’ approach). OSLC provides a common vocabulary for ‘resources’ in particular domains, i.e. what a requirement, test case, design element, change request, work item, etc. looks like, so that regardless of tool, technology or vendor, tools implementing OSLC specifications can share and link data.
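To make the Linked Data idea concrete, here is a toy Python sketch: each artifact lives at its own URI in its own tool's store, and a link is just a URI reference resolved on demand, not a copy. The URIs, hostnames and property names below are simplified stand-ins for the real OSLC resource shapes, not actual OSLC payloads.

```python
# Requirement served by the RM tool (e.g. DOORS) at its own URI.
rm_store = {
    "https://rm.example.com/requirements/REQ-42": {
        "dcterms:title": "The pump shall stop within 2 seconds",
        "oslc:shortTitle": "REQ-42",
    }
}

# Test case served by the QM tool (e.g. RQM); it holds only a link (a URI),
# never a copy of the requirement's data.
qm_store = {
    "https://qm.example.com/testcases/TC-7": {
        "dcterms:title": "Verify pump stop time",
        "oslc_qm:validatesRequirement": "https://rm.example.com/requirements/REQ-42",
    }
}

def resolve(uri):
    """Stand-in for an HTTP GET of an OSLC resource from its home tool."""
    return {**rm_store, **qm_store}[uri]

# A QM user follows the link and sees the live requirement text.
tc = resolve("https://qm.example.com/testcases/TC-7")
req = resolve(tc["oslc_qm:validatesRequirement"])
print(req["dcterms:title"])
```

Because the requirement is fetched from its home URI at view time, there is no 'which tool is the master?' question: the RM tool always is, and the QM side is always looking at the latest data.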
With Rational DOORS 9.4 and Rational Rhapsody 8.0 with Design Manager 4.0, IBM is utilizing OSLC to provide a simplified workflow for linking requirements analysis and design. On September 20, Paul Urban (if you've been wondering about this blog post's title, now you know the Paul I'm speaking of), Market Manager for IBM Rational Rhapsody, presented this simplified workflow and its benefits on an IEEE Spectrum webcast sponsored by IBM. You can watch and listen to the replay at your leisure here
. I hope you enjoy it - please let Paul and me know what you think by leaving feedback on this blog post.
The importance of communication and collaboration in developing and managing good requirements was discussed in our earlier post on How to enable effective requirements communication and collaboration. In this guest blog post, Melissa Robinson - a Senior Technical Specialist at IBM - writes about how Rational DOORS addresses this aspect with Discussions. Melissa started her career at Telelogic, enabling product management with technical support around requirements management. Melissa spent 3 years at Telelogic supporting clients getting started with requirements management. After IBM acquired Telelogic in 2008, Melissa transitioned roles to support clients with Enterprise Architecture initiatives. She received the Carnegie Mellon certification in Enterprise Architecture in 2008 and is TOGAF certified. Melissa now supports clients getting started with evaluating and implementing both requirements management and enterprise architecture solutions.
Note: Please click on the screenshots for a better view
Why did we make this decision? Who made this decision? Who approved this requirement?
These are some of the questions we can help answer with effective collaboration messaging in DOORS. Collaboration messaging is now enabled in DOORS and DOORS Web Access (DWA) with the addition of DOORS Discussions. Discussions allow users to contribute and add comments to requirement objects or requirement modules; users can even add comments to baselined requirements. Discussions offer a way of having a conversation about requirements, breaking the communication barrier by allowing users to easily comment on or start a discussion about any requirement, including read-only requirements. Discussions can be created in DOORS or DWA and viewed in both. Both the DWA Editor and DWA Reviewer roles can contribute to Discussions. Discussions capture comments so that you can later review ancillary information about your requirements, allowing everyone to contribute and providing a full understanding of requirements.
Here is a simple scenario for using DOORS Discussions. A DWA Reviewer user creates a Discussion on a requirement. A DOORS user then reviews this comment and contributes a comment on the requirement. The DWA user reviews the latest comment and closes the Discussion.
A DOORS user, Susan, reviews the current Discussion created by a DWA Reviewer user, Kavita. Susan can open the requirements module with a pre-created Discussion view to review the Discussions. Below, Susan reviews the Discussion on requirement AMR-STK-66.
Susan can contribute a comment to the open Discussion.
Kavita reviews the new Discussion comment in DWA. Notice that Kavita is a Reviewer in DWA. As a Reviewer, she can create and add comments to Discussions. Kavita can also close Discussions that she started. Later, Kavita can contribute another comment to the open Discussion.
As the person who first opened the Discussion, Kavita can close it. Later, in DOORS, Susan can review the latest status of the Discussion using the Discussion Thread view. In the Database Manager role, Susan can choose to re-open the closed Discussion at any time.
Discussions open up the communication thread between several different types of DOORS users. Discussions allow requirements reviewers to exchange views and comments about the content of a requirements module or the content of a requirement object in a module.
We hope this post gave you a sneak preview of how DOORS Discussions help various stakeholders collaborate and communicate effectively during requirements management. Feel free to contact melissarobinson[at]us.ibm.com if you have any queries about the topic. Melissa will discuss the topic in detail in an upcoming webcast on October 5, 2012. Don't miss the opportunity to watch the action live.
Register now @ http://bit.ly/DOORS_Discussions
In this post Jim Hays
writes about the various options available for testing how requirements are met in IBM Rational DOORS.
Jim works as a Senior Systems Engineer at IBM. Jim started his career in 1982 working for software providers. Jim’s career history is an interesting one: he hasn’t worked for many software companies, but the companies kept changing around him. He worked for Applied Data Research (7 years), which was eventually bought by Computer Associates. He then moved to Goal Systems, which was bought by Legent (6 years); Legent was then bought by Computer Associates. He then spent almost 10 years at Sterling Commerce, which was just bought by IBM. After Sterling Commerce he joined Telelogic (7 years), where he got into the ALM market; Telelogic was eventually purchased by IBM Rational.
I’ve been involved with DOORS for over 7 years, and absolutely love the tool. My job at IBM is to technically support our sales team and our customers, not only for DOORS but for many other solutions we offer. I have had a lot of experience working with our DOORS customers and understanding how they use DOORS, and even though DOORS is a requirements management solution, other types of information are being put into DOORS besides requirements. An example is that customers will put test data into a DOORS module and enable easy linking between the requirements and their related validation/verification results.
Note: Please click on the screenshots for a better view
Provided below is an example showing that. In this DOORS View we see traceability between 4 modules:
User Requirements>Functional Requirements>Functional Test Plan>Functional Test Cases
For years DOORS has included a capability called the Test Tracking Toolkit. It lets you capture test results in a DOORS module by duplicating attributes each time a new test run is created, so that the results of each run are stored uniquely. Over time this creates a lot of attributes for capturing and storing per-run test results. Both of the usages described here make it quite easy to link DOORS requirements to their related tests and test results. Below are the options available utilizing the Test Tracking Toolkit.
(Read clockwise from top left)
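To make the attribute-duplication pattern concrete, here is a toy Python model of a module where each new test run adds a fresh set of per-run result attributes to every object. The attribute names are illustrative, not the ones the Test Tracking Toolkit actually generates.

```python
# A toy DOORS-like module: object IDs mapped to attribute dictionaries.
module = {
    "TC-1": {"Object Text": "Verify login succeeds"},
    "TC-2": {"Object Text": "Verify logout clears session"},
}

def record_run(module, run_name, results):
    """Add a '<run> Result' attribute to each object for a new test run."""
    attr = f"{run_name} Result"
    for obj_id, verdict in results.items():
        module[obj_id][attr] = verdict

record_run(module, "Run 1", {"TC-1": "Pass", "TC-2": "Fail"})
record_run(module, "Run 2", {"TC-1": "Pass", "TC-2": "Pass"})

# The attribute count grows with every run -- the drawback noted above.
print(sorted(module["TC-2"]))
```

Each run's verdicts stay distinct, which is the toolkit's goal, but the module accumulates one new attribute per run, which is the scaling problem discussed next.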
So what are the positives and negatives of these two usages of DOORS modules to capture test, validation, and/or verification information? The positive is the ability to store and easily link requirements to testing results using standard DOORS linking. If requirements that are linked to test-based modules change, then standard DOORS “suspect links” notify the folks maintaining the test-based DOORS modules of the requirement change, so they can see whether they need to update their test plan or retest the test case. The other question is: who is maintaining/updating the DOORS test-based modules? Are the actual testers going into DOORS to update the test results? Are the testers using a spreadsheet to capture test results and giving that to a DOORS user to update in DOORS? In my opinion, either of these scenarios is fine for projects that only do manual testing and don’t have a lot of testing (i.e. test runs) to perform. The other issue is that the actual testing can occur on different environments: for example, if one built a web-based application that will run on different operating systems, then one would need to test all of those configurations. If one is building an embedded software device or a software application, then one might want not just to do manual testing, but instead to automate the execution and capture of results via automated testing tools.
In my opinion, a solution like DOORS is great for requirements management; whereas the folks who are in charge of quality and/or performing the testing should have a solution suited to the role they play in a project: test management. So the final option for testing I will discuss is how DOORS (managing requirements) can integrate with IBM’s test management solution, Rational Quality Manager (RQM). RQM provides a nice environment to support both manual and automated testing.
Provided below are an example “dashboard”, which users can configure based upon what they would like to see, and an example of an RQM Test Plan.
Provided below is an example of a DOORS view showing the requirements from DOORS that are known to this particular RQM project. Requirements-driven testing enables requirements from DOORS to be used to automatically generate test cases and to build specific links between the DOORS requirements and specific test cases. The screenshot provided on the right shows the results of that automated link creation, showing traceability between test cases and DOORS requirements; it could also show development software assets.
The integration between DOORS and RQM utilizes OSLC (Open Services for Lifecycle Collaboration). Below, a “rich hover” shows details about linked items without actually having to navigate the link. One can also see the results of the test execution (pass or fail).
As the testers do their work, data from RQM can be mapped into DOORS-based attributes on the requirements. Below is an example showing the traceability between DOORS requirements and the testing side of things in DOORS; I can see that the latest test case run passed. Provided below are screenshots showing coverage analysis relating the DOORS requirements to the test plan and associated test cases.
Finally, the screenshot provided below shows the test case execution results that were performed via RQM. These are mapped to DOORS attributes via the bi-directional integration, and regular DOORS sorting and filtering can be used - for example, to see which test cases failed or passed.
I hope the blog post was useful. Feel free to contact Jim at haysji[at]us.ibm.com if you have any queries regarding the options for testing with DOORS.
Also, Jim will be hosting a webinar on the same topic, in which he will explore in depth the ideas presented in this post about the options for testing in conjunction with DOORS. Register for the session here - http://bit.ly/DOORS-Testing_Options
Date: September 7, 2012 (Friday)
Time: 1PM EDT
There is no doubt that the evolution of computing has moved into the era of the mobile device, and any business that wants to remain competitive in an increasingly difficult economic environment needs to embrace this with the care and attention it deserves.
With over a million devices being activated every single day, it is predicted that by 2013 mobile phones will overtake the PC as the most common web access device worldwide!
Companies that simply try to tweak their existing web sites to run in mobile browsers are missing a trick, as users expect sleek interfaces that make use of their device's capabilities, such as geo-location and the camera. With this in mind, the perceived quality of the application comes both from its functionality and, perhaps more importantly, from its design. The design of the application's user interface is also critical to improving brand loyalty, so businesses are keen to see their brand image extended to mobile devices, where they can reach a much wider audience.
So where does Requirements Management fit in?
Well, a good requirements management process that is incorporated into the wider development lifecycle will allow the business to communicate exactly what is required of the mobile application, enabling the development team to be clear on what needs to be created and testers to begin writing test cases earlier in the lifecycle. Because good user interface design matters so much here, it is important that requirements are not purely textual, and with tools like Rational Requirements Composer (RRC), Business Analysts can model use cases and business processes and also visualize the user interface through UI sketches, screenflows and storyboards. This means the business can be very clear on what the expected result should be, removing the unnecessary ambiguity that only slows down development and ultimately prevents applications going to market quickly.
One significant trend in mobile app development is the adoption of Agile methodologies such as Scrum, which fit well with the short timescales and rapid change and release cycles of mobile apps. This makes it even more important for the requirements management process itself to be agile and to allow greater collaboration, not only within the Business Analyst team but also with the other teams involved, such as development and testing. A web-based requirements management tool like RRC encourages wider engagement with stakeholders and also provides dashboards with live information, collaborative reviews and reporting to promote visibility and allow decisions to be made quickly and effectively.
Traceability is one of the cornerstones of good requirements management, because without it you cannot determine whether the resulting product has actually satisfied the original requirements outlined by the business. RRC is part of the wider Collaborative Lifecycle Management solution, which allows requirements to be linked to the resulting development work items and to associated test plans, test cases and defects. This means that from any given requirement you can see exactly which development task will implement it, which test case will test it, and what defects were found in relation to it.
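The traceability chain just described (requirement to work item to test case to defects) can be sketched as a simple data structure; the IDs below are hypothetical and the structure is an illustration of the concept, not how CLM stores links internally. One useful query over such a chain is finding requirements with no validating test case, i.e. a coverage gap.

```python
# Hypothetical sketch of the traceability chain maintained by CLM:
# requirement -> implementing work item -> validating test case -> defects.
links = {
    "REQ-1": {"work_item": "WI-10", "test_case": "TC-7", "defects": ["D-3"]},
    "REQ-2": {"work_item": "WI-11", "test_case": None,   "defects": []},
}

def uncovered(trace):
    """Requirements with no validating test case (a coverage gap)."""
    return [req for req, t in trace.items() if t["test_case"] is None]

print(uncovered(links))  # → ['REQ-2']
```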
In summary, mobile application development is an exciting and important part of any business plan, and due to its inherent complexity you really need the right tools to make it a success.