In this post, Jim Hays writes about the options available for testing how requirements are met in IBM Rational DOORS.
Jim works as a Senior Systems Engineer at IBM. Jim started his career in 1982 working for software providers. His career history is an interesting one: he has worked for relatively few software companies, largely because they kept being acquired. He worked for Applied Data Research for 7 years, which was eventually bought by Computer Associates. He then moved to Goal Systems, which was bought by Legent (6 years); Legent was in turn bought by Computer Associates. He then spent almost 10 years at Sterling Commerce, which was recently bought by IBM. After Sterling Commerce he joined Telelogic, where he got into the ALM market; Telelogic was eventually purchased by IBM Rational (7 years).
I’ve been involved with DOORS for over 7 years, and absolutely love the tool. My job at IBM is to technically support our sales team and our customers, not only for DOORS but for many other solutions we offer. I have had a lot of experience working with our DOORS customers and understanding how they use DOORS. Even though DOORS is a requirements management solution, customers put other types of information into DOORS besides requirements. An example of this is that customers will put test data into a DOORS module, enabling easy linking between the requirements and their related validation/verification results.
Note: Please click on the screenshots for a better view
Provided below is an example showing that. In this DOORS View we see traceability between 4 modules:
User Requirements>Functional Requirements>Functional Test Plan>Functional Test Cases
For years, DOORS has included a capability called the Test Tracking Toolkit. It enables one to capture test results in a DOORS module by duplicating a set of attributes each time a new test run is created, so that each run’s results are stored uniquely. Over time this creates a lot of attributes to capture and store per-run results. Both of these usages for capturing tests and test results enable quite easy linking between the DOORS requirements and their related tests and test results. Below are the options available when utilizing the Test Tracking Toolkit.
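To illustrate why the per-run attributes multiply, here is a rough Python sketch of the pattern described above. The attribute names, the fields per run, and the module structure are my own invention for illustration, not the toolkit’s actual implementation.

```python
# Illustrative sketch (not the toolkit's actual code): each new test run
# duplicates a set of result attributes, so the attribute count grows
# linearly with the number of runs.

RESULT_FIELDS = ["Result", "Tester", "Date"]  # hypothetical per-run fields

def attributes_for_run(run_number):
    """Names of the new attributes created for one test run."""
    return [f"Run {run_number} {field}" for field in RESULT_FIELDS]

def record_run(module, run_number, results):
    """Store one run's outcomes in run-specific attributes on each object."""
    for attr in attributes_for_run(run_number):
        module.setdefault("attributes", []).append(attr)
    for obj_id, outcome in results.items():
        module.setdefault("data", {}).setdefault(obj_id, {})[
            f"Run {run_number} Result"] = outcome

module = {}
record_run(module, 1, {"TC-1": "Pass", "TC-2": "Fail"})
record_run(module, 2, {"TC-1": "Pass", "TC-2": "Pass"})
# After just 2 runs the module already carries 6 run-specific attributes.
```

After a few dozen runs, a module built this way carries hundreds of attributes, which is exactly the scaling concern noted above.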
(Read in clockwise from top left)
So what are the positives and negatives of these two ways of using DOORS modules to capture test, validation, and/or verification information? The positive is the ability to store and easily link requirements to testing results using standard DOORS linking. If requirements that are linked to test-based modules change, then standard DOORS “suspect links” notify the folks maintaining the test-based DOORS modules of the change, so they can decide whether to update their test plan and/or retest the test case. The other question is who is maintaining and updating the DOORS test-based modules. Are the actual testers going into DOORS to update the test results? Or are the testers capturing results in a spreadsheet and handing it to a DOORS user to update in DOORS? In my opinion, either of these scenarios is fine for projects that only do manual testing and don’t have a lot of testing (i.e. test runs) to perform. The other issue is that the actual testing can occur on different environments. For example, if one built a web-based application that will run on different operating systems, then one would need to test all of those configurations. If one is building an embedded software device or a software application, then one might want to go beyond manual testing and automate the execution and capturing of results via automated testing.
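The “suspect links” behavior described above can be sketched in a few lines of Python. This is an assumed model of the idea, not DOORS’s internal implementation: a link becomes suspect when the requirement it points at changes after the link was last reviewed.

```python
# Minimal sketch of the "suspect links" concept (assumed logic, not
# DOORS's internal implementation).

from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str
    last_modified: int  # e.g. a revision counter or timestamp

@dataclass
class TestLink:
    test_id: str
    req: Requirement
    reviewed_at: int  # requirement revision seen at last review

    def is_suspect(self):
        # Suspect if the requirement changed after the link was reviewed.
        return self.req.last_modified > self.reviewed_at

req = Requirement("FR-12", last_modified=3)
link = TestLink("TC-7", req, reviewed_at=3)
assert not link.is_suspect()

req.last_modified = 4        # requirement edited after the review
assert link.is_suspect()     # the test team is prompted to re-check TC-7
```

Clearing the suspicion then amounts to re-reviewing the link, i.e. setting `reviewed_at` to the requirement’s current revision.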
In my opinion, a solution like DOORS is great for requirements management; whereas the folks in charge of quality and/or performing the task of testing should have a solution suited for the role they play in a project: test management. So the final option for testing I will discuss is how DOORS (managing requirements) can integrate with IBM’s test management solution, Rational Quality Manager (RQM). RQM provides a nice environment for supporting both manual and automated testing.
Provided below are an example “dashboard” that users can configure based upon what they would like to see, and an example of an RQM Test Plan.
Provided below is an example of a DOORS View showing the requirements from DOORS that are known to this particular RQM project. Requirements-driven testing enables requirements from DOORS to be used to automatically generate test cases and build specific links between the DOORS requirements and those test cases. The screenshot on the right shows the results of that automated link creation: traceability from test cases to DOORS requirements, which could also show development software assets.
The integration between DOORS and RQM utilizes OSLC (Open Services for Lifecycle Collaboration). Below is a “rich hover”: the ability to see details about linked items without actually having to navigate the link. One can also see the results of the test execution (pass or fail).
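Under the covers, a rich hover of this kind is typically served through OSLC’s UI-preview mechanism: the client asks the linked resource for a “compact” representation and renders the small preview it gets back. The sketch below shows roughly how a consumer might build such a request; the URL is hypothetical, while the media type and header follow OSLC Core 2.0 conventions.

```python
# Hedged sketch of an OSLC UI-preview fetch, as used for "rich hover".
# The resource URL is a made-up example; the Accept media type and the
# OSLC-Core-Version header follow the OSLC Core 2.0 conventions.

def oslc_preview_request(resource_url):
    """Build the HTTP request pieces for fetching an OSLC compact resource."""
    return {
        "url": resource_url,
        "headers": {
            "Accept": "application/x-oslc-compact+xml",
            "OSLC-Core-Version": "2.0",
        },
    }

req = oslc_preview_request("https://rqm.example.com/qm/resources/testcase/42")
# The server's compact resource carries a title plus small/large preview
# documents; the client renders one of these in the hover, so the user
# sees details without navigating the link.
```

The same content-negotiation pattern is what lets DOORS and RQM display each other’s artifacts without either tool storing a copy of the other’s data.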
As the testers do their work, data from RQM can be mapped into DOORS-based attributes on the requirements. Below is an example showing the traceability between DOORS requirements and the testing side of things in DOORS; I can see that the latest test case run passed. Provided below are screenshots showing coverage analysis relating the DOORS requirements to the Test Plan and associated test case.
Finally, the screenshot provided below shows the test case execution results that were performed via RQM. These are mapped to DOORS attributes via the bi-directional integration, and regular DOORS sorting and filtering can be used, for example to see which test cases failed and/or passed.
Hope the blog post was useful. Feel free to contact Jim @ haysji[at]us.ibm.com if you have any queries regarding the options for testing in DOORS.
Also, Jim will be hosting a webinar on the same topic, in which he will go in depth into the ideas presented in this post about the options for testing in conjunction with DOORS. Register for the session here - http://bit.ly/DOORS-Testing_Options
There is no doubt that the evolution of computing has moved into the era of the mobile device and any business that wants to remain competitive in an increasingly difficult economic environment needs to embrace this with the care and attention it deserves.
With over a million devices being activated every single day it is predicted that by 2013 mobile phones will overtake the PC as the most common web access device worldwide!
Companies that simply try to tweak their existing web sites to run in mobile browsers are missing a trick, as users expect very sleek interfaces that make use of their device’s capabilities such as geolocation and the camera. With this in mind, the perceived quality of the application comes from both its functionality and, perhaps more importantly, its design. The design of the application’s user interface is also critical when trying to improve brand loyalty, and therefore businesses are keen to see their brand image extended to mobile devices where they can reach a much wider audience.
So where does Requirements Management fit in?
Well, a good requirements management process that is incorporated into the wider development life cycle will allow the business to communicate exactly what is required of the mobile application; enabling the development team to be clear on what needs to be created, and for testers to begin writing test cases earlier in the lifecycle. With the need for good user interface design it is important that requirements are not purely textual and so with tools like Rational Requirements Composer, Business Analysts can model Use Cases, Business Processes and also visualize the user interface through UI Sketches, Screenflows and Storyboards. This means the business can be very clear on what the expected result should be and remove any unnecessary ambiguity that only slows down development and ultimately prevents applications going to market quickly.
One significant trend in the development of mobile apps is the adoption of Agile methodologies such as Scrum, which fits in well with the short timescales and rapid change and release management that mobile apps have. This makes it even more important for the Requirements Management process to also be agile and allow greater collaboration not only within the Business Analysts team, but also with the other teams involved such as development and testing. A web based requirements management tool like RRC encourages wider engagement with stakeholders and also provides dashboards with live information, collaborative reviews and reporting to promote visibility and allow decisions to be made quickly and effectively.
Traceability is one of the cornerstones of good Requirements Management, because without it you cannot determine if the resulting product has actually satisfied the original requirements outlined by the business. RRC is part of the wider solution called Collaborative Lifecycle Management and allows the requirements to be linked to the resulting development work items and associated test plans, test cases and defects. This means that from any given requirement you can see exactly what development task is going to implement it, which test case is going to test it and what defects were found in relation to that requirement.
In summary, mobile application development is an exciting and important part of any business plan and due to its inherent complexity you really need the right tools to make it a success.
I am sure you have seen a graphic similar to this depicting the communication gap between stakeholders in a project and its consequences. Today projects are getting increasingly complex, the teams involved in them are often distributed, and delivery times are shrinking. Speed to market has become the major contributing factor to success for most companies. Being unable to finish development on time and on budget, and thus missing opportunities, is a vexing problem for organizations.
Clearly articulating stakeholder business objectives and requirements for application and product development gives the much-needed head start to optimize end results; however, tackling the challenge of managing effective communication between development teams and providing a mutually supportive collaborative environment helps ensure successful project completion. Studies conducted by IBM have shown an improvement of 15-35% in team productivity with the help of effective collaboration. There are two aspects to communication: how to engage stakeholders, and how to manage internal team communication.
Managing stakeholder communication
It’s imperative to engage stakeholders early on to get the requirements right. If you are in an agile project environment, having consistent involvement of stakeholders becomes even more important and challenging. Providing stakeholders access to the project environment, with appropriate levels of access, enables this. Thorough requirements definition practices involve understanding as many specifics as possible and should start at the very beginning of the project.
Managing inter-team and stakeholder communication
Unifying stakeholders and the project team helps to ensure that project goals are met and averts the potential impact that being late to market can have on the bottom line. Up-front visual and textual requirements elicitation techniques to build stakeholder consensus, coupled with full requirements traceability across the life cycle, help cut the risk and cost of rework from unclear, ambiguous or changing requirements. In the end, this can help improve time to value and quality. Ralph R. Young (Effective Requirements Practices) identifies three root causes for requirements-related issues: the wide disparity between the requirements stated by customers and the real requirements; ineffective requirements practices in the supply chain; and finally the lack of joint customer/supplier responsibility for project success.
While personal communication tactics like brainstorming meetings and knowledge-sharing sessions can add immense value, the present-day globally distributed environment requires more closely knit day-to-day solutions for communication. The requirements tool used should ideally have the capability to address a wide set of requirements information beyond the requirements themselves: business context, aspirations, considerations, and business and technical constraints. Capabilities like a shared repository, a simultaneous view of what team members are working on and of open issues, group conversations about requirements, and online reviews & approvals can significantly improve communication in the team and increase productivity. Linking requirements artifacts to related information in a repository, and embedding artifacts into documents and user interface sketches (empowering teams to rapidly refine ideas), have significant advantages. Also, this broad and flexible approach enables teams anywhere in the world to collaborate, clarify and achieve consensus quickly about the requirements as they develop business-driven solutions.
Clearly, with geographically distributed teams, teamwork has new dimensions. If an organization gets collaboration right, it can potentially drive higher levels of productivity and innovation.
In my career I’ve been deeply involved with both modeling and requirements management disciplines and tools, so it always intrigues me when I hear debates over whether largely text-based (sometimes referred to as ‘traditional’ or ‘document-based’) or model-based approaches to defining and managing requirements are the right way to go.
We’ve all heard the argument that a picture paints a thousand words, but I’ve always vividly remembered something I heard at a conference some years ago, which was “I’d have taken a thousand words over this one unreadable diagram.”
My belief is that it is not an either-or decision. You need both. Models can add clarity to requirements specifications and can bring together a more holistic understanding of what’s expressed in the requirements. Models can be walked through with stakeholders and, with the right language and tools (like SysML or UML in IBM Rational Rhapsody), they can even be run to validate that what is captured in the model is correct, consistent and complete. But what if you have contractual requirements to manage, documents of regulations or standards to comply with, or complex performance or availability constraints? You don’t want to clutter your model with so much detail that it becomes unusable.
My preference is for a combination of textual requirements and models, that can be described by the ‘Systems Engineering Club Sandwich’ (references 1&2) where textual requirements, which form the layers of bread - and maybe a bit dry on their own, are supplemented by models that form the layers of filling – they are richer and more expressive, together forming a tasty combination to help explore and elaborate requirements, perform decomposition and allocation, and maintain traceability. I recently got together with my colleague Paul Urban to record a 30 minute webcast entitled ‘The Tasty Way to Tackle Complexity - The Systems Engineering Club Sandwich of Requirements & Models’ where we take a look at some engineering challenges, where requirements work goes wrong, how the club sandwich approach works and how to use requirements and models together effectively. So if this hors d'oeuvre has made you hungry for more, please take a look. Paul and I are really interested to hear what you think.
References:
1. “The Systems Engineering Sandwich: Combining Requirements, Models and Design”, Jeremy Dick, Jonathon Chard, INCOSE International Symposium, Toulouse, July 2004.
2. Requirements Engineering, Hull, Jackson & Dick, Springer 2004.
I was lucky enough last week to travel to the INCOSE (International Council on Systems Engineering) International Symposium 2012 near Rome, Italy. It was an excellent opportunity to meet the systems engineering community and hear about their interests and concerns. We had lots of traffic to the very stylish IBM booth, where we talked about the IBM Rational solutions for systems engineering and the latest from IBM Research on tool interoperability and design optimization & trade-off. I’d like to claim the traffic was due to my presence, but in fact there was lots of excitement and interest in the must-have giveaway of the conference, the IBM Limited Edition of the Systems Engineering for Dummies book (if you weren’t there and don’t have a copy, you can download a PDF version).
Being at the INCOSE event reminded me of the very active and interesting discussion I recently provoked on the INCOSE LinkedIn group with the posting of the link to my previous blog post ‘Traceability – How Much is Enough?’. It’s a great read, with some very provocative statements: some argue that traceability is not useful at all and is the root cause of failure on projects that overrun and overspend, while others say it’s absolutely vital on safety-critical systems or where the project is contract-driven. In the end I think some consensus was reached between these two camps that ‘just enough’ traceability (to keep a project on track, provide customer/market need context to engineers, facilitate impact analysis, and, if needed, meet industry standards and regulations) is sufficient. Any more is excessive and wasteful, and likely to bog down progress towards delivering innovative products and systems.
During a quiet time at the IBM booth, I also had a chance to chat with my colleague Brian Nolan (marketing manager for the aerospace & defense industry at IBM Rational) about effective traceability, since Brian is very interested in this topic and has presented on a Dr. Dobb’s webcast on ‘3 Ways to Improve Traceability and Impact Analysis’. Brian believes in what I would describe as ‘traceability by design’, meaning that traceability is automatically established while you decompose your system design (for example, use case to use case realization to sequence diagram and so on). This discussion also reminded me of what another colleague, Greg Gorman (program director for IBM systems and software engineering solutions and the INCOSE Corporate Advisory Board member from IBM), described several years ago as ‘link while you think’, meaning traceability is created by the tools while you are performing requirements decomposition, design and development, rather than as an overhead activity afterwards.
I think we’ve now moved some way beyond ‘link while you think’. An information model with ‘just enough’ traceability for your project needs is essential to avoid traceability spiraling out of control, but new approaches such as Linked Lifecycle Data from the OSLC (Open Services for Lifecycle Collaboration) community, and tools that recognize implicit traceability, provide new ways to visualize lifecycle traceability and perform effective impact analysis. With these, we can make traceability work for us to help engineering become more agile while staying within cost and schedule, and produce innovative, higher-quality products and systems.
We had an interesting webinar with Mary Gorman of EBG Consulting on whether business analysis is required in agile projects. Mary talked about lots of concepts and put forth her case for how business analysis is essential for agile success. If you didn’t attend the webinar, an on-demand replay is available here. Also read our interview with Mary on this topic here.
Mary shared with us five agile business analysis actualities that are key to agile project success. Two of them stuck in my mind: delivering valued products, and product partnerships. Often we tend to forget hard questions like to whom our products are most valuable and what the potential of our products is. Mary Gorman shared a simple value equation with us:
Value = Increased Revenue + Avoided Cost + Improved Service + Risk
If we are to provide value to our customers, we have to concentrate on each of the factors above. Increasing the revenue stream can be through either creating new streams or protecting the existing ones. Software tools can play a role in avoiding cost through increasing operational efficiency to improve time to market. Improving the service could mean increased usability and accuracy. Coupling these factors with risk and dependencies and looking at the balance is very critical for providing a valuable product.
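The balance described above can be made concrete with a toy calculation. Treating risk as an offset against the other factors is my own reading of “looking at the balance”, not part of Mary’s stated equation, and the numbers are invented for illustration.

```python
# Toy illustration of weighing the value factors against risk.
# The subtraction of risk is an assumption made for this sketch.

def product_value(increased_revenue, avoided_cost, improved_service, risk):
    """Sum the value factors, letting risk offset the positives."""
    return increased_revenue + avoided_cost + improved_service - risk

# Two hypothetical product options with made-up scores:
option_a = product_value(100, 20, 10, risk=40)   # high revenue, high risk
option_b = product_value(60, 50, 30, risk=10)    # balanced, low risk
# Once risk is balanced in, option B comes out ahead.
```

The point of the exercise is not the arithmetic but the discipline: scoring each factor separately forces the team to answer the hard questions about revenue, cost, service and risk rather than debating “value” in the abstract.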
The role of partnership and collaboration is becoming increasingly important, and this was once again stressed by Mary with her view that analysis is everyone’s job and a continuous process. At IBM we believe that a collaborative requirements definition & management capability, linked with other critical lifecycle disciplines such as test management, is essential for success in agile business analysis, particularly where you may have large, distributed teams and/or regulated environments.
The webcast also discussed other agile business realities like the importance of discovering product needs just in time, structured conversations and confirmations through examples, prototypes and documentation. Watch the complete webcast here.
A lot has been said about CLM 2012 and Rational Requirements Composer 4.0 since their release in June 2012. RRC 4.0 is now available for download from jazz.net. I am sure most of you have already downloaded it and started playing around with it. In this post, I will briefly touch on the enhancements we made in RRC 4.0 and also provide resource links to downloads, demos and other tutorials.
Essentially the improvements in RRC 4.0 have been around:
Combined Definition and Management - Clear and centralized requirements management eliminates redundancy and enables real-time development. In RRC 4.0 we have moved the sketching and diagramming capabilities to the Web client to allow creation of rich text, diagrams, visual use cases, storyboards, and rich-text-based requirements.
Lifecycle Solutions & Collaboration - Improved Word/Excel migration and improved collaboration capabilities.
Improved Planning & Visibility - Enhanced visibility using traceability across requirements, test and development. Visual and textual scenarios are now supported. Through the customized dashboard, users can see what is currently under work and quickly navigate to information that is being traced, discussed, or worked on at the present time. Data can also be organized in dynamic analysis views to help answer project questions and build reports. In RRC 4.0 you can now use the RTC planning capabilities to help prioritize requirements before they are defined. This capability allows requirements to be directly aligned with priority and planning.
Support for ReqPro data migration has been improved significantly. Stay tuned for another post on this topic; meanwhile, I encourage you to watch some videos that explain it in detail. The major improvement in RRC 4.0 is in traceability: we have further expanded both requirements and lifecycle traceability capabilities. A graphical traceability analysis and usage tool has been included in the latest version, enabling you to graphically analyze the requirements in detail with enhanced filtering options. Suspect link analysis has also been improved significantly. A suspicion profile can be set, allowing you to select the requirement types that you want to assess. This helps take care of false positives and discover changes across trace links among requirements and other lifecycle elements. Watch the demos provided below to see how this works.
Some other feature-level enhancements in RRC 4.0 are:
As we mentioned in an earlier blog post, DOORS 9.4 and DOORS Web Access (DWA) 1.5 were released during Innovate 2012. This blog post provides insight into what’s changed in this release of DOORS and some of the significant new features. I have also provided a few resources where you can learn more about this release.
DOORS–RQM Integration based on OSLC
The most significant changes in DOORS 9.4 are the improvements to OSLC-based integrations. A new integration based on OSLC has been provided for Rational Quality Manager (RQM). Let’s see how it differs from the existing point-to-point integration (RQMI). Provided below is a simplified representation of how RQMI works for the DOORS–RQM integration, contrasted with the new OSLC-based integration.
As you can clearly see, the integration has been made much simpler in terms of software and storage, yet more powerful. The new integration provides a stable architecture for future enhancements and provides automated migration. On the installation and configuration side, the new integration no longer requires the server and Java client components.
What does this mean for a typical DOORS user? It enables real-time lifecycle traceability to RQM test cases, achieved either through the hover-over menu in DOORS that displays RQM artifacts, or directly from RQM.
What does this mean for a typical RQM user? The real-time integration enables the RQM user to review and edit the automatically created draft test cases (with the new requirement reconciliation wizard) based on new requirements, and trace them back to DOORS. Full test coverage while requirements are changing is enabled with features like:
Automatic display of requirements not covered by test cases in current test plan
Provision for linking existing test cases to new requirements
Display of modified and removed requirements
Enhanced suspect-ability analysis
Another important improvement in this release is enhanced traceability to meet regulatory requirements. This enables:
Linking of one or more requirements to each test step of a manual test script
Managing the association of requirements to related test cases
Display of links during test execution and in test case results
Apart from this, OSLC-based integration to Design Manager for RSA and Rhapsody (beta) is also available in DOORS 9.4. However, since Design Manager is still in beta, this integration will only become available later this year. For more details, visit jazz.net. Also, the data exchange mechanism has been upgraded from RIF to the latest version of the OMG (Object Management Group) ReqIF (Requirements Interchange Format), which helps improve the communication of requirements between organizations in a supply chain. Support for data exchange and linked data between DOORS 9.x and DOORS Next Generation is also included, as is reporting across DOORS 9.x and DOORS Next Generation.
There are also some licensing changes when using DOORS with Rational Publishing Engine (RPE). We have removed the requirement for an RPE license when using RPE custom templates directly within DOORS. You still need a license for creating new custom templates; however, one is not required to drive the reports.
We will briefly look at the usability improvements that have gone into DOORS 9.4. Many of these reduce the need to write custom DXL scripts. In DOORS 9.4, we have provided stronger support for defining and managing how more than one user can work on a module simultaneously. It is controlled with a widget that allows you to set a sharing level for editing, as shown below.
Views now support color coding, and the user can control the background color of attributes. Views have also been extended to 128 columns.
Another small yet significant usability improvement is the ability to remove multiple views in a single selection. And finally, DOORS now supports rich text export to Microsoft Excel.
We had a series of announcements earlier this month during Innovate 2012. The major announcements pertaining to the Requirements Management space were RRC 4.0, DOORS 9.4 and the DOORS Next Generation beta. This blog post aims to dive a little deeper into these and provide a few helpful links...so read on!
Introducing Rational Requirements Composer 4.0
Collaborative Lifecycle Management (CLM) 2012 was announced on June 12 at the jazz.net website and Rational Requirements Composer (RRC) 4.0 continues to be one of the major pillars. CLM 2012 aims to provide the best integration experience available today in the industry between requirements, source control and quality management. For a detailed post on CLM 12, see Robin Garside's post in jazz.net.
Building on the existing requirements management solution, RRC 4.0 brings a whole new set of improvements and features. The new and improved capabilities in this release will help project teams realize project change impact with downstream visibility, using graphical and suspect traceability across requirements, test, and development items. Information access can now be controlled at a granular level, giving you the peace of mind of knowing exactly which requirements information can be modified. Project template upload and download, as well as the Requirements Interchange Format (ReqIF), are now available to give administrators and team leads easier control over multiple project environments. Additionally, project teams have greater cross-project visibility through project dashboards. Some other significant highlights of RRC 4.0 are:
Enhanced enterprise deployment and scalability to support high availability via clustering
Extended and refined data access control and automatic requirements identification
More solutions for analyzing traceability through graphical explorers and suspect link change identification
Extended CLM lifecycle integration for Rational Design Manager (Beta) to trace and report requirements and models/elements
Watch out for another blog post detailing these improvements in RRC 4.0 soon…
Introducing Rational DOORS 9.4
The Systems space was equally active this month, with lots of announcements from IBM Rational. With the help of Open Services for Lifecycle Collaboration (OSLC), continued effort has been put into unifying lifecycle disciplines across systems and software engineering. DOORS 9.4 and DOORS Web Access 1.5 are the latest Requirements Management offerings for the Systems space. DOORS 9.4 boasts significant server-side enhancements and new OSLC-based integrations with Rational Quality Manager, the Rational Software Architect Design Manager beta and the DOORS Next Generation beta.
Significant usability enhancements have been made to both the rich and web clients. Custom templates designed using Rational Document Studio are now supported by DOORS 9.4. As an added bonus, producing documentation from DOORS is now possible without the need for an additional license for Rational Publishing Engine. For a detailed account of the improvements in DOORS 9.4, see the announcement letter here.
Download IBM Rational DOORS 9.4 here. Learn more about Rational DOORS here.
Stay tuned for a detailed post on What’s new in DOORS 9.4…..
DOORS Next Generation Beta
IBM DOORS Next Generation aims to be the next-generation requirements management solution for complex software and systems engineering environments. Today we are releasing the latest version, Beta 3, of DOORS Next Generation; it is available for download from jazz.net. DOORS Next Generation is built on the learnings of DOORS and extends the technologies of Rational Requirements Composer and DOORS 9. If you are wondering how DOORS Next Generation differs from DOORS, read the article on jazz.net by Richard Watson, Senior Product Manager for RM tools at IBM, comparing the two products.
With this beta version, data import/export between DOORS Next Generation and DOORS 9.x projects is possible, and bi-directional linking between the two products is supported. Development of DOORS Next Generation is kept transparent, following the Jazz vision, and all development can be tracked in the Rational DOORS Next Generation section of jazz.net.
Our intended direction for developing the DOORS family is to allow a smooth transition between DOORS and DOORS Next Generation. For each DOORS license entitlement with active Subscription and Support, a customer will be able to use either DOORS V9 or the next-generation capabilities. A commercial version of DOORS Next Generation is expected in the last quarter of this year.
There have been interesting debates on whether, and how, requirements should be managed in agile projects. Some believe that requirements management is meaningless; others believe requirements remain a critical factor irrespective of the methodology one follows. Here we bring you an interview with Mary Gorman, an Agile Business Analysis Expert at EBG Consulting, in which she argues that business analysis is essential for agile success.
Mary Gorman, CBAP, CSM, is VP of quality and delivery at EBG Consulting, whose experts help deliver high-value products that delight customers. Mary works with global clients, speaks at industry conferences, and writes on requirements topics for the business analysis community. In addition to serving on the IIBA® Business Analysis Body of Knowledge® Committee for four years, Mary helped create the first certification exam for the Certified Business Analysis Professional™ (CBAP®).
We've heard comments like "you can throw away all that requirements and analysis stuff now that we're going Agile." Have you come across this, and what's your view?
It's a common misconception. In fact, requirements drive agile teams! At EBG Consulting we find when agile teams collaboratively analyze requirements, they can speed development and delivery of high value products. The ability to be focused, nimble and disciplined about your requirements is essential for successful agile delivery.
So how does business analysis change as you adopt Agile practices?
You plan and analyze regularly to support a steady flow of product delivery, a hallmark of successful agile teams. On agile projects, we plan to re-plan. A plan represents your allocation of requirements—really options for satisfying product needs—to delivery cycles. Rather than trying to acquire all the possible requirements up front and create one big plan, you plan continually, using feedback from prior deliveries to adjust your plan. This in turn means you are continually analyzing requirements to discover high-value options for the next delivery. Planning and analysis are interdependent and synergistic. See this article for more details.
Agile analysis is done just-in-time—you want your requirements to be “fresh”. You adjust the precision and granularity of requirements, taking a just-enough approach. You make use of good analysis tools and techniques. For example, you might sketch a context diagram to quickly visualize the interfaces needed for a release, a minimum marketable feature, a use case, or a story. Or organically explore requirements using a data model or state diagram. For more ideas on tuning analysis for agile, visit Agile Analysis Challenges.
What about the role of the business analyst in Agile?
In our forthcoming book, my co-author, Ellen Gottesdiener (EBG’s founder and president), and I write about a product partnership that collaborates to discover and deliver valued products. The partners include diverse perspectives from the business, customer, and technology communities. We have found this partnership is critical for agile product success. (The book’s title is Discover to Deliver: Agile Product Planning and Analysis. Read more about it here.)
Often business analysts ask us where they ‘fit’ in this partnership. Our response is to ask, “What are your skills?” An agile team needs strong skills in analysis, modeling, elicitation, facilitation, risk analysis, prioritization, strategic thinking, verification and validation along with a sound understanding of the product’s domain. The person who possesses a combination of such skills will be a valuable and valued team member.
Who should attend the webcast on How Business Analysis is Essential to Agile Success and what will they gain from it?
The webinar’s content has broad applicability. It may benefit anyone involved with planning, analysis, valuation, validation, and verification; teams and organizations transitioning to agile; product champions and product owners responsible for deciding which product options to deliver; and anyone on an agile team who recognizes that user stories, user story maps, and personas are often not enough to communicate product needs. The webinar provides ideas for holistically exploring and evaluating product options within a framework the agile team can use to reach a shared understanding of high-value product needs.
Mary will be joining us on Wednesday, June 27, 2012 for a webinar in which she will expand on the discussion above to build the case that business analysis is your key to:
Discovering product options
Collaborating to create Agile plans
Conversing daily about what to deliver
Adapting your product with each delivery cycle to respond to changes in business needs
A replay will also be posted in case you miss the webinar on Wednesday. We believe this interview gives insight into why business analysis is critical to agile projects. For a more interactive discussion, join us for the webinar.
What do you think about requirements in agile? What level of requirements management/analysis do you follow in your agile projects?