I consider a requirements set good if and only if the following condition is met:
The requirements set correctly represents the input-to-output control and data transformations of the system and their performance properties for all conditions, environments, and operational uses of the system, without specifying internal design decisions.
Systems engineers hand off specifications to downstream engineering. Downstream engineering designs and implements correct realizations of the requirements. If the requirements meet that condition of goodness before they're handed off, you can avoid defects rather than having to fix them later (hint #2).
How to meet the condition of goodness
Hygienic design, explained in hint #2, is the key to meeting the condition of goodness. You must use principles, practices, tools, and methods that avoid disease (defects). Evidence suggests that the single best way to do that is to apply hygienic (i.e., verification) techniques during the creation of the work products, continuously, or at least approximately so.
To understand what this means, verification needs to be defined clearly. There are two basic categories of verification: syntactic and semantic. Syntactic verification means the work product is well-formed and correct in form. For example, your project may have a requirements standard that says:
- All requirements are uniquely numbered.
- Each functional requirement uses the word shall to indicate the normative part of the statement.
- All functional requirements are modified by at least one quality-of-service requirement.
This is not the same as saying the requirements are correct. It means only that the statements comply with a set of well-formedness rules. Verification of syntactic correctness is usually performed by members of the quality assurance team and may be performed either manually or automatically.
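Because well-formedness rules like those above are purely mechanical, they lend themselves to automation. The following is a minimal sketch of an automated syntactic check; the `REQ-nnn:` statement format and the specific rules are illustrative assumptions, not a real project standard:

```python
import re

# Hypothetical requirement statements in an "ID: text" format (illustrative
# only; real projects typically pull these from a requirements database).
requirements = [
    "REQ-001: The system shall log all operator commands.",
    "REQ-002: The system shall respond to a request within 1 minute.",
    "REQ-002: The system should retry failed transmissions.",  # duplicate ID, no "shall"
]

def check_syntax(reqs):
    """Return a list of well-formedness violations (syntactic checks only)."""
    violations = []
    seen_ids = set()
    for req in reqs:
        match = re.match(r"(REQ-\d+):\s*(.+)", req)
        if not match:
            violations.append(f"malformed: {req!r}")
            continue
        req_id, text = match.groups()
        if req_id in seen_ids:
            violations.append(f"{req_id}: duplicate ID")
        seen_ids.add(req_id)
        if "shall" not in text.split():
            violations.append(f"{req_id}: missing 'shall'")
    return violations

# Flags the duplicate ID and the missing "shall" on REQ-002.
print(check_syntax(requirements))
```

Note that the checker says nothing about whether REQ-002's one-minute response time is actually correct for the system; that is a semantic question.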
Semantic verification means that the statements are correct, consistent, and accurate in content. Assuring semantic correctness is more commonly what is meant by the unqualified term "verification." Subject matter experts, mathematicians, or testers perform semantic verification.
There are three primary methods for semantic verification of system work products:
- Semantic review
- Formal (mathematical) analysis
- Testing
Semantic review is the weakest form of semantic verification. The work product is reviewed by one or more peers or subject matter experts looking for defects and for both internal and external inconsistencies. Human cognition is wonderfully effective in narrow, focused analyses, but falters when the work product is large and/or complex. The issue becomes one of vigilance rather than intelligence. It is difficult to remain vigilant while spending tens or hundreds of hours examining the minute, interconnected details of complex systems. Another problem with reviews is the cost. Having multiple people in a room reviewing a work product means a high engineering overhead for review time. It is not uncommon to have a dozen or more people involved in a review, which means you are paying a dozen times the cost per unit time that you would spend on other means of verification.
Even in the best of times, semantic reviews are weaker than other types of verification. They do add value as a supplement. The issue I see in many companies is that this is the only verification means applied to most system engineering work products.
Formal analysis has the advantages of eliminating personal opinion and providing rigorous analysis of system aspects such as safety, reliability, and robustness simultaneously. Formal methods can be applied at a variety of levels, from determining that all states of a system are reachable in finite time through formal theorem proving (e.g., "An elevator will arrive in response to a request within 1 minute."). One of the strengths of formal methods is that they can verify conditions regardless of execution paths. They can verify outcomes for all possible circumstances, something that cannot, even in principle, be done with testing. The primary difficulty with the application of formal methods is that the very same degree of mathematical rigor that provides the value of the approach also makes it unusable by the vast majority of engineers. Further, as with all analyses, the outcome is only valid in the context of its axiomatic preconditions and class invariants (assumptions). Unless you have someone on staff with a PhD in formal methods, you should be resigned to using the lighter-weight techniques of formal methods.
Having said that, there is tool automation that supports some of the application of formal methods. The Automatic Test Generator (ATG) tool for IBM® Rational® Rhapsody®, for example, can produce a set of test vectors that visit all possible states of a state machine or identify that some states are not reachable.
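The state-reachability analysis mentioned above is one of the more accessible lightweight formal techniques: because a state machine's transition graph is finite, an exhaustive search settles the question for every execution path at once. Here is a minimal sketch on a toy elevator-like state machine (the states and events are invented for illustration and do not come from any tool):

```python
from collections import deque

# Toy state machine: each state maps event names to target states.
# "Fault" deliberately has no incoming transition, so the analysis
# should report it as unreachable.
transitions = {
    "Idle":      {"request": "Moving"},
    "Moving":    {"arrive": "DoorsOpen"},
    "DoorsOpen": {"timeout": "Idle"},
    "Fault":     {"reset": "Idle"},   # nothing leads here
}

def reachable_states(initial, transitions):
    """Breadth-first search over the transition graph from the initial state."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for target in transitions.get(state, {}).values():
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

unreachable = set(transitions) - reachable_states("Idle", transitions)
print(unreachable)  # {'Fault'}
```

Unlike a test run, which exercises one path, the search covers all paths of any length, which is what gives even this small analysis its formal character.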
It is impractical to expect formal methods to provide the bulk of semantic verification. As with reviews, it remains a useful addition to other verification means.
Testing remains the most practical means for the semantic verification of work products such as specifications, designs, implementations, and resulting systems. Testing is done by defining a set of test cases. Each test case is composed of a set of events and specific data values that occur in a specific order and timing, with a known-to-be-correct outcome.
An issue with testing, in principle, is that it only verifies the specific paths and data included in the test case. Other sequences of events and other data values are not verified. Because intelligent systems can achieve an essentially infinite set of states with an essentially infinite set of data values, you cannot exhaustively verify a system through testing.
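The definition above can be sketched concretely. The "turnstile" system below is purely illustrative; the point is the shape of a test case: an ordered sequence of events plus a known-to-be-correct outcome. It also illustrates the limitation just described, since only the one sequence actually exercised is verified:

```python
# A trivial system under test (illustrative only).
class Turnstile:
    def __init__(self):
        self.state = "locked"

    def handle(self, event):
        if self.state == "locked" and event == "coin":
            self.state = "unlocked"
        elif self.state == "unlocked" and event == "push":
            self.state = "locked"

def run_test_case(events, expected_state):
    """A test case: events applied in this specific order, compared
    against a known-to-be-correct final outcome."""
    system = Turnstile()
    for event in events:
        system.handle(event)
    return system.state == expected_state

print(run_test_case(["coin", "push", "coin"], "unlocked"))  # True
```

Passing this test case says nothing about, say, the sequence `["coin", "coin", "push"]`; every distinct sequence and data set needs its own case, which is why exhaustive testing is impossible for realistic systems.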
A number of standards call out minimal levels of software test coverage. These include:
- Structure coverage
- All lines of code are exercised by the test set
- Decision coverage
- Every Boolean decision is exercised for both its true and false outcomes, including composite conditions
- Modified condition/decision coverage (MCDC)
- Every Boolean condition within a decision must be shown to independently affect the decision's outcome, taking both its true and false values at all decision points
- Data equivalency set coverage
- Data is classified into equivalency groups such that the behavior of the system is qualitatively the same for all values within the set, and the test set must include data from all equivalency sets
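Of these criteria, MC/DC is the least intuitive. For each condition, it demands a pair of test vectors that differ only in that condition and flip the decision's outcome, demonstrating the condition's independent effect. For small decisions, such pairs can be found by brute force, as this sketch shows (the interlock decision is an invented example, not from any standard):

```python
from itertools import product

# Decision under test (illustrative): a door interlock releases when the
# cab is stopped AND (the doors are commanded open OR override is active).
def decision(stopped, open_cmd, override):
    return stopped and (open_cmd or override)

def mcdc_pairs(decision, n_conditions):
    """For each condition index, find one pair of input vectors that differ
    only in that condition and produce different decision outcomes."""
    pairs = {}
    vectors = list(product([False, True], repeat=n_conditions))
    for i in range(n_conditions):
        for v in vectors:
            flipped = v[:i] + (not v[i],) + v[i + 1:]
            if decision(*v) != decision(*flipped):
                pairs.setdefault(i, (v, flipped))
    return pairs

pairs = mcdc_pairs(decision, 3)
# A demonstrating pair exists for every condition, so MC/DC is achievable
# for this decision.
print(sorted(pairs))  # [0, 1, 2]
```

The demonstrating pairs themselves form a compact MC/DC-adequate test set, which is why MC/DC achieves strong coverage with far fewer vectors than exhaustive truth-table testing.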
For semantic verification, the best approach is a combination of all three methods. You can rely primarily on testing, but make sure you augment testing with reviews and formal analyses.
The hygienic approach proposed in this article is to apply verification techniques continuously as the work product is developed. Figure 1 shows the development of requirements models. In Figure 1, you can see the places where verification is performed. Notice that the inner loop (from Define the Use Case System Context down to Verify and Validate the Functional Requirements and back) is a nanocycle and is run every 20-60 minutes. So you take some small set of requirements, realize them in the model, execute and verify them, and repeat. This continues until all of the requirements bound to the use case are expressed and verified in the model.
Figure 1. Test-driven development, high-fidelity modeling workflow for requirements analysis
As you execute this requirements analysis nanocycle, you apply test verification at the last step of this very short loop (Verify and Validate the Functional Requirements). You may also use formal methods to develop test cases to ensure you have covered the state space expressed in the use case model. You might even define a theorem about the system invariants and apply mathematical analysis to verify that theorem. You also validate the model during some of these cycles by demonstrating the model execution, along with any created or modified requirements (see the Update and Maintain Requirements task), to the customer to ensure that your understanding and expression of the requirements result in a system that meets their needs. At the end of the development of this (relatively small) work effort defining the use case, there is the review step (see the Perform Review task), where the stable and verified use case specification and associated textual requirements are verified using both semantic and syntactic review processes.
Because these verification methods are applied in place during the development activities, you can, for the most part, avoid defects rather than uncover them later during a rigorous examination. This improves quality and vastly reduces the rework typical of a systems engineering environment in which verification comes late in the process.
There you have it: my top ten hints for model-based systems engineering. This is not an exhaustive guide to performing systems engineering, but hopefully it provides some help. As an industry, we certainly can use it.
- To learn more about the tool for collaborative, model-driven development for embedded systems, start with the Rational Rhapsody product line overview and the Rational Rhapsody page on IBM developerWorks. Also see the Rational Rhapsody 7.6 information center and the Changing the location of help content to get a local copy of the documentation.
- Explore the various versions, too: IBM Rational Rhapsody Architect for Software, a visual development environment for embedded systems and software
- IBM Rational Rhapsody Architect for Systems Engineers
- IBM Rational Rhapsody Designer for Systems Engineers
- IBM Rational Rhapsody Developer for collaborative, model-driven development of embedded systems. This edition is required for Eclipse users, and editions are available to create specialized projects in C, C++, Java, and Ada languages.
- To learn more about Rational Rhapsody design management capabilities, check out the IBM Rational Rhapsody Design Manager to see how to collaborate, share, review, and manage designs and models with the entire engineering team. Also see the Design Management page on Jazz.net.
- Explore the Rational software area on developerWorks for technical resources, best practices, and information about Rational collaborative and integrated solutions for software and systems delivery.
- Stay current with developerWorks technical events and webcasts focused on a variety of IBM products and IT industry topics.
- Improve your skills. Check the Rational training and certification catalog, which includes many types of courses on a wide range of topics. You can take some of them anywhere, anytime, and many of the Getting Started ones are free.
Get products and technologies
- Take an online tour of Rational Rhapsody, try it with an online trial, or download Rational Rhapsody or its special editions to evaluate, free of charge, for 30 days.
- Evaluate IBM software in the way that suits you best: Download it for a trial, try it online, or use it in a cloud environment.
- Participate in the Enterprise Architecture and Business Architecture forum, where you can share information about methods, frameworks, and tool implementations. Discussions include tool-specific technical discussions about Rational System Architect.
- Join the discussion in the Rational Rhapsody forum.
- Get connected with your peers and keep up on the latest information in the Rational community.
- Rate or review Rational software. It's quick and easy.
- Share your knowledge and help others who use Rational software by writing a developerWorks article. Find out what makes a good developerWorks article and how to proceed.
- Follow Rational software on Facebook, Twitter (@ibmrational), and YouTube, and add your comments and requests.
- Ask and answer questions and increase your expertise when you get involved in the Rational forums, cafés, and wikis.