Component testing with IBM Rational Application Developer for WebSphere Software

Component testing doesn't have to be a costly and difficult process. By reading this step-by-step guide, you'll learn how to automate the creation and deployment of component tests efficiently and cost effectively.

Michael Kelly (Mike@MichaelDKelly.com), Senior consultant, Fusion Alliance

Michael Kelly is currently a senior consultant for Fusion Alliance in Indianapolis. He's had experience managing a software automation testing team and has been working with the Rational tools since 1999. His primary areas of interest are software development life cycles, software test automation, and project management. Mike can be reached by e-mail.



Allen Stoker (Allen.Stoker@PortalWizard.com), Technologist, Liberty Mutual

Allen Stoker is a technologist with Liberty Mutual in the Regional Agency Markets (RAM) division. He has been implementing systems on a variety of platforms including IBM mainframe, Microsoft Windows, and J2EE since 1983 and today focuses primarily on quality-driven development in the J2EE environment. He was recently responsible for driving the implementation of unit-testing practices across the RAM organization. Allen can be reached by e-mail.



08 March 2005

"Component testing is the act of subdividing an object-oriented software system into units of particular granularity, applying stimuli to the component's interface and validating the correct responses to those stimuli, in the form of either a state change or reaction in the component, or elsewhere in the system," according to Michael Silverstein in his article "Component Testing with Intelligent Test Artifacts." If you've ever attempted it, you know that as beneficial as component testing can be in helping to find problems early, it can also be so costly and problematic that testers may wonder whether it's really worth the effort. But it's been our experience that with the right tools, project environment, and understanding of the test patterns associated with it, component testing can be both successful and cost effective. In this article, we'll look at component testing with IBM® Rational® Application Developer for WebSphere Software (IRAD). IRAD assists developers by automating the creation and deployment of host and target-based component test harnesses, test stubs, and test drivers.

This article is intended for developers who are familiar with the basic concept of component testing, have perhaps done some in the past using JUnit or some other framework, and are looking for a faster way to create more powerful tests in the new IRAD development environment. After saying a little about IRAD, we'll discuss which components to test. Then we'll walk through an example of implementing a test of a Java™ component in IRAD. We'll mention some of the features we found most useful in creating our tests, and some frustrations we had and how we solved them. Finally, we'll suggest how you might leverage these types of tests across the project team.

Editor's Note: This article was written using Version 6.0.0 of IBM Rational Application Developer for WebSphere Software on Microsoft Windows XP Home and Professional editions.

A little about Rational Application Developer

The automated component testing features in IRAD allow you to create, edit, deploy, and run automated tests of Java components, Enterprise JavaBeans™ (EJB™) components, and Web services. With IRAD you can do the following:

  • Work with industry-standard JUnit test scripts.
  • Create test projects for Java components, EJB components, and Web services.
  • Create and edit tests and test data.
  • Use stub classes to replace dependency classes for Java components, EJB components, and Web services.
  • Deploy and run your tests.
  • Analyze test results.
  • Run regression tests.

Component tests are most commonly created and executed by developers. Using IRAD, you can develop simple tests for Java classes or more complicated integration tests by identifying a flow through multiple classes. In this article we'll look at an example of creating tests for a single class using IRAD.


Which components should I test?

In his paper entitled "Component Testing," John D. McGregor offers this axiom: "Select a component for testing when the penalty for the component not working is greater than the effort required to test it." John goes on to say that not every class needs to be tested independently -- only the classes that are large enough, important enough, or complex enough to meet the condition in the axiom.

Here are some further thoughts on some categories of components that you might consider testing:

  • Reusable components -- Components intended for reuse should be tested over a wider range of values than a component intended for a single focused use.
  • Domain components -- Components that represent significant domain concepts should be tested both for correctness and for the faithfulness of the representation.
  • Commercial components -- Components that will be sold as individual products should be tested not only as reusable components but also as potential sources of liability.

IRAD offers detailed static metrics on the component under test that can help you decide how to prioritize your tests. Static metrics that analyze component architecture, component complexity, and test coverage are displayed in the Create Java Component Test wizard. In the wizard, you can sort, hide, and show the data as an aid to determining the best test strategy.

  • Architectural metrics (dependency level, fan in, fan out, and ext user) measure the complexity of relationships such as method calls, inheritance, or variable usage.
  • Component complexity metrics (attributes, methods, statements, nesting level, and V(g), the cyclomatic complexity number) measure the complexity of the control flow of the source code.
  • Test coverage metrics (line and tests) indicate the percentage of lines of code covered and the number of tests directly using a method of the class, to help you decide whether the code needs further testing.

If you use the static metrics to help you decide what to test, you may find these suggestions from the IRAD help files to be helpful:

  • Concentrate on testing components that will provide the highest coverage rate. The fan-out metric represents the number of uses of methods or attributes that are defined outside the class. This is a good indication of classes that have a high impact on coverage rate. Testing classes with a high fan-out score without any stubbing will allow you to test a large percentage of the code quickly. On the other hand, strict unit testing on those classes (using stubs to isolate components) will require a lot of work to create many stubs.
  • Concentrate on testing components that are functionally critical. Look at metrics such as fan in, which is the number of public attributes plus the number of public methods in the class, or ext user, which indicates the number of external components that use attributes or methods of the class. The higher these values, the greater the risk of regression if any changes are made to the class. These classes should therefore be thoroughly tested.
  • Concentrate on the most complex components. Complexity indicators are mainly the cyclomatic complexity number V(g) and the number of statements in the code. Typically, V(g) varies between 1 and 10, where a value of 1 means the code has no branching.
  • Even if you test all of the methods of the classes individually, be sure to define class-level tests. This doesn't necessarily mean that you need to test each individual class. For example, if you have a few classes that are so tightly coupled that to test one class you need to stub all the others, you might consider testing a small cluster of from three to ten classes together.
  • Identify the subsystems or large clusters that should be tested as a unit. A subsystem should be tested as a unit when it meets either of the following criteria: (1) you have classes with interdependencies that you need to deliver to another developer, or (2) you have classes interacting together and with other classes that you've stubbed during your class-level testing.
  • Use the level indicator to evaluate the dependency level of a class in the call graph of the application.
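
To make the complexity metric concrete, here's a small class of our own devising (it's illustrative only, not IRAD output). By the usual definition, V(g) is the number of decision points plus one, so a method containing a loop condition and an if statement has V(g) = 3:

```java
// Illustrative class (our own, not generated by IRAD).
// V(g) = number of decision points + 1.
public class ComplexityExample {

    // Two decision points (the loop condition and the if) give V(g) = 3.
    public static int countPositives(int[] values) {
        int count = 0;
        for (int v : values) {   // decision point 1
            if (v > 0) {         // decision point 2
                count++;
            }
        }
        return count;
    }
}
```

A V(g) of 3 already means three independent paths through the method, which is why the guidance above steers you toward the most complex components first.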

Once you've performed a first series of tests, the line coverage rate (line) and the number of tests that apply to a component (tests) allow you to identify any components that haven't yet been sufficiently covered by previous tests.


Our simple example class

Now let's begin our example by creating a simple class that we can exercise in our test scenarios. Create a Java project, then create a package named example and create a class like this:

package example;

public class TestableExample {
    public String getDataValue(String data) {
        return data;
    }
}

This class will take any string and return the same string back to us. This will allow us to easily test both good and bad test cases and see the results.


Creating a component test project

Now that we have something to test, let's start creating the test cases by creating a component test project. This project will hold our test artifacts.

  1. Open the Navigator view and right-click the view pane.
  2. Select New > Project, then check the "Show All Wizards" box and select Component Test > Component Test Project.
  3. If you receive the Confirm Enablement dialog, click OK to enable the component test capability.
  4. Enter a project name on the Create Component Test Project page and click Next.
  5. Select the Java project as the test scope and click Finish.
  6. When you're prompted to change to the Test perspective, respond by clicking Yes.

When you create the component test project, the Java build path (which can be found in the project properties, as shown in Figure 1) will include the component test libraries. These libraries include the JUnit and IBM component test support JAR files. If you experience class path errors at any point while working through these examples, verify that these libraries are configured properly.

Figure 1. The project properties window, showing a sample Java build path
The project properties window, showing a sample Java build path

Creating a new component test

Within the project we just created, let's create a new component test.

  1. Right-click in the Test Navigator pane and select New > Component Test > Java Component Test.
  2. Click Next to confirm the target of your component test project. This displays the Create Java Component Test wizard, mentioned earlier. (Note that in our screenshot we show all of the available metrics options. By default you won't see all of these columns. You can change these settings by clicking Options... and selecting any of the metrics you find interesting.)
     (Screenshot: The Create Java Component Test wizard, where we select the components under test)
  3. In the Create Java Component Test wizard, select the TestableExample class we created and click Next. Any components selected here will be included in the generated test case.
  4. Select "Method-level testing" and click Next.
  5. Select our example class. Test cases will be built for all methods that you select here.
  6. Click Finish. The overview screen will be displayed.

The source code of the test case is called a behavior in IRAD. The overview screen contains a link on the bottom left that will take you to the source. You can also right-click a test in the Test Navigator pane and select Open Behavior at any time to get to your source code. For the test we just created, the generated test case code will look like this:

package test;

import junit.framework.*;
import example.TestableExample;
import com.ibm.rational.test.ct.execution.runtime.ComponentTest;

public class TestableExampleTest extends TestCase {
    public void test_getDataValue() {
        TestableExample objTestableExample = new TestableExample();
        String data = "";
        String retValue = "";
        retValue = objTestableExample.getDataValue(data);
    }
}

A new item will appear in Test Navigator under the Test Suite folder in your component test project. This item will be named for your class with "Test" appended. If you expand this item, you'll find a node representing the getDataValue method. If you had selected additional methods from a larger class, you would see each method appear as a node in the test.


Running a component test

Let's go ahead and run the test that was just created.

  1. Right-click the test suite (TestableExampleTest) and select Run > Component Test.

You'll see the background execution status in the lower right corner of the Test window. When the test is complete, you'll see a new node in the Run folder of the component test project. The new node is named after the component under test and includes the word "Configuration" and a date and time stamp in the name, as shown in Figure 2.

Figure 2. The Test window, showing a new node in the Run folder
The Test window, showing a new node in the Run folder

If you drill down into the test execution, you should find that the test completed successfully. You should also see a new view for the Test Data Comparator. This view shows you the actual tests executed and the results of each test. In this case, you should see that each test passed. There are two tests displayed; the first represents the successful execution of the constructor for our class, and the second is the execution of the actual method call. The only real information we can derive from this test is that none of our code threw any exceptions. We still haven't really exercised our code yet.


Extending the framework with test data

It's great that we can execute the method, but we didn't really prove anything because we didn't validate the output. Our simple testable class will return whatever string we pass in, so let's start evaluating the results. If we were coding raw JUnit classes we would write something like this:

assertEquals("a", objTestableExample.getDataValue("a"));
assertEquals("b", objTestableExample.getDataValue("b"));

A couple of obvious issues exist with this example. First, the data values are hard-coded in the application. Second, each data validation point requires code modification. This is where the Test Data Table (TDT) -- a feature in IBM Rational Application Developer for WebSphere Software -- can save us a lot of time and effort.
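
The idea the TDT automates can be sketched in plain Java. This is our own illustration, not IRAD code, and TestableExample is inlined so the sketch stands alone: the data lives in a table, so adding a validation point means adding a row rather than modifying test logic.

```java
// Our own sketch of table-driven testing (not IRAD code): the data lives
// in a table, so adding a validation point means adding a row, not code.
public class DataDrivenSketch {

    // Inlined copy of the class under test so the sketch is self-contained.
    static class TestableExample {
        public String getDataValue(String data) {
            return data;
        }
    }

    // Each row is {input, expected}.
    static final String[][] DATA = {
        {"a", "a"},
        {"b", "b"},
    };

    // Returns the number of rows whose actual result differs from expected.
    public static int countFailures() {
        TestableExample obj = new TestableExample();
        int failures = 0;
        for (String[] row : DATA) {
            if (!row[1].equals(obj.getDataValue(row[0]))) {
                failures++;
            }
        }
        return failures;
    }
}
```

The TDT gives you this separation of data from logic without your having to write any of the plumbing yourself.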

  1. Select the TestableExampleTest.java file. If it's not already visible, right-click the test item in the Test Navigator pane and select Open Behavior.
  2. Position the cursor within the body of the getDataValue method, and the TDT will display in the TDT pane.

Each test method has its own unique TDT. If you have multiple methods in your test case, you'll find that clicking within any method will force a redisplay of the TDT to match the context of the current code. In the TDT you'll see two test action blocks with nested data. Each of these represents a testable event. Each time you save the source file, you invoke a parsing of the test class. The TDT automatically recognizes changes in variables and assignments and creates test actions to match.

  3. Add these lines of code to the test behavior class:
     String x = "";
     String y = "";
     x = String.valueOf(y);
  4. Click Save, and you'll notice that there's a new block in the TDT representing the assignment x = String.valueOf(y).

As you can see, the TDT represents an incredibly fine-grained representation of the test case, allowing you to manipulate the data in a variety of ways. We'll leave the code in this updated state as we explore the complexities of the TDT.

  1. Click inside the method block so that the TDT context will be aligned.

If you see the TDT completely highlighted, you're in a state of flux where you've made changes but not saved them. This has a nice warning effect as long as you understand why it's happening.

On the right side of the TDT pane you'll see a Test Data group with two columns: "In" and "Expected." This is where we'll define our input parameters and the expected result.

  2. Right-click in the "In" column on the data row of the getDataValue validation.
  3. Select Define Simple to edit the column. If you're already in edit mode for that column, you'll see an alternate pop-up menu.
     (Screenshot: The Test Data Table and the pop-up menu offering the selection Define Simple)
  4. Enter a string value "a" in the column and be sure to include the quotes.
  5. Move to the "retValue" row in the "Expected" column and enter a string value "a" there as well.
  6. Proceed to the String.valueOf(y) validation and enter the string value "1" in the "y" row of the "In" column and the "x" row of the "Expected" column.

As you modify the TDT you'll see the small disk icon enabled. To save the entered data, click the icon or press Ctrl-S. Be sure to perform this step each time you change the data.


Running the modified component test

Now let's execute the test again, this time with test data loaded.

  1. Right-click the test suite (TestableExampleTest) and select Run > Component Test.
     (Screenshot: The Test window, showing Run > Component Test selected and the test data loaded in the Test Data Table)

You'll again see the background execution status in the lower right corner of the Test window. When the test is complete, you'll see a new node in the Run folder of the component test project. The resulting Test Data Comparator (TDC) pane should look like the one shown in Figure 3.

Figure 3. Execution results with data in the Test Data Comparator
The Test comparator window, showing the execution results

You may notice that the Test Navigator still represents the test results in the Run folder as Test Data > Individual Test #0. But also notice that the Test Data Comparator now shows expected and actual values for our tests. This time we actually validated that our component did something that we expected it to do.


Doing more with test data

Let's explore some more complex data scenarios and see what the impact is.

  1. Click in the source code to retrieve the TDT for our test case.
  2. Right-click in each of the data value columns and select Define Set. (Note: Be sure that the TDT pane is large enough to show the entire set definition pop-up window. This initially gave us some problems.)
  3. Change the "a" values to "a" and "b", entering one at a time and clicking the Add button.
  4. Continue to update all four columns, changing the "1" values to "1" and "2". The Test Data Table will be populated with these new values.
     (Screenshot: The Test Data Table with two new data sets defined)
  5. Run the test again, and expand the Run results when complete. You should now see nodes for individual tests #0 through #3.
     (Screenshot: The Test Navigator pane, showing the new tests in the Run folder)

"Why were four tests run when I only had two sets defined?" you ask. How perceptive of you! The answer is that the TDT was expanded into four distinct scenarios -- a/1, a/2, b/1, and b/2. Each of these data combinations results in a separate test execution. Click through each test to see the specific results of each of these scenarios.
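
The expansion IRAD performs here is a simple cross product: every value in one set is paired with every value in the other. A sketch of the combination logic (our own illustration, not the TDT's internals):

```java
import java.util.ArrayList;
import java.util.List;

// Our own sketch of the cross-product expansion (not the TDT's internals):
// two values in each of two sets yield 2 x 2 = 4 combinations.
public class CrossProductSketch {

    public static List<String> combine(String[] first, String[] second) {
        List<String> combos = new ArrayList<String>();
        for (String a : first) {
            for (String b : second) {
                combos.add(a + "/" + b);   // e.g. "a/1"
            }
        }
        return combos;
    }
}
```

Note that the count multiplies quickly: three values in each of two sets would already produce nine test executions.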


Doing even more with test data

Now let's really juice this up! You may have noticed the Test Data node above the test results. The node is taken directly from the group in the Test Data Table. We can change this group and add more groups to create tests that are even more sophisticated.

  1. Go back to the TDT (click in the source code).
  2. Right-click the title bar block that contains the heading "Test Data."
  3. Select Rename Data Set and change the name to "Pattern 1." You'll notice that the title in the bar changes.
  4. Right-click again and select Add Data Set.
  5. Scroll right to the "New Test Data" title and rename it "Pattern 2."
  6. In the "Pattern 2" columns, change the "In" and "Expected" values to "c" for getDataValue and "3" for String.valueOf(y).
     (Screenshot: The Test Data Table with multiple test groups)
  7. Go ahead and run the test again.

When execution is finished, you'll see in the Test Navigator pane that the results are broken down by data set (see Figure 4). This can be incredibly useful if your names are a little more meaningful than Pattern 1 and Pattern 2.

Figure 4. Results broken down by data set in the Test Navigator pane
The Test Navigator pane, with test results broken down into Pattern 1 and Pattern 2 data sets

Exporting your test results

Test execution results can be exported as an HTML report. This allows you to make results available to the entire project team, parse results, e-mail them, or do any number of other useful things with them.

  1. Select File > Export > Component test execution results, and then click Next. This will display the Export Results wizard.
     (Screenshot: The Export Results wizard)
  2. Select the test runs you want to include in the report.
  3. Select a target folder and options for the report.
  4. Click Finish.

The report is generated as HTML in the target folder. If you select the "Only one HTML file" option, the resulting report will be one HTML file with several supporting graphic icon files in the target folder. If you don't select that option, the default format builds a report file that summarizes the selected runs and creates individual HTML files for each selected execution linked from the summary page. The details of the report replicate the Test Data Comparator pane in an HTML format.


Importing existing JUnit test cases

As mentioned before, the component test facility is an extension of the JUnit framework, so you can run any existing JUnit test cases within it, provided you import them into a component test project and the JUnit source code is within your workspace (a restriction that's easy enough to work around). Note that while developing this article, as we switched between versions of IRAD we experienced some class path issues due to differences between the products. If you're experiencing problems with your import, double-check the libraries included in your project.

Let's create a package named test in our original Java project, build a JUnit class there named ImportedTestCase, and replace the existing code with the following source code:

package test;

import junit.framework.*;

public class ImportedTestCase extends TestCase {

    public void testSomething() {
        String value1 = "a";
        String value2 = "a";
        String value3 = "b";
        String value4 = "b";
        assertEquals(value1, value2);
        assertEquals(value3, value4);
    }
}

Then we'll import the test into the component test project. The Test Navigator doesn't show Java source files, so you'll need to change your view or perspective appropriately.

  1. Select File > Import.
  2. Select "JUnit test into component test project" as the import source.
  3. Click Next in the Select a Test Project window.
  4. Select the test case from the Java project and click Finish.
     (Screenshot: Selecting the test case from the Java project in the Import JUnit Test dialog)

If you were looking at the JUnit Java source code before the import, you may still be looking at the same file after the import. During our tests the perspective didn't change and no files were opened or closed. This is a little misleading since the source code has been converted and now exists under our component test project.

Be sure to close the Java source file, then right-click on the new test case and select Open Behavior to access the new source file. After the import, you'll need to validate the Test Data Table contents. In our test, the source code was appropriately converted, but some of the data values weren't fully populated. The primary difference is that the assertEquals methods have been replaced by the ComponentTest.check method, as you can see in Figure 5.

Figure 5. The JUnit test case after import
Our JUnit test case as it shows up in the Test window after import

You're now free to update any data values or change the nature of the test case as we did previously.


A word of caution

While we were messing around with IRAD, Allen noticed that test data is all managed in a binary file paired with your Java test source file. We see two potential issues here:

  1. There's no easy way to import or export this data. While the code is still within the JUnit framework, the data table and code are disassociated and are managed entirely by IRAD.
  2. If you're working in a team environment, you'll need to obtain an exclusive lock on the data file (.nextsuitebehavior) since you won't be able to merge the file in your version control system.

While we both agree that these issues wouldn't prevent us from using the tool, we thought they were worth pointing out.


Testers and developers using IRAD together

We were surprised by how much of a learning experience writing this article was for both of us. From a testing perspective, Mike didn't know that developers actually have the ability to create complex component-test scenarios, select sets and ranges of data for input, or have ready access to the static metrics discussed above. From a development perspective, Allen was surprised when we started talking about data selection techniques and how testers use concepts like boundary-value analysis, domain partitioning, and equivalency class partitioning to select data sets and ranges for optimal test case coverage.

In that light, we'd recommend that testers and developers work together when developing component tests. When we say "work together," we don't necessarily mean work side by side when you develop the tests (although there wouldn't be anything wrong with doing that). Instead, we're envisioning a scenario where developers develop component tests on their own and then work with testers to define the test data to be used in the test. Data can be entered directly by the tester, thus extending the component test, or the tester can send comments on data selection to the developer and the developer can enter the data.
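
As a concrete example of the data selection techniques mentioned above, here's a hypothetical component (our own invention, not from the article's project) and the boundary-value candidates a tester might feed into its TDT:

```java
// Hypothetical component (our own invention) used to illustrate
// boundary-value analysis: ages 18 through 65 form the valid partition.
public class BoundaryValueSketch {

    public static boolean isValidAge(int age) {
        return age >= 18 && age <= 65;
    }

    // Boundary-value analysis picks values on and adjacent to each edge
    // of the partition; these six would go into the TDT as a data set.
    public static int[] boundaryValues() {
        return new int[] {17, 18, 19, 64, 65, 66};
    }
}
```

A developer might only think to try a typical value such as 30; a tester's boundary set is far more likely to catch an off-by-one error in the range check.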

Working this way can result in the following advantages:

  1. Increased appreciation and understanding of skill sets
  2. Cross-training and knowledge dissemination
  3. Better and more meaningful component tests
  4. Possible reuse of component test data for functional testing downstream in the project lifecycle
  5. Better coordination of all testing efforts based on coverage statistics and other static metrics
  6. Better software
