Introducing a framework for generic, productive, reliable, and maintenance-free unit tests

Unit testing is an integral part of extreme programming, and many open source tools and frameworks help developers write unit tests. IDE plug-ins can create skeleton unit test cases, and Ant tasks and Maven goals automatically run test cases and generate reports during continuous integration.

However, unit test cases are not generic. Any functional method handles multiple data scenarios. We write one test method for every data scenario because we create the test data, fire the test, and validate the output in the same test method. If a new requirement adds data scenarios for the method, we end up writing more test methods. Thus, test-case maintenance requires effort. The complexity of tests increases further when testing server-side components against data that changes during transactions. In addition, we must ensure that the tests are always correct. All of these issues take considerable time to address and increase the complexity of the test cases. Overall, we yearn for something that removes the complications involved in writing test cases and provides a generic set of test cases that are free from maintenance.

This article first outlines a comprehensive list of issues faced during unit testing and then details the creation of a testing framework that facilitates the writing of generic and configurable unit test cases by integrating with multiple open source testing tools and frameworks. In this article, we refer to the JUnit, JUnitPerf, Cactus, JUnitEE, and DbUnit frameworks and tools such as Ant, CruiseControl, and XStream. Please note that this article is not a tutorial for these frameworks and tools.

Issues in unit testing

Here is a summary of the issues that must be addressed while unit testing:

The effort required to create/maintain unit tests: We create test data within the test method's body.
Thus, to achieve robust unit testing, we must create different test methods for every possible combination of data for that functional method. Let's call these combinations of data "data scenarios." We do not generalize data creation and output assertion. Hence, we end up writing test methods for every data scenario that needs testing, which causes maintenance issues. If we change the method to handle more data scenarios, we end up writing more test methods. Also, changing the data for a scenario isn't always straightforward, since the test data is embedded in the code: we must change the test-case code and rebuild the entire test suite.

Maintaining data consistency: We also need to maintain data consistency for methods that complete database transactions. For example, if a method creates a customer in the database and we try to create the same customer again, the unit test case will fail.

Providing the same approach to writing unit tests while leveraging high-quality tools and frameworks: Some JUnit tools and frameworks available in the open source community enhance unit testing. One such tool is JUnitPerf, which allows basic performance testing of the code. However, it has its own approach to creating test cases and running them: it needs information about the load and response time to create tests. This information is hard-coded in the test methods, and the developer ends up writing more unit test cases. Abstracting that information out of the test case, so the same unit test case can be run as a load test or response-time test without the developer knowing the intricacies of JUnitPerf, would prove beneficial.

Complexity of test cases when testing server-side components: When testing server-side components, we need Java Enterprise Edition (JEE) features such as JNDI (Java Naming and Directory Interface) lookup and an EJB (Enterprise JavaBeans) container, which further add to the complexity of test cases.
Tools are available for server-side testing, the best examples being Cactus and JUnitEE. If we want to leverage the advantages of both, we must glue them together and hide their configuration and intricacies from the developer. We must provide a simple way for the developer to do server-side testing.

Encouraging maximum test effectiveness (all the data scenarios are tested) and correct tracking of the build phase: Unit testing is a critical element of continuous integration. During continuous integration, we can use unit testing to measure the build progress. That is, if the unit test reports show a 50-percent success rate, we should be able to get an idea of the overall build progress. To allow such evaluation of build progress, the unit tests must be functionally correct and reliable, and all the data scenarios must be tested.

Reducing repetitive code: Most unit test code follows the same pattern. At a higher level, the code looks repetitive. Abstracting such repetitive code from the test cases minimizes the effort required to write unit test cases.

These issues can be resolved by writing a unit test framework that reduces testing effort by making test cases generic and configurable, and that provides seamless integration with excellent open source unit test frameworks. The building blocks of the framework are:

Test types
Data abstraction, initialization, and assertion techniques
Infrastructure support (logging, JNDI lookup, property reader, XML binding, etc.)
DbUnit (for database consistency), JUnitEE and Cactus (for server-side testing), and JUnitPerf
Built-in tools (for creating configurations and test skeletons)

We will discuss each building block in subsequent sections. We call our framework Zing. For this article's sample application, we use the two classes that appear in the code below. The sample code and Zing code are available for download from Resources.
package sample.currency;

public class Money {
    private int amount;
    private String currency;

    public Money(int amount, String currency) {
        this.amount = amount;
        this.currency = currency;
    }

    // Getters and setters
}

package sample.currency;

public class MoneyUtils {
    public Money addMoney(Money m1, Money m2) {
        if (m1 == null || m2 == null) {
            throw new IllegalArgumentException("Arguments can't be null");
        } else if (!(m1.getCurrency().equals(m2.getCurrency()))) {
            throw new IllegalArgumentException("Arguments should be in same currency");
        }
        return new Money(m1.getAmount() + m2.getAmount(), m1.getCurrency());
    }
}

Test types

To integrate multiple unit test frameworks and provide a simple interface for the developer to write test cases, we need to classify test cases into multiple types. These test types form the basis of the Zing framework, allowing integration among multiple unit test frameworks. These types are abstract classes in the Zing framework. In this article, we discuss the following types:

Simple test: A simple test case that extends from JUnit's TestCase
Servlet test: A test case that extends from Cactus's ServletTestCase
PerfLoadTest: A performance test case that extends from JUnitPerf's LoadTest
PerfTimedTest: A performance test case that extends from JUnitPerf's TimedTest

All the above test cases must provide some common functionality so they can be integrated. We achieve this by having them implement a common interface, zing.tests.ITest. We extend our test case from either SimpleTest or ServletTest and write the test code. Depending on the test-case configuration, PerfLoadTest and PerfTimedTest instances are created when the test runs. Using these test types, we abstract the complexity of using different test frameworks into one single place, so we can concentrate on writing test code instead of worrying about different implementations for different test frameworks.
This approach also enables us to plug in more unit-testing frameworks. For example, the test case for MoneyUtils looks like:

public class MoneyUtilsTest extends zing.tests.SimpleTest {
    ...
}

Data abstraction, initialization, and assertion

Next, we concentrate on abstracting all the data out of the test code. Unit testing deals with three categories of data:

Test data: input and output data
Configuration data
Database data

Test data

We abstract the test data (input and output objects) out of the test-case code to make it more generic. We can extract this data into XML. For every data scenario, we create XML files for input and output objects (including exceptions). For testing MoneyUtils's addMoney() method, we need input objects m1 and m2, and one output object of type Money to assert the method's success. Instead of hard-coding the object values in the test-case code, we create an XML representation of the required instances of Money. So we create three XML files, one for each instance (the generation of these files can be automated using XDoclet annotations). The XML file for m1 looks like:

<sample.currency.Money>
  <amount>20</amount>
  <currency>dollar</currency>
</sample.currency.Money>

The file for m2 looks like:

<sample.currency.Money>
  <amount>30</amount>
  <currency>dollar</currency>
</sample.currency.Money>

The file for the output looks like:

<sample.currency.Money>
  <amount>50</amount>
  <currency>dollar</currency>
</sample.currency.Money>

We just need to create the XML input and output files instead of writing new methods for testing new scenarios of the same method. The best way to abstract data into an XML file is by using an XML serializer, such as the open source XStream. XStream serializes Java objects to XML and deserializes XML back to objects.
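To make the serialization step concrete, here is a minimal, hand-rolled sketch of the field-by-field XML shape shown above. It uses plain reflection only so the example has no third-party dependency; real Zing code would simply call XStream's toXML() and fromXML(), and the class and method names here are illustrative, not part of Zing.

```java
import java.lang.reflect.Field;

public class MoneyXmlSketch {

    // Stand-in for sample.currency.Money so the sketch is self-contained.
    public static class Money {
        private int amount;
        private String currency;

        public Money(int amount, String currency) {
            this.amount = amount;
            this.currency = currency;
        }
    }

    // Writes every declared field as <name>value</name> under the given
    // root element, mimicking XStream's default field-by-field output
    // (XStream derives the root from the fully qualified class name).
    public static String toXml(Object obj, String rootName) throws IllegalAccessException {
        StringBuilder xml = new StringBuilder("<" + rootName + ">\n");
        for (Field f : obj.getClass().getDeclaredFields()) {
            if (f.isSynthetic()) {
                continue; // skip compiler-generated fields
            }
            f.setAccessible(true);
            xml.append("  <").append(f.getName()).append(">")
               .append(f.get(obj))
               .append("</").append(f.getName()).append(">\n");
        }
        return xml.append("</").append(rootName).append(">").toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toXml(new Money(20, "dollar"), "sample.currency.Money"));
    }
}
```

Running this prints an XML fragment of the same shape as the m1 file above, which is all the test needs in order to rebuild the object at run time.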
This way, we can store input and output values for multiple scenarios that run against the same test method. If we want to write generic code that reads these XML files and converts them to Money instances, we must follow certain naming conventions for the file names. We use the following convention: <testMethodName>_<scenarioId>_<objectName>.xml. Hence, for m1, the name is testAddMoney_DEFAULT_SCENARIO_m1.xml. Adhering to a common naming convention makes it possible to write a utility class that reads the data from the XML files for a given scenario of a given test method and feeds that data to the test method. In later sections, we explain how to create these input and output XML files automatically with a utility class.

We still require a mechanism to configure the same test method to run multiple times for different data scenarios. We achieve that by abstracting configuration data out of the Java code.

Configuration data

Our aim is to write only one test method for one functional method, but test it for all possible data scenarios. To achieve this, we also need a way to configure the same test multiple times. In addition, we may need optional configuration to apply test decorators such as a load test or response-time test. Such configurable tests can be achieved by following these steps:

Create a configuration schema
Use XML binding (Castor, XMLBeans) to read the configuration
Create a utility class to create tests from the configuration
Create a suite class that can assemble all the tests, and initialize and run them

Please note that the test framework handles everything except for the schema. Developers don't have to worry about it.
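As a small aside, the file-naming convention described earlier can be captured in a helper like the following sketch; the class and method names are hypothetical, not part of Zing's API.

```java
public class DataFileNames {

    // Builds the bare data-file name, e.g. "testAddMoney_DEFAULT_SCENARIO_m1.xml".
    public static String dataFileName(String testMethod, String scenarioId, String objectName) {
        return testMethod + "_" + scenarioId + "_" + objectName + ".xml";
    }

    // Builds a classpath-relative location of the kind used in the sample
    // configuration, e.g. "input/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m1.xml".
    public static String inputPath(String className, String method, String scenarioId, String objectName) {
        return "input/" + className.replace('.', '/') + "/"
                + method + "_" + scenarioId + "_" + objectName + ".xml";
    }

    public static void main(String[] args) {
        System.out.println(dataFileName("testAddMoney", "DEFAULT_SCENARIO", "m1"));
        System.out.println(inputPath("sample.currency.MoneyUtils", "addMoney", "DEFAULT_SCENARIO", "m1"));
    }
}
```

Because every data file's location is derivable from the class, method, scenario, and object names alone, the framework can locate test data on the classpath with no per-test wiring.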
A sample configuration schema looks like:

<test-suite name="zing-suite">
  <test-case name="sample.currency.MoneyUtilsTest">
    <test-method name="testAddMoney">
      <test-data connection="SAMPLE_CONNECTION" dataset="dataset/money.xml"/>
      <test-scenario id="DEFAULT_SCENARIO">
        <input-data name="m1" value="input/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m1.xml"/>
        <input-data name="m2" value="input/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m2.xml"/>
        <output-data name="m3" value="output/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m3.xml"/>
      </test-scenario>
      <test-scenario id="INVALID_MONEY">
        <input-data name="m1" value="input/sample/currency/MoneyUtils/addMoney_INVALID_MONEY_m1.xml"/>
        <input-data name="m2" value="input/sample/currency/MoneyUtils/addMoney_INVALID_MONEY_m2.xml"/>
        <exception-data name="Exception" type="java.lang.IllegalArgumentException"/>
      </test-scenario>
      <test-type>
        <testtypeid>TIMED</testtypeid>
        <perfparams>
          <maxelapsedtime>40</maxelapsedtime>
        </perfparams>
      </test-type>
    </test-method>
  </test-case>
</test-suite>

In the configuration XML, we specify the test-case name and test-method name. Against each test method, we specify multiple scenarios. A test is added to the suite for every scenario of every method. Additionally, if we specify the test type as TIMED or LOAD, then, for every scenario of each method, the created test case will be a TIMED (or LOAD) test. Here, the test checks whether the method completes in 40 milliseconds while also ensuring the test's functional correctness. For every suite of unit tests, we must create one configuration XML file. This file is named TestConfig.xml and should be on the classpath. For the above configuration, two timed tests will be created.

The Zing framework's zing.config.TestCreator class reads the configuration and, at runtime, creates an array of tests for every scenario of every method of each test type. Each test is uniquely identified by <methodName>:<scenarioId>:<testType>, for example, testAddMoney:DEFAULT_SCENARIO:TIMED.
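The identifier scheme and the one-test-per-scenario-per-type expansion can be sketched as follows; this is a hypothetical helper, not Zing's actual TestCreator.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class TestIds {

    // Builds the unique identifier <methodName>:<scenarioId>:<testType>.
    public static String testId(String method, String scenario, String type) {
        return method + ":" + scenario + ":" + type;
    }

    // One test is created for every scenario of a method, for each
    // configured test type.
    public static List<String> idsFor(String method, List<String> scenarios, List<String> types) {
        List<String> ids = new ArrayList<>();
        for (String scenario : scenarios) {
            for (String type : types) {
                ids.add(testId(method, scenario, type));
            }
        }
        return ids;
    }

    public static void main(String[] args) {
        // Mirrors the sample configuration: two scenarios, one TIMED type,
        // hence two timed tests.
        System.out.println(idsFor("testAddMoney",
                Arrays.asList("DEFAULT_SCENARIO", "INVALID_MONEY"),
                Arrays.asList("TIMED")));
    }
}
```

This cross product is why the sample configuration yields exactly two timed tests.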
The configuration is stored as attributes of each test type in the framework and exposed as an API by ITest. The test creator also reads the input and output XML files and creates input and output maps. Note the location of the input and output files: a convention is followed to store the XML files, allowing the test creator to find them generically on the classpath.

The Zing framework's zing.tests.Suite class assembles all the tests into a test suite and also runs the test cases. Refer to zing.config.TestCreator, zing.config.PerformanceDecorator, and zing.tests.Suite for more details.

Database data

Unit test cases should preferably work with the same set of data to achieve automation (especially during continuous integration) and consistency in results. However, achieving consistent results becomes complex if the method we want to test performs database transactions. DbUnit resolves such issues. If we use DbUnit directly, we must write code to locate the dataset and run it against the database as part of test initialization, which creates a dependency of the unit test code on the data.

To resolve this dependency, we configure the dataset names in TestConfig.xml for every test method. We use the DbUnit syntax for dataset files. Zing's test types override the setUp() method, which checks whether any datasets must be run. If so, the test type creates a connection using the database settings and runs the dataset against that database. This makes it possible to pull the DbUnit-related configuration up into the Zing framework. Developers just have to create the datasets, and the framework handles the rest.
setUp() looks like:

protected void setUp() throws Exception {
    TestHelper.initializeTestCase(this);
    if (null != testCaseConfig.getDataSets()) {
        DatabaseHelper.refreshDatabase(testCaseConfig.getDataSets());
    }
}

We also follow a naming convention for the dataset location and name, allowing us to look up the datasets on the classpath and run them generically.

Assertions

To write generic test cases, we must extract the expected results and provide automatic assertions. We tackle that task by abstracting the output object's data into an XML file (similar to the input objects). During creation of the test suite, the Suite class, along with TestCreator, ensures that a map containing the expected objects is set in the test case. After invoking the functional method, the test method uses an AssertHelper class to assert the actual output against the output in the output map. We need AssertHelper for two purposes:

Asserting collections (both ordered and unordered)
Asserting exceptions (sometimes an exception means a successful test)

Refer to the next section for doing assertions, and to the zing.config.AssertHelper class.

A generic test case

Using the above techniques, we achieve complete data abstraction from the unit tests.
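The two assertion duties just listed might look roughly like the following sketch; the method names are hypothetical, not Zing's actual AssertHelper API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;

public class AssertHelperSketch {

    // Compares two collections while ignoring element order, for
    // functional methods that return unordered collections.
    public static boolean equalsUnordered(Collection<?> expected, Collection<?> actual) {
        if (expected.size() != actual.size()) {
            return false;
        }
        List<Object> remaining = new ArrayList<>(actual);
        for (Object e : expected) {
            if (!remaining.remove(e)) { // each expected element must match exactly once
                return false;
            }
        }
        return true;
    }

    // Treats a thrown exception as a successful outcome when the expected
    // "output" is an exception of the configured type.
    public static boolean matchesExpectedException(Class<? extends Throwable> expectedType, Throwable actual) {
        return actual != null && expectedType.isInstance(actual);
    }

    public static void main(String[] args) {
        System.out.println(equalsUnordered(Arrays.asList(1, 2, 2), Arrays.asList(2, 1, 2)));
        System.out.println(matchesExpectedException(IllegalArgumentException.class,
                new IllegalArgumentException("Arguments should be in same currency")));
    }
}
```

The exception check is what lets the INVALID_MONEY scenario above count a thrown IllegalArgumentException as a passing test.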
An example of a test case is:

package sample.currency;

import junit.framework.*;
import zing.tests.*;

public class MoneyUtilsTest extends SimpleTest {

    public static Test suite() {
        return ITest.TestHelper.createSuite(MoneyUtilsTest.class.getName());
    }

    public void testAddMoney() throws Exception {
        try {
            MoneyUtils moneyUtils = new MoneyUtils();
            Money money = moneyUtils.addMoney((Money) getInputData("1"), (Money) getInputData("2"));
            assertOutput("1", money);
        } catch (Exception e) {
            assertOutput("exception", e);
        }
    }
}

We could modify the framework so we don't even have to write test cases for simple scenarios, but that subject is beyond this article's scope.

Infrastructure services

Infrastructure services include services such as logging, JNDI lookup, caching, property reading, and XML binding. Instead of writing the same code repeatedly for different applications, abstract it into a set of reusable services that can be used with any application. For logging, use Jakarta Commons Logging, as it hardly needs any configuration and is a de facto standard nowadays. For JNDI lookup, write a generic ServiceLocator. For property reading, write an environment-independent property reader. These reusable services should be configurable and should be developed outside the Zing framework so actual applications can use them as well.

JUnitEE and Cactus for server-side testing

JUnitEE provides browser-based testing, and Cactus provides in-container testing. JUnitEE has no special requirements for use, but for Cactus we must inherit tests from its base classes. Hence, for Cactus integration, the framework provides classes such as zing.tests.ServletTest, FilterTest, and EJBTest that inherit from Cactus classes to enable server-side testing while exposing the same structure as SimpleTest. The integration works because these test cases also implement the zing.tests.ITest interface and hide the Cactus-related functionality underneath.
Developers just need to be aware that the test cases they write require server resources, and extend their test cases from ServletTest or FilterTest. Other items, such as configuration and input/output objects, remain the same. Running server-side tests differs slightly from running normal unit tests: the test cases and resources must be packaged into a WAR file and deployed in a Web container. Cactus and JUnitEE provide Ant tasks for creating the WAR files, and even automatic deployment is possible using Ant. Refer to the Cactus and JUnitEE sites for more information on creating the WAR and running the test cases.

The testing framework discussed in this article integrates seamlessly with Cactus and JUnitEE. Zing also provides independence from the environment in which these tests are run, so the developer just needs to concentrate on writing functional tests and not bother with specifics related to server-side or local testing.

Custom tools

We use XStream with Zing to serialize objects to XML. And though the XML representation of objects using XStream is readable and quite easy to write, generating the XML data with default values for the test cases is prudent. The utility class zing.utils.DataGenerator generates this XML data. It operates in two modes:

Batch mode: You specify the folder containing the test-case source files. For each test case, the DataGenerator automatically identifies the class and methods being tested. In addition, for each method, it identifies the method arguments and return type, and creates XML for both in the input and output folders, respectively. The generated XML contains default values for all the attributes, and the generated files follow the naming conventions used for input/output XML. Developers then just have to fill in the blanks.

Class mode: You specify the class name for which XML is to be generated and the path where it must be generated.
Again, default values for the attributes are used.

The utility class handles most of the Java language datatypes, collections, and also cyclic dependencies, which proves handy when the object graph is intricate.

We also use datasets with DbUnit. Again, these datasets are XML files containing data that must be "refreshed" in the database before the test runs. If the data is already available in the database, exporting it from the database into these datasets is a good option. The utility class DataExporter does this for you. How does it work? It needs information such as how to connect to the database, the list of SQL queries whose results are stored as datasets, and the path and filename where the dataset must be created. This information can be specified through an XML file whose path is passed as an argument to the utility class (otherwise, DataExporter automatically loads an XML file named DataExportConfig.xml from the classpath). The XML appears as follows:

<datasets>
  <connection>
    <driver>oracle.jdbc.driver.OracleDriver</driver>
    <url>jdbc:oracle:thin:@itl-hw-46601a:1521:FIXAPP2</url>
    <username>test</username>
    <password>test</password>
  </connection>
  <dataset>
    <path>D:/Customer.xml</path>
    <sql>
      <table class="legacyTable">ADDRESS</table>
      <query>SELECT * FROM ADDRESS WHERE CUSTOMERID = 20000</query>
    </sql>
    <sql>
      <table class="legacyTable">CUSTOMER</table>
      <query>SELECT * FROM CUSTOMER WHERE CUSTOMERID = 20000</query>
    </sql>
  </dataset>
</datasets>

Continuous integration

CruiseControl should be used for continuous integration, as it is a de facto standard in the Java community. During continuous integration, unit tests should be built and run, and reports should be generated. JUnit reports are the best way to analyze how functionally correct the code is. Coverage tools such as jcoverage integrate well with unit testing: jcoverage instruments the code after compilation and then, when the unit tests run, reports the code's coverage percentage.
JUnit and jcoverage reports are like progress reports for the build phase. They correctly point out the build progress and the quality of the unit tests. The Zing framework integrates seamlessly with jcoverage, Ant, and Maven. Continuous integration should be enabled in the project from day one of the build phase, as it helps in discovering build issues early on. Managers should pay special attention to these reports, which should be published at a central location so everyone receives an overview of the progress.

Conclusion

Zing has the following advantages:

Eases unit testing; developers don't need to know the details of multiple testing frameworks
Makes unit tests configurable
Helps developers write generic test cases, thus reducing maintenance

The Zing framework provides a comprehensive framework for simple Java or Java Enterprise System-based unit testing, offering everything developers need in one package. It tackles the issues faced during unit testing. Most of the quality open source tools available for testing are integrated with Zing, and integrating other testing frameworks is easy. Zing in no way intrudes upon the other test frameworks used and, hence, will not be affected by their future upgrades. In addition, developers can use various features of the integrated frameworks for advanced testing. Download the code that accompanies this article to get a sample implementation of the concept.

The concept of data scenarios should encourage developers to do extensive unit testing for different scenarios, making the test-run results more reliable. However, we feel this framework can be taken one step further, where developers won't need to write test cases for simple test scenarios at all: the framework should be able to generate test cases on the fly. This article provides a good starting point toward that goal.

Abhijeet Kesarkar works for Infosys Technologies in Pune, India.
Kesarkar has expertise in various Java Enterprise System open source technologies and is a staunch supporter of test-driven development. Tanmay Ambre works for Infosys Technologies in Pune, India. Ambre's areas of expertise include Java/Java Enterprise System technologies, with more than five years of experience with the WebLogic suite of products. He also specializes in the use of tools (especially open source) on Java/JEE platforms.