warpedjavaguy

Imperative by day and functional by night

Test Driven Rewrites (for Programmers)


Why write mock objects manually when you can emulate them virtually?
 

In my previous post on Test Driven Rewrites, I described at a high level how I used virtual mock objects to deliver a regression test suite for an existing component, and how I then used that suite to test and develop a rewrite of that component. You’ll recall that the existing codebase had no test suite at all and that I had very little knowledge of the business domain, so I had to come up with a quick and easy way to create a test suite from scratch. Here I present some code that shows how I did it using AspectJ, XML, and JUnit.

The first thing I had to do was identify the following points in the codebase:

  1. The test point
    • This is the point in the application where the call to the component is made. It is the call to the component method that will be put under test.
  2. All the mock points
    • These are all the points in the component where calls are made to other components and/or services. They are the calls to the methods that need to be mocked.

Next, a TestCaseGenerator aspect was written to capture the arguments and return values of the test and mock points identified above. The aspect defined two pointcuts for matching the test and mock points respectively. “Around” advice was used on each to capture the arguments and return value of every call and persist them to an XML file. A MethodCall POJO was created to encapsulate the name, arguments, and return value of an individual method call, and Castor XML was used to marshal it to XML. The aspect code is shown below:

public aspect TestCaseGenerator {

    pointcut testPoint() :
        execution (/* test-point method pattern */);

    pointcut mockPoints() :
        call (/* mock-points method pattern */)
        && withincode(/* test-point method pattern*/);

    // output XML document
    private Document xmldoc;

    Object around() : testPoint() {

        /* instantiate new XML document here */

        // encapsulate invoked method data into POJO
        MethodCall methodCall = new MethodCall();
        methodCall.setMethod(
            thisJoinPoint.getSignature().toString());
        methodCall.setArgs(thisJoinPoint.getArgs());
        Object retValue = proceed();
        methodCall.setRetValue(retValue);

        /* marshal POJO into test-point XML element here */

        /* persist XML to file here */

        // pass back return value
        return retValue;
    }

    Object around() : mockPoints() {

        // encapsulate invoked method data into POJO
        MethodCall methodCall = new MethodCall();
        methodCall.setMethod(
            thisJoinPoint.getSignature().toString());
        methodCall.setArgs(thisJoinPoint.getArgs());
        Object retValue = proceed();
        methodCall.setRetValue(retValue);

        /* marshal POJO into mock-point XML element here */

        // pass back return value
        return retValue;
    }
}
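
The MethodCall POJO itself is not shown above; a minimal sketch (the field names here are assumed) could look like this. Castor XML expects a public no-arg constructor and bean-style getters and setters for the properties it marshals:

```java
// Minimal sketch of the MethodCall POJO (assumed field names).
// Castor XML requires a public no-arg constructor and getter/setter
// pairs for every property it marshals to and from XML.
public class MethodCall {

    private String method;      // full method signature
    private Object[] args;      // captured input arguments
    private Object retValue;    // captured return value

    public MethodCall() { }     // required by Castor

    public String getMethod() { return method; }
    public void setMethod(String method) { this.method = method; }

    public Object[] getArgs() { return args; }
    public void setArgs(Object[] args) { this.args = args; }

    public Object getRetValue() { return retValue; }
    public void setRetValue(Object retValue) { this.retValue = retValue; }
}
```

A Castor mapping (not shown) would bind these properties to the test-point and mock-point XML elements.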

This aspect was woven into the existing codebase, which was then deployed locally. From then on, the aspect automatically generated an XML test case file on the file system for every scenario executed against the deployed application. The business identified all the scenarios that needed to be tested, and a test case XML file was generated for each one in real time as a user executed it on the system. The basic structure of the generated XML is shown below:

<?xml version="1.0" encoding="UTF-8"?>
<test-case>
    <test-point method="method pattern">
        <arguments>
            ...
        </arguments>
        <return-value>
            ...
        </return-value>
    </test-point>
    <mock-point method="method pattern">
        <arguments>
            ...
        </arguments>
    </mock-point>
    <mock-point method="method pattern">
        <arguments>
            ...
        </arguments>
        <return-value>
            ...
        </return-value>
    </mock-point>
    ...
</test-case>
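
For illustration only, here is how a captured file might look for a hypothetical account lookup scenario (the method signatures and values below are invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<test-case>
    <test-point method="String AccountService.getBalance(String)">
        <arguments>
            <argument>ACC-1001</argument>
        </arguments>
        <return-value>150.00</return-value>
    </test-point>
    <mock-point method="Account AccountDao.findAccount(String)">
        <arguments>
            <argument>ACC-1001</argument>
        </arguments>
        <return-value>
            <account id="ACC-1001" balance="150.00"/>
        </return-value>
    </mock-point>
</test-case>
```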

Once the data for all the known use case scenarios had been captured in XML files, it was then time to start writing the JUnit test suite that would execute the tests. This involved creating the following:

  1. A MockedTestCase class (extends JUnit TestCase)
  2. A MockedTestSuite class (extends JUnit TestSuite)
  3. A VirtualMocker aspect

The MockedTestCase class was implemented with a constructor that accepted a given test case XML file. All the mock point data in the file was loaded into individual MethodCall POJO instances and stored in a Map keyed by method signature. The map was bound to a thread-local variable so that it was isolated per thread and accessible to the VirtualMocker aspect for mocking purposes. The test point data was used to invoke the component under test and to assert the returned result. The Java code is shown below:

public class MockedTestCase extends TestCase {

    public static final
        ThreadLocal<Map<String,MethodCall>> MOCKS =
           new ThreadLocal<Map<String,MethodCall>>();

    private File xmlFile;
    private MethodCall testPoint;

    public MockedTestCase(File xmlFile) {
        super("testIt");
        this.xmlFile = xmlFile;
    }

    // overrides super to return test case XML file name
    public String getName() {
        return xmlFile.getName();
    }

    protected void setUp() throws Exception {

        testPoint = new MethodCall();
        /* Unmarshal XML test-point into POJO */

        Map<String,MethodCall> mocks =
            new HashMap<String,MethodCall>();
        /* Unmarshal XML mock-points and load into Map */

        // store loaded mocks in thread local var
        MOCKS.set(mocks);
    }

    public void testIt() throws Exception {

        // invoke the test
        // - pass in test-point args and capture the result
        Object args = testPoint.getArgs();
        Object result = /* invocation goes here */;

        // assert the result
        assertValues(
            "Unexpected result returned by "
                + testPoint.getMethod(),
            testPoint.getRetValue(),
            result);
    }

    public static void assertValues(
        String msg, Object expected, Object actual) {
        /* Compare the two values here and throw assertion
            error if not the same. One generic way of doing
            this might involve marshaling both objects to XML
            and then comparing the two with an XML diff utility. */
    }
}
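
The post leaves assertValues as an exercise. For simple values and argument arrays, a minimal sketch could lean on Objects.deepEquals (available since Java 7), which compares nested arrays element by element; deep object graphs would still be better served by the marshal-to-XML-and-diff approach suggested in the code comment above:

```java
import java.util.Arrays;
import java.util.Objects;

public class ValueAssert {

    // Sketch of a generic assertValues for simple values and arrays.
    // Objects.deepEquals handles nulls, plain equals(), and nested
    // arrays; it does NOT traverse arbitrary object graphs, which is
    // where an XML marshal-and-diff comparison would be needed.
    public static void assertValues(
        String msg, Object expected, Object actual) {
        if (!Objects.deepEquals(expected, actual)) {
            throw new AssertionError(msg
                + " expected:<" + toDisplay(expected)
                + "> but was:<" + toDisplay(actual) + ">");
        }
    }

    private static String toDisplay(Object value) {
        return (value instanceof Object[])
            ? Arrays.deepToString((Object[]) value)
            : String.valueOf(value);
    }
}
```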

The MockedTestSuite class was implemented to load and run every test case. This was done by loading all the XML test case files from the file system, instantiating individual MockedTestCase instances for each one, and adding them all to the suite of tests to be executed.
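
The file-loading part of that suite might be sketched as follows. The directory name is an assumption, and the JUnit wiring is reduced to a comment so the sketch stays self-contained; each returned File would feed suite.addTest(new MockedTestCase(xmlFile)):

```java
import java.io.File;
import java.io.FilenameFilter;
import java.util.ArrayList;
import java.util.List;

public class TestCaseFiles {

    // Collects every generated test-case XML file from the given
    // directory. In the suite class, each file becomes a
    // MockedTestCase added to the JUnit TestSuite.
    public static List<File> listTestCaseFiles(File dir) {
        List<File> files = new ArrayList<File>();
        File[] found = dir.listFiles(new FilenameFilter() {
            public boolean accept(File d, String name) {
                return name.endsWith(".xml");
            }
        });
        if (found != null) {
            for (File f : found) {
                files.add(f);
            }
        }
        return files;
    }
}
```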

The VirtualMocker aspect was defined to intercept all the mock points with “around” advice that asserted the incoming arguments against the mocked arguments for the call before returning the mocked return value. All mock point data was accessed through the Map in the thread-local variable, as loaded by the currently executing MockedTestCase. The aspect code is shown below:

public aspect VirtualMocker {

    pointcut mockPoints() :
        call (/* mock-points method pattern */)
        && withincode(/*test-point method pattern*/);

    Object around() : mockPoints() {

        // retrieve mock data
        Map<String,MethodCall> mocks =
            MockedTestCase.MOCKS.get();
        MethodCall mock = mocks.get(
            thisJoinPoint.getSignature().toString());

        // assert input and return output
        MockedTestCase.assertValues(
            "Unexpected input arguments passed to "
                + mock.getMethod(),
            mock.getArgs(),
            thisJoinPoint.getArgs());
        return mock.getRetValue();
    }
}

The VirtualMocker aspect was woven into the existing codebase and the MockedTestSuite was executed over it to perform all the tests. The same aspect was then woven into the new codebase and used to facilitate the test driven development of the rewrite.

Each XML test case file contained all the mock data that was required to test the same scenario that generated it. The VirtualMocker aspect used the data to do all the mocking in all the tests. No mock objects or test data fixtures were manually coded 🙂


Written by warpedjavaguy

December 20, 2007 at 9:54 pm

Posted in java, programming


2 Responses


  1. Well done!

    I had developed a similar approach some time ago, but using concrete mock objects instead of serialized ones. Instead of intercepting method calls, my solution intercepted the creation of concrete object instances.

    Could you please explain in a little more detail how you handle complex arguments and return types? As far as I can see from your code, you can only handle simple types, can’t you?

    Maciej Andreas Bednarz

    December 22, 2007 at 7:02 pm

  2. Hi Maciej,

    The object models I was handling were actually complex and deep and required me to define some very sophisticated XML mappings. They consisted of many lists, maps, arrays, and objects at various levels. Some of the objects had indirect relationships with other objects scattered throughout the model. I had to write specific handlers to ensure that these relationships were maintained during the marshaling and unmarshaling processes. Castor was very accommodating in all regards and made it relatively easy to do.

    WarpedJavaGuy

    December 22, 2007 at 8:42 pm


