warpedjavaguy

Imperative by day and functional by night

Test Driven Rewrites


Rewritten applications are best tested in the laziest way possible.
 

Rewriting existing applications or components can be a risky exercise. Consider an existing component that provides critical business functionality to an enterprise. The component has been tweaked and fine-tuned over the years and has evolved to provide the exact functionality required by the business. It is a core component that consists of very fine-grained and delicate operations, and it is used by multiple applications. Although it functions correctly, a rewrite has been requested after a review identified some major performance and maintainability issues. Its functions were taking too long to execute, and it was becoming too difficult to add, modify, or replace any functionality without incurring negative side effects. The component had become very volatile and fragile in the face of change.

I was part of a team that was given the task of rewriting such a component. My role was to come up with an automated regression test suite that would test both the old and the new software. The first problem I had was that there was no existing test suite at all that I could use as a base. I was not familiar with the software, and my general understanding of the business and its domain was very poor. I was a newbie who had just joined the project and been thrown into the deep end. I had to quickly find a way to start creating tests that would cover all possible scenarios. In my mind I was thinking “I wish I had a test generator”. That way I could get a business person to run through all the scenarios in the application and have the tests automatically generated on the file system (including mock data, assertions, and all). The same tests could then be used to test the rewrite. So I set upon creating a test generator. It wasn’t going to be easy, but I knew that once I had it going, the remaining exercise of running the same tests over the rewritten code base would be a breeze. It was a challenge that I could not refuse and failure was not an option.

I immediately started researching mocking frameworks and was looking for one that would involve minimal overhead from a coding perspective. I wanted one that would help generate all the mock objects and data that I would need for all my tests. The last thing I wanted to do was to have to manually start coding mock objects and data fixtures. I wanted all the data to be automatically captured and all the tests to be automatically generated. After about half an hour of researching online, I stumbled upon this beauty:

Virtual Mock Objects using AspectJ with JUNIT

Using aspect-oriented programming in AspectJ to facilitate isolation of testable units, without hand-crafting Mock Objects or using a specialized Mock Object generation tool.

With aspects you can intercept method invocations and access the input parameters and return values both before and after the call. This was exactly what I needed. The existing component we were replacing made several calls to a rules engine and other EJB services. I immediately realised that I could write an aspect to intercept those calls and capture the data going in and the data coming out. I could then persist this data to an XML file and have all the mock data I needed for all my tests. So I wrote a test generator aspect that captured all the data going in and out of the calls that I needed to mock. I separated the input and output data of each method call into separate XML elements in the generated file and associated them with the call. I wove the aspect into the code and asked a business person to use the application as they normally would and run some scenarios. For each scenario they ran, an XML test file was automatically generated in real time on the file system. The existing code was not changed at all; the generator aspect was simply woven into it.
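To give a feel for what such a generator aspect might look like, here is a minimal sketch in AspectJ’s annotation style. The pointcut, the com.example package names, and the XmlTestWriter helper are all made up for illustration; the real aspect targeted our rules engine and EJB service calls and wrote much richer XML.

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class TestGeneratorAspect {

    // Hypothetical helper that appends captured calls to the current scenario's XML file.
    private final XmlTestWriter writer = new XmlTestWriter("generated-tests");

    // Intercept every public call into the (made-up) rules engine and EJB service layer.
    @Around("execution(public * com.example.rules.RulesEngine+.*(..)) || "
          + "execution(public * com.example.ejb.*Service+.*(..))")
    public Object captureCall(ProceedingJoinPoint jp) throws Throwable {
        Object[] inputs = jp.getArgs();   // the data going in
        Object output = jp.proceed();     // let the real call run unchanged
        // Record the call signature with its inputs and output as separate XML elements.
        writer.recordCall(jp.getSignature().toLongString(), inputs, output);
        return output;
    }
}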

The test case generator was complete, and all test cases were identified by the business and generated in less than two weeks. We managed to generate some 600+ test cases in that time. It was now time to start writing the JUnit test runner and virtual mocker aspect using the pattern described in the virtual mock objects article (quoted above). The mocker aspect was written to intercept every method invocation for which mock data was provided in the generated XML test files. It asserted the input parameters passed to each mocked call and overrode the method to return the output for that call as captured in the XML test file. The aspect was woven into the existing code, and a JUnit test runner was written to invoke the component using the captured test input and to assert the returned result against the captured test output. The aspect handled all the mocked calls in between. Again, the existing code was not modified in any way; the virtual mocker aspect was simply woven into it. As expected, all the tests passed when executed over the existing code base. I now had a complete regression test suite that I could use to test both the existing component and the rewritten one that the rest of the team was busy developing. I had some time on my hands and helped them complete it.
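Again as a rough illustration only, a virtual mocker aspect along those lines might look something like the sketch below. The XmlTestData and RecordedCall classes are hypothetical stand-ins for whatever holds the captured call data; the key point is that the advice never calls proceed() and instead asserts the inputs and returns the recorded output.

import static org.junit.Assert.assertTrue;

import java.util.Arrays;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class VirtualMockerAspect {

    // Hypothetical holder for the calls recorded in the current XML test file.
    private static XmlTestData testData;

    public static void useTestData(XmlTestData data) {
        testData = data;
    }

    // Same pointcut as the generator aspect: every recorded call is now mocked.
    @Around("execution(public * com.example.rules.RulesEngine+.*(..)) || "
          + "execution(public * com.example.ejb.*Service+.*(..))")
    public Object mockCall(ProceedingJoinPoint jp) {
        // Look up the next recorded call for this method signature (hypothetical API).
        RecordedCall recorded = testData.nextCallFor(jp.getSignature().toLongString());

        // Assert that the code under test passed the same inputs as the captured run.
        assertTrue("Unexpected inputs for " + jp.getSignature(),
                Arrays.deepEquals(recorded.getInputs(), jp.getArgs()));

        // Never call proceed(): return the captured output instead of hitting the real service.
        return recorded.getOutput();
    }
}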

When the rewrite was ready for testing I wove the same virtual mocker aspect into the new code and ran the same tests using the same JUnit test runner. The tests all ran fast and efficiently, and about 60% of them passed on the first go! The other 40% that failed were due to various bugs that had been introduced into the new code, missing logic, and other miscellaneous anomalies. It took less than a week to fix the new code and achieve a 100% test pass rate, and that 40% of the rewrite was completely test driven. The entire exercise was a huge success and the tests were integrated into the automated build process. The entire test suite was recorded in XML form, and it could be used to verify both the old and the rewritten component. The rewritten component continued to evolve and the tests were kept in sync. Whenever the data model changed the tests were also updated; XSL stylesheets made it easy to transform and restructure the data contained in all the XML test files as required.
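For completeness, here is a sketch of the kind of JUnit 4 parameterized runner that could drive such a suite, with one test per generated XML file. ComponentUnderTest and the XmlTestData loader are again assumed names for illustration, not the actual classes from the project.

import static org.junit.Assert.assertEquals;

import java.io.File;
import java.util.ArrayList;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

// One generated XML scenario file = one test. The same class runs unchanged
// against the old component and the rewritten one.
@RunWith(Parameterized.class)
public class GeneratedScenarioTest {

    @Parameters
    public static Collection<Object[]> scenarios() {
        Collection<Object[]> files = new ArrayList<Object[]>();
        for (File f : new File("generated-tests").listFiles()) {
            files.add(new Object[] { f });
        }
        return files;
    }

    private final File scenarioFile;

    public GeneratedScenarioTest(File scenarioFile) {
        this.scenarioFile = scenarioFile;
    }

    @Test
    public void replayScenario() throws Exception {
        XmlTestData data = XmlTestData.load(scenarioFile);        // hypothetical loader
        VirtualMockerAspect.useTestData(data);                    // arm the mocker aspect
        ComponentUnderTest component = new ComponentUnderTest();  // assumed name
        Object actual = component.execute(data.getScenarioInput());
        assertEquals(data.getExpectedOutput(), actual);
    }
}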

When it came to running the tests in isolation though, empty stub implementations of all EJB interfaces had to be used. Those empty stubs were the only little bit of manual coding we had to do. But as far as mocking goes, nothing was manually coded. Using aspects in this way made it possible to test both existing and rewritten code “as is” without having to manually write any mock objects or test data fixtures 🙂
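Those stubs were trivial; something along these lines (with a made-up PricingService interface) is all that was needed, because the mocker aspect intercepted the calls before the stub bodies were ever reached.

// Hypothetical empty stub: it satisfies a (made-up) PricingService EJB interface
// so the tests can run outside the container. The virtual mocker aspect intercepts
// the calls, so these bodies are never actually exercised.
public class PricingServiceStub implements PricingService {

    public PriceResult calculatePrice(PriceRequest request) {
        return null; // never reached; the aspect returns the captured output instead
    }
}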
 


Written by warpedjavaguy

December 11, 2007 at 10:01 pm

Posted in java


6 Responses


  1. I’ve been reading about Virtual Mocks; it seems to be an old technique, replaced by modern mock frameworks (jMock, EasyMock). The way you use it, though, seems to be a great idea.

    What I would like to do is to TDD-TopDown-IntegrationTest-Design my application, and then use a similar approach to transform my integration tests into unit tests (or at least reuse some of their code).

    I don’t know, maybe this is just a crazy idea. It would be nice to see some code/tutorial of your approach for inspiration 🙂

    Daniel

    December 12, 2007 at 11:03 pm

  2. Hi Daniel,

    Yes the virtual mocks article is definitely a bit of an oldie but a real goodie.

    I believe that what you are trying to achieve is surely possible. If you already have the integration tests in place, then you should be able to use them to drive the generation of your unit tests in a similar manner to what I have done here. You’d want to make sure you don’t end up with duplicate tests though. As long as each integration test yields unique scenarios you should be right. But even if they don’t, I would imagine that it would not be too difficult to filter out the duplicates.

    I am planning a follow up post soon that will delve into the details of exactly how I did it. My fellow peers have requested that I do that too.
    So stay tuned !!!

    WarpedJavaGuy – December 20, 2007

    FYI – I have posted the follow up (for programmers) that delves into the details and presents some code too.

    WarpedJavaGuy

    December 12, 2007 at 11:38 pm

  3. One issue with this method though (and it occurred in this case) is that aspect-based code can be difficult to “see” and difficult for some people to grasp exactly where / how it’s working. In some ways, if the code can’t be easily understood by the majority of people on a team then it may lose its value.

    While I do think that most developers should be comfortable with aspects, this is sadly not always the case. So it’s vital for the code to be clear in its intention and for there to be some documentation on the process for updating the tests. Think of it this way: if a relatively competent pilot can’t figure out how to fly a new plane, it doesn’t matter how great the features of the plane are, it just won’t get flown.

    Maintainability_Guy

    December 14, 2007 at 4:02 pm

  4. Maintainability_Guy, hey I like the name 😉

    This was actually my first experience with aspects and I have to say I found them very easy to learn. It is also the only time I have ever used them. I found them to be very suitable in this case because they allowed me to weave my test generator and data mocking code into the existing code base in a declarative fashion. More importantly though, they allowed me to test the code “as is” in any environment without having to write any mock objects.

    I agree with you that most developers really should be comfortable with aspects. There really is very little terminology to learn. Aspect plugins can make working with aspects a lot easier in IDEs. They provide visual cues in the code that highlight all the interception points (pointcuts in AOP terminology). Instead of instantly writing off AOP at the mere mention of aspects, it would be good if all developers could spend just one day learning the basics. That’s all it takes really. And if they’re already familiar with interceptors and DI, then half a day.

    As a side note: the test generator and virtual mocker aspects mentioned in this post were not woven into any production code. They were only woven into the development and build environments for the purposes of testing.

    WarpedJavaGuy

    December 14, 2007 at 8:57 pm

  5. Hi, did you consider using Agitar for this project? If not, why did you discard this possibility?

    opensourcereader

    December 20, 2007 at 9:02 pm

  6. I don’t think that Agitar can generate tests for ‘real’ business scenarios. I think it is more suited to rigorous unit testing purposes, where less meaningful input data is used in multiple attempts to try and ‘break’ the code rather than test its logic. I was a bit agitated myself the last time I used it 😉

    The approach I used here worked well and was very successful for the project I was working on. There are most likely other ways of doing the same thing, but when I was researching them, virtual mock objects seemed to provide the simplest and most suitable solution. And in this particular instance they proved very easy.

    My follow up post (for programmers) shows exactly how I did it.

    WarpedJavaGuy

    December 20, 2007 at 9:52 pm


Comments are closed.
