
Did you know about Japex?

Posted by spericas on October 20, 2005 at 9:56 AM PDT

New blog, new life. Let's start things up by talking about a nifty tool we have developed internally and are making available as part of the FastInfoset project (and will most likely move into a new project of its own very soon). You can browse the CVS tree here.

Japex is a simple yet powerful tool for writing Java-based micro-benchmarks. It is similar in spirit to JUnit in that it factors out most of the repetitive logic that micro-benchmarks require. This logic includes loading and initializing multiple drivers, warming up the VM, timing the inner loop, etc.

Japex produces some beautiful HTML reports (and also some less visually attractive XML ones :). Here is a sample. I started this project over a year ago with the primary goal of testing Fast Infoset performance; over time, it evolved into a rather sophisticated tool which we are now using in multiple projects to write micro-benchmarks.

So, what does it take to write a micro-benchmark? Basically two things: an XML config file and one or more Japex drivers. Let's start with the drivers. A Japex driver is a class that extends JapexDriverBase which in turn implements:

public interface JapexDriver extends Runnable {
    public void initializeDriver();
    public void prepare(TestCase testCase);
    public void warmup(TestCase testCase);
    public void run(TestCase testCase);
    public void finish(TestCase testCase);
    public void terminateDriver();
}

Each method in the interface above defines a "phase" of the benchmark. The methods initializeDriver() and terminateDriver() are used to initialize and terminate driver state, and are called a single time per driver. The method prepare(TestCase) is called to prepare a test before running, and the methods warmup(TestCase), run(TestCase) and finish(TestCase) are called to, respectively, warm up the VM, run the test (and clock the performance!) and finish the test by producing a result.
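To make the phase ordering concrete, here is a minimal, self-contained sketch of the driver lifecycle. Note the caveats: XParserDriver is a hypothetical driver name, and the TestCase and JapexDriverBase classes below are simplified stand-ins written just for this sketch, not the real Japex classes.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for Japex's TestCase: a bag of named parameters.
class TestCase {
    private final Map<String, String> params = new HashMap<>();
    public void setParam(String name, String value) { params.put(name, value); }
    public String getParam(String name) { return params.get(name); }
}

// Simplified stand-in for JapexDriverBase: every phase has a no-op default,
// and warmup() defaults to exercising run() once.
abstract class JapexDriverBase implements Runnable {
    public void initializeDriver() { }
    public void prepare(TestCase testCase) { }
    public void warmup(TestCase testCase) { run(testCase); }
    public abstract void run(TestCase testCase);
    public void finish(TestCase testCase) { }
    public void terminateDriver() { }
    public void run() { }
}

// Hypothetical driver that "parses" the file named by the xmlfile param.
class XParserDriver extends JapexDriverBase {
    private String xmlFile;
    private int parseCount;

    @Override public void prepare(TestCase testCase) {
        xmlFile = testCase.getParam("xmlfile");  // read per-test input
        parseCount = 0;
    }
    @Override public void run(TestCase testCase) {
        // Real code would parse xmlFile here; we just count invocations.
        parseCount++;
    }
    public int getParseCount() { return parseCount; }
    public String getXmlFile() { return xmlFile; }
}

public class DriverLifecycle {
    public static void main(String[] args) {
        XParserDriver driver = new XParserDriver();
        TestCase tc = new TestCase();
        tc.setParam("xmlfile", "data/file1.xml");

        driver.initializeDriver();   // once per driver
        driver.prepare(tc);          // once per test case
        driver.warmup(tc);           // warm up the VM (not timed)
        driver.run(tc);              // the timed phase
        driver.finish(tc);           // produce a result
        driver.terminateDriver();    // once per driver

        System.out.println(driver.getXmlFile()
                + " parsed " + driver.getParseCount() + " times");
    }
}
```

In the real tool, Japex itself calls these phases in exactly this order; your driver only fills in the ones it needs.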

Ok, so what about those config files then? Here is a sample:

  <testSuite name="ParsingPerformance" xmlns="">
      <param name="libraryDir" value="lib"/>
      <param name="japex.classPath" value="${libraryDir}/../dist/classes"/>
      <param name="japex.warmupTime" value="10"/>
      <param name="japex.runTime" value="10"/>
      <driver name="XDriver">
          <param name="Description" value="Driver for X parser"/>
          <param name="japex.driverClass" value=""/>
      </driver>
      <driver name="YDriver">
          <param name="Description" value="Driver for Y parser"/>
          <param name="japex.driverClass" value=""/>
      </driver>
      <driver name="ZDriver">
          <param name="Description" value="Driver for Z parser"/>
          <param name="japex.driverClass" value=""/>
      </driver>
      <testCase name="file1.xml">
          <param name="xmlfile" value="data/file1.xml"/>
      </testCase>
      <testCase name="file2.xml">
          <param name="xmlfile" value="data/file2.xml"/>
      </testCase>
  </testSuite>

As you can see, a config file basically specifies some global params, a list of drivers and a list of test cases. Every driver will be run against every test case and a result will be computed.
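The execution model is just the cross product of drivers and test cases. The following is an illustrative sketch of that scheduling (the class and method names are mine, not Japex's internals):

```java
import java.util.ArrayList;
import java.util.List;

public class CrossProduct {
    // Returns one "driver -> testCase" entry per combination,
    // mirroring how Japex runs every driver against every test case.
    static List<String> schedule(List<String> drivers, List<String> testCases) {
        List<String> runs = new ArrayList<>();
        for (String driver : drivers) {       // initializeDriver() would fire here
            for (String test : testCases) {   // prepare/warmup/run/finish per test
                runs.add(driver + " -> " + test);
            }
        }                                     // terminateDriver() after each driver
        return runs;
    }

    public static void main(String[] args) {
        List<String> runs = schedule(
                List.of("XDriver", "YDriver", "ZDriver"),
                List.of("file1.xml", "file2.xml"));
        runs.forEach(System.out::println);    // 3 drivers x 2 test cases = 6 runs
    }
}
```

With the sample config above, that means six timed results, one per driver/test-case pair, all collected into a single report.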

I hope this short intro sparked your interest in this tool. For further information, you should check the still-under-development manual. You can also check some sample micro-benchmarks here.

Finally, and perhaps the most important thing to remember from this blog, Japex is a community project! So, we welcome your feedback and participation. We also have tons of new ideas on how to improve Japex even more. I'll share some of these new ideas as well as some of the more advanced features already implemented in future blogs. Stay tuned.
