Sunday, April 21, 2013

Fault injection in your JUnit with ByteMan

This article shows how to use ByteMan from JBoss to inject faults when testing Java code.

As usual, let us start with some trivial code to test: a simple class that appends strings to a given file.
package org.javabenchmark.byteman;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

/**
 * Logging helper class.
 *
 */
public class LogHelper {

    public static final String LOG_FILE_PROPERTY = "logHelperFile";
    public static final String DEFAULT_LOG_FILE = "sample.log";
    private static final File LOG_FILE = new File(System.getProperty(LOG_FILE_PROPERTY, DEFAULT_LOG_FILE));

    /**
     * logs the given trace to a dedicated file.
     *
     * @param trace a string to be written into the log file.
     */
    public static void log(String trace) {

        // try-with-resources ensures the stream is closed even when a write fails
        try (FileOutputStream fos = new FileOutputStream(LOG_FILE, true)) {
            fos.write(trace.getBytes());
            fos.write('\n');
            fos.flush();
        } catch (IOException ex) {
            final String msg = "Log Helper can't write trace to the log file: " + LOG_FILE.getAbsolutePath();
            System.err.println(msg);
        }
    }
}
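
Note that LOG_FILE is resolved once, when the class is initialized, so the logHelperFile system property must be set before the first use of LogHelper. A minimal usage sketch (the LogHelperDemo class below is hypothetical):

package org.javabenchmark.byteman;

public class LogHelperDemo {

    public static void main(String[] args) {
        // must run before LogHelper is first used, since LOG_FILE is a static final field
        System.setProperty(LogHelper.LOG_FILE_PROPERTY, "/tmp/demo.log");
        LogHelper.log("hello from the demo");
    }
}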


The next step is to write the corresponding test:
package org.javabenchmark.byteman;

import java.io.File;
import static org.fest.assertions.api.Assertions.*;
import org.junit.Test;

/**
 * Unit Test dedicated to LogHelper.
 */
public class LogHelperTest {

    private static final String SUCCESSFUL_TEST_LABEL = "Successful Test";

    private void deleteFile(File file) {
        // clean
        file.delete();
        assertThat(file).doesNotExist();
    }
    
    @Test
    public void shouldLog() {

        File logFile = new File(LogHelper.DEFAULT_LOG_FILE);

        // clean
        deleteFile(logFile);

        // one line test
        LogHelper.log(SUCCESSFUL_TEST_LABEL);

        // control
        assertThat(logFile).exists();
        assertThat(logFile).hasContent(SUCCESSFUL_TEST_LABEL);

        // several lines test
        LogHelper.log(SUCCESSFUL_TEST_LABEL);
        LogHelper.log(SUCCESSFUL_TEST_LABEL);

        // control
        assertThat(logFile).hasContent(SUCCESSFUL_TEST_LABEL + '\n' + SUCCESSFUL_TEST_LABEL + '\n' + SUCCESSFUL_TEST_LABEL);
        deleteFile(logFile);
    }
}

Running the test produces the following output:
Testsuite: org.javabenchmark.byteman.LogHelperTest
Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.136 sec

And if you enable code coverage in your favorite IDE, you should have something like this:

Red regions show the parts of the code that are not covered by the test. In our case, we can see that the code handling the IOException is never invoked, and there is no way to cover the whole method without raising an IOException. Fortunately, there is a powerful tool for this: ByteMan.

As explained by the authors:
Byteman is a tool which simplifies tracing and testing of Java programs. Byteman allows you to insert extra Java code into your application, either as it is loaded during JVM startup or even after it has already started running.

ByteMan is actually a Java agent that performs bytecode instrumentation, and we are going to use it to inject an IOException during the execution of our test, so that the code handling it gets exercised. The idea is to throw an IOException when the method FileOutputStream.write(byte[]) is called.

Firstly, the dependency for ByteMan:

<dependency>
    <groupId>org.jboss.byteman</groupId>
    <artifactId>byteman-bmunit</artifactId>
    <version>2.1.2</version>
</dependency>


Secondly, modify the previous test to run with ByteMan and add a new method to cover the code that handles the IOException:
package org.javabenchmark.byteman;

import java.io.File;
import static org.fest.assertions.api.Assertions.*;
import org.jboss.byteman.contrib.bmunit.BMRule;
import org.jboss.byteman.contrib.bmunit.BMUnitRunner;
import org.junit.Test;
import org.junit.runner.RunWith;

/**
 * Unit Test dedicated to LogHelper.
 */
@RunWith(BMUnitRunner.class)
public class LogHelperTest {

    private static final String SUCCESSFUL_TEST_LABEL = "Successful Test";

    private void deleteFile(File file) {
        // ...
    }
    
    @Test
    public void shouldLog() {
        // ...
    }

    @Test
    @BMRule(name = "IOException Rule",
            targetClass = "FileOutputStream",
            targetMethod = "write(byte[])",
            targetLocation = "AT ENTRY",
            condition = "TRUE",
            action = "throw new java.io.IOException()")
    public void shouldHandleIOException() {
        
        File logFile = new File(LogHelper.DEFAULT_LOG_FILE);

        // clean
        deleteFile(logFile);

        // one line test that should raise an IO exception
        LogHelper.log(SUCCESSFUL_TEST_LABEL);
        
        // control
        assertThat(logFile).exists();
        assertThat(logFile).hasContent("");
        
        // clean
        deleteFile(logFile);
    }
}


The BMRule annotation defines the execution context for ByteMan:
  • name: to set the name of the rule
  • targetClass and targetMethod: to set the method that must be instrumented
  • targetLocation: to set where to inject the code (at the beginning, at the end or somewhere in the body of the method)
  • condition: to set some condition to be met
  • action: the code to be injected
So, we have a rule named "IOException Rule" that is activated when the method write(byte[]) of the FileOutputStream class is entered. The rule then injects code that throws an IOException. When the shouldHandleIOException test runs, every time write(byte[]) is called, a java.io.IOException is raised.
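
The annotation maps one-to-one to Byteman's textual rule format. For reference, here is a sketch of the same rule as a script, which could be loaded with the @BMScript annotation instead of @BMRule:

RULE IOException Rule
CLASS FileOutputStream
METHOD write(byte[])
AT ENTRY
IF TRUE
DO throw new java.io.IOException()
ENDRULE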

Now, running all tests produces:
Testsuite: org.javabenchmark.byteman.LogHelperTest
byteman jar is /home/javabenchmark/.m2/repository/org/jboss/byteman/byteman/2.1.2/byteman-2.1.2.jar
Log Helper can't write trace to the log file: /home/javabenchmark/Documents/projects/javabenchmark/sample.log
Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.534 sec

And the code coverage looks like this:

The log(String) method is now fully covered (the implicit constructor of the LogHelper class is not tested, which explains why the class does not reach 100%).

Conclusion

ByteMan is very useful for testing code that deals with exceptions, especially fault-tolerant code. That way, you can check that your fault-tolerance strategy is really resilient, and that your code produces relevant traces when errors occur.

Thursday, April 11, 2013

Check that your code is thread-safe with JUnit and ContiPerf

This short article shows how to check that you write thread-safe code by transforming a JUnit test into a concurrent one with ContiPerf.

Suppose that you have to test a date formatter component that uses the famous SimpleDateFormat class, as illustrated below:
package org.javabenchmark;

import java.text.SimpleDateFormat;
import java.util.Date;

/**
 * Helper dedicated to format date in a standard way.
 */
public class NonThreadSafeDateFormatHelper {

    /**
     * the date format for standard representation.
     */
    private SimpleDateFormat standardDateFormat = new SimpleDateFormat("yyyy-MM-dd");

    /**
     * formats the given date using the standard date format: yyyy-MM-dd.
     *
     * @param date the date to format
     * @return a literal representation of the given date.
     */
    public String toStandardString(Date date) {
        return standardDateFormat.format(date);
    }
}

Then, you could write a JUnit test like this:
package org.javabenchmark;

import org.junit.Test;

import java.util.Calendar;

import static org.fest.assertions.api.Assertions.*;

public class NonThreadSafeDateFormatHelperTest {

    @Test
    public void shouldFormatRandomDate() {

        // random date
        Calendar calendar = Calendar.getInstance();
        calendar.set(Calendar.YEAR, (int) (1000 + Math.random() * 1000));
        // DAY_OF_YEAR is 1-based: a value of 0 would roll back into the previous year
        calendar.set(Calendar.DAY_OF_YEAR, 1 + (int) (Math.random() * 365));

        // test
        NonThreadSafeDateFormatHelper dateFormatHelperToTest = new NonThreadSafeDateFormatHelper();
        String randomDateString = dateFormatHelperToTest.toStandardString(calendar.getTime());

        // general controls
        assertThat(randomDateString).isNotNull();
        assertThat(randomDateString).hasSize(10);
        // year control
        String literalYear = String.valueOf(calendar.get(Calendar.YEAR));
        assertThat(literalYear).isEqualTo(randomDateString.substring(0, 4));
        // month control
        String literalMonth = String.valueOf(calendar.get(Calendar.MONTH) + 1);
        if (literalMonth.length() == 1) {
            literalMonth = "0" + literalMonth;
        }
        assertThat(literalMonth).isEqualTo(randomDateString.substring(5, 7));
        // day control
        String literalDay = String.valueOf(calendar.get(Calendar.DAY_OF_MONTH));
        if (literalDay.length() == 1) {
            literalDay = "0" + literalDay;
        }
        assertThat(literalDay).isEqualTo(randomDateString.substring(8));
    }
}

Next, running the test will produce the following output:
Testsuite: org.javabenchmark.NonThreadSafeDateFormatHelperTest
Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.173 sec

The test passes, and you could think that everything is fine. But what happens if the NonThreadSafeDateFormatHelper class is used concurrently, for instance from a JSF page that displays the current date?

To check that your code can handle concurrency, you can modify the previous JUnit test like this:
package org.javabenchmark;

import org.junit.Rule;
import org.junit.Test;

import java.text.ParseException;
import java.util.Calendar;
import org.databene.contiperf.PerfTest;
import org.databene.contiperf.junit.ContiPerfRule;

import static org.fest.assertions.api.Assertions.assertThat;

public class NonThreadSafeDateFormatHelperPerfTest {

    @Rule
    public ContiPerfRule contiPerfRule = new ContiPerfRule();
    /**
     * the date format helper to test.
     */
    private NonThreadSafeDateFormatHelper dateFormatHelperToTest = new NonThreadSafeDateFormatHelper();

    @Test
    @PerfTest(invocations = 1000, threads = 2)
    public void shouldFormatRandomDatesConcurrently() throws ParseException {

        // random date
        Calendar calendar = Calendar.getInstance();
        calendar.set(Calendar.YEAR, (int) (1000 + Math.random() * 1000));
        // DAY_OF_YEAR is 1-based: a value of 0 would roll back into the previous year
        calendar.set(Calendar.DAY_OF_YEAR, 1 + (int) (Math.random() * 365));
        
        // test
        String randomDateString = dateFormatHelperToTest.toStandardString(calendar.getTime());

        // general controls
        assertThat(randomDateString).isNotNull();
        assertThat(randomDateString).hasSize(10);
        // year control
        String literalYear = String.valueOf(calendar.get(Calendar.YEAR));
        assertThat(literalYear).isEqualTo(randomDateString.substring(0, 4));
        // month control
        String literalMonth = String.valueOf(calendar.get(Calendar.MONTH) + 1);
        if (literalMonth.length() == 1) {
            literalMonth = "0" + literalMonth;
        }
        assertThat(literalMonth).isEqualTo(randomDateString.substring(5, 7));
        // day control
        String literalDay = String.valueOf(calendar.get(Calendar.DAY_OF_MONTH));
        if (literalDay.length() == 1) {
            literalDay = "0" + literalDay;
        }
        assertThat(literalDay).isEqualTo(randomDateString.substring(8));
    }
}

This is the same test, except that the test method is invoked 1000 times in total, shared by 2 concurrent threads, producing this output:
Testsuite: org.javabenchmark.NonThreadSafeDateFormatHelperPerfTest
org.javabenchmark.NonThreadSafeDateFormatHelperPerfTest.shouldFormatRandomDatesConcurrently
samples: 999
max:     19
average: 0.04804804804804805
median:  0
Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0.377 sec
So, there is now an error, indicating that the DateFormatHelper component is not thread-safe: do not let it go into production :)
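
One possible fix, not shown in this article but a common pattern with SimpleDateFormat, is to give each thread its own formatter instance through a ThreadLocal (the class name below is hypothetical):

package org.javabenchmark;

import java.text.SimpleDateFormat;
import java.util.Date;

/**
 * Thread-safe variant: each thread gets its own SimpleDateFormat
 * instance, so no mutable state is shared between threads.
 */
public class ThreadSafeDateFormatHelper {

    private static final ThreadLocal<SimpleDateFormat> STANDARD_DATE_FORMAT =
            new ThreadLocal<SimpleDateFormat>() {
                @Override
                protected SimpleDateFormat initialValue() {
                    return new SimpleDateFormat("yyyy-MM-dd");
                }
            };

    public String toStandardString(Date date) {
        return STANDARD_DATE_FORMAT.get().format(date);
    }
}

Re-running the ContiPerf test against this class should report no error.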

Summary

Good unit tests are not sufficient when you are writing components that will live in multi-threaded environments, like web applications. You can easily check whether your code is vulnerable to race conditions with ContiPerf.

Sunday, April 7, 2013

JVM Biased Locking and micro benchmark

This is a short article that focuses on the HotSpot JVM Biased Locking feature. Before explaining what it is, let's take a simple example:
package org.javabenchmark;

public class SynchronizeCounter {
    
    private long count = 0L;
    
    public synchronized void increment() {
        count = count + 1;
    }
    
    public long getCount() {
        return count;
    }
}
This class manages a thread-safe counter. The code uses the synchronized keyword to ensure that only one thread at a time can invoke the increment() method.

Now, a trivial micro benchmark with only one thread incrementing the counter:
package org.javabenchmark;

import org.junit.Test;

public class OneThreadCounterTest {

    @Test
    public void incrementCounterLoop() {
        
        SynchronizeCounter counter = new SynchronizeCounter();
        long time = System.currentTimeMillis();
        for (int i = 0; i < 10000000; i++) {
            counter.increment();
        }
        time = System.currentTimeMillis() - time;
        System.out.printf("Counter: %d - Elapsed time: %d ms\n", counter.getCount(), time);
    }
}

The output is: Counter: 10000000 - Elapsed time: 67 ms

Next, the same test, but with a 4.5 second wait before it starts:
package org.javabenchmark;

import org.junit.Test;

public class OneThreadCounterTest {

    @Test
    public void incrementCounterLoop() throws InterruptedException {

        Thread.sleep(4500);
        
        SynchronizeCounter counter = new SynchronizeCounter();
        long time = System.currentTimeMillis();
        for (int i = 0; i < 10000000; i++) {
            counter.increment();
        }
        time = System.currentTimeMillis() - time;
        System.out.printf("Counter: %d - Elapsed time: %d ms\n", counter.getCount(), time);
    }
}

The output is: Counter: 10000000 - Elapsed time: 12 ms
Wow, that is a big improvement, considering that the only difference between the two tests is the Thread.sleep(..) line.
So, what happened?

In the second case, the code benefits from biased locking: the lock is un-contended, meaning that a locking mechanism is used but without any actual concurrency (a single thread). In that case, the JVM biases the lock toward the running thread, which no longer has to rely on expensive atomic instructions to acquire it.
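
For contrast, here is a hypothetical contended variant, not part of the original benchmark: with two threads competing for the same monitor, the JVM revokes the bias and falls back to regular locking, so the elapsed time should be much higher.

package org.javabenchmark;

import org.junit.Test;

public class TwoThreadCounterTest {

    @Test
    public void incrementCounterLoopContended() throws InterruptedException {

        final SynchronizeCounter counter = new SynchronizeCounter();
        Runnable task = new Runnable() {
            @Override
            public void run() {
                // each thread performs half of the 10,000,000 increments
                for (int i = 0; i < 5000000; i++) {
                    counter.increment();
                }
            }
        };
        long time = System.currentTimeMillis();
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        time = System.currentTimeMillis() - time;
        System.out.printf("Counter: %d - Elapsed time: %d ms\n", counter.getCount(), time);
    }
}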

Biased locking is enabled by default, but only after a 4-second startup delay, which is why the second test benefits from it. You can remove this delay with the following JVM option:
-XX:BiasedLockingStartupDelay=0
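
For instance, assuming your compiled classes and JUnit are on the classpath, the first benchmark can be launched with that option through JUnit's console runner:

java -XX:BiasedLockingStartupDelay=0 -cp <your-classpath> org.junit.runner.JUnitCore org.javabenchmark.OneThreadCounterTest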

Conclusion

When writing un-contended micro-benchmarks, be aware of the biased locking startup delay and use the previous JVM option to remove it.

Saturday, April 6, 2013

Web Services Performance Testing with JMeter

This article explains how to measure the performance of your JEE6 web services with JMeter.

Web Service

The web service is a trivial one that echoes its text input, as illustrated by the following code:
package org.javabenchmark;

import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;

@WebService(serviceName = "EchoWebService")
public class EchoWebService {

    @WebMethod
    public String echo(@WebParam(name="text") String text) {
        return "Echoing: " + text;
    }
}

Use your favorite JEE6 application server (GlassFish, JBoss, TomEE, WebLogic, WebSphere, ...) to deploy a simple application that exposes the previous web service (this step is beyond the scope of this article).

The endpoint of the service can be constructed by concatenating the application's context and the name of the service. For instance, if the application is deployed to http://localhost:8080/jee6, then the web service endpoint will be: http://localhost:8080/jee6/EchoWebService

JMeter Installation

First, download JMeter from the Apache JMeter website. Next, you need the jmeter-plugins extension, which provides the Stepping Thread Group and the graph listeners used below.

Unzip the JMeter archive, then copy the JMeterPlugins.jar file from the plugin archive into the lib/ext directory of your JMeter installation.

Test Plan

Now, you are ready to generate some SOAP requests. To achieve this, launch JMeter, right-click on the Test Plan icon and choose Add > Threads (Users) > jp@gc - Stepping Thread Group:

Then, set the number of threads to 30, as illustrated below:


So, the performance test will start 10 threads, wait 30 seconds, start 10 more threads, and so on until 30 threads are running. Then it will hold the load for 1 minute.

To invoke the web service, we need to set the content type and provide a SOAP envelope.

First, add an HTTP Header Manager to set the content type: right-click on the Stepping thread group and choose Add > Config Element > HTTP Header Manager.

Then, click the Add button at the bottom and fill the first row of the HTTP Header table with the following values:
  • Name column: content-type
  • Value column: text/xml

To execute the call to the web service, an HTTP Request sampler is needed: right-click on the Stepping thread group and choose Add > Sampler > HTTP Request.

Next, configure the HTTP Request sampler like this:

  • Implementation: HttpClient4
  • Method: POST
  • Path: http://localhost:8080/jee6/EchoWebService (replace with your endpoint address)
  • Post Body: a SOAP envelope like the one below (reconstructed here for the echo operation; the target namespace is the JAX-WS default derived from the package name, so check it against your WSDL):

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:jb="http://javabenchmark.org/">
  <soapenv:Header/>
  <soapenv:Body>
    <jb:echo>
      <text>TEXT ${__counter(FALSE)}</text>
    </jb:echo>
  </soapenv:Body>
</soapenv:Envelope>
Replace the previous XML with the one corresponding to the web service you are testing. You can use the WSDL and the XSD published by the service. With a JAX-WS stack, they are usually available at addresses like the following (adapt them to your application server):

http://localhost:8080/jee6/EchoWebService?wsdl
http://localhost:8080/jee6/EchoWebService?xsd=1
To check that everything goes right, add a View Results Tree listener: right-click on the Stepping thread group and choose Add > Listener > View Results Tree.

So, the next step is to verify that the test plan runs correctly and that the web service is responding (your web service must be up and running):

  • Click the start button:

  • And check in the view results tree that every request succeeds:

EDIT: Do not use the View Results Tree during the load test itself. Just tick the Errors check-box or disable the listener.

Everything is fine, so it is time to collect some metrics about the performance of the targeted web service. To achieve this, we need to add some listeners to our test plan:

  • Response time: right-click on the Stepping thread group and choose Add > Listener > jp@gc - Response Times Over Time.
  • Request per second: right-click on the Stepping thread group and choose Add > Listener > jp@gc - Transactions per Second.
  • Response time's Percentiles: right-click on the Stepping thread group and choose Add > Listener > jp@gc - Response Times Percentiles.

It is up to you to pick from the numerous listeners available!

Monitoring

The last step is to start the test plan again and to monitor the graphs provided by the 3 listeners previously added:

Response times
Requests per second

Response times percentiles