Compiler-aware access to properties files in Java

The standard way to internationalize a Java application is to use properties files. You create a simple text file that stores your key/value pairs. Access to these files normally goes through the JDK class ResourceBundle:

ResourceBundle myResources = ResourceBundle.getBundle("MyResources", currentLocale);
myResources.getString("OkKey");

So far, so good. But what happens as your project grows? How do you keep track of the link between the Java code, i.e. all the lines that access a particular key, and the properties files? You need a way to cope with this, because otherwise it becomes difficult to answer questions like: “Can I remove this key/value pair from my properties file, or is it still referenced by some Java code?” Or what if you want to rename a key? You have to find all occurrences of this string in the Java code, and hopefully you find all of them; otherwise ResourceBundle will throw a MissingResourceException at runtime.
A simple concept to overcome this problem is to route all access to your properties files through Java interfaces and let a proxy implementation of each interface fetch the required key from the properties file. This way your IDE can help you find all the lines in your Java code where a certain property is accessed:

MyMessageResource myMessageResource = JB5n.createInstance(MyMessageResource.class);
String ok = myMessageResource.ok();
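
The proxy trick itself can be sketched with the JDK's dynamic proxies. The following is only an illustration of the concept, not jb5n's actual implementation; the names MessageProxy and Messages are made up for this example:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.ResourceBundle;

// Illustration of the proxy concept only, not jb5n's actual implementation:
// every call to an interface method is routed to a ResourceBundle lookup,
// with the method name acting as the key.
public class MessageProxy {

	public interface Messages {
		String ok();
	}

	@SuppressWarnings("unchecked")
	public static <T> T create(Class<T> iface, ResourceBundle bundle) {
		InvocationHandler handler = (proxy, method, args) -> bundle.getString(method.getName());
		return (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class<?>[] { iface }, handler);
	}
}
```

With this in place the IDE can answer the question from above: “find usages” of Messages.ok() lists every place where the key “ok” is accessed.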

The Google Web Toolkit (GWT) introduced this mechanism as Messages. As I needed this kind of functionality quite often, I have written a library that implements this idea. In contrast to other implementations I wanted it to be backward compatible, so that you can migrate an existing application step by step. For this I added an annotation for the interface methods that lets you define the key used to access the properties file; normally the name of the method is used as the key.

@Message(key = "no.default.key")
String noDefaultKey();

Beyond that, the library should stay extensible. If your requirements change and your customer wants to be able to change the translations without recompilation, you can implement your own InvocationHandler that loads the messages e.g. from a database or some other kind of storage:

@MessageResource(invocationHandler = MyDatabaseMessageResource.class)
interface MyMessageResource {
	String ok();
}

Like GWT's Messages, my implementation can of course also handle arguments to the message, using Java’s MessageFormat:

public interface MyMessageResource {
	String youHaveNREtries(int numberOfRetries); // "You have {0} retries."
}
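
Argument substitution follows the rules of the JDK's MessageFormat, which replaces the numbered placeholders with the supplied arguments. For instance (MessageFormatDemo is just a name for this snippet):

```java
import java.text.MessageFormat;

public class MessageFormatDemo {

	// {0} is replaced with the first argument, formatted for the default locale.
	public static String youHaveNRetries(int numberOfRetries) {
		return MessageFormat.format("You have {0} retries.", numberOfRetries);
	}
}
```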

And last but not least, the jb5n library also allows you to use inheritance to group the messages/translations: you can distribute the methods over different interfaces and therewith the messages over different properties files:

public interface MyMessageResource {
	String ok();
}
public interface MySpecificMessageResource extends MyMessageResource {
	String specificMessage();
}

But the library is not finished yet. A Maven plugin as well as an Ant task are on my todo list; with them you could check during the build process that interface and properties file are in sync.
The source code can be found on GitHub: https://github.com/siom79/jb5n.

Implementing a custom JSF 2.0 component with maven

Some time ago I wrote my own custom JSF component. But at that point in time JSF 1.0 was still current and the project didn’t use Maven as its build system. Thus, I always wanted to write a custom JSF 2 component with Maven. So let’s start:

First of all we set up a Maven project with two modules. Here is the pom.xml file of the parent project:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>martins-developer-world</groupId>
	<artifactId>jsf-component</artifactId>
	<packaging>pom</packaging>
	<version>0.0.1-SNAPSHOT</version>
	<name>jsf-component Maven Webapp</name>
	<dependencies>
		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<version>4.11</version>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>javax.faces</groupId>
			<artifactId>jsf-api</artifactId>
			<version>2.1</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>com.sun.faces</groupId>
			<artifactId>jsf-impl</artifactId>
			<version>2.2.0</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>javax.servlet</groupId>
			<artifactId>servlet-api</artifactId>
			<version>2.5</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>javax.servlet</groupId>
			<artifactId>jsp-api</artifactId>
			<version>2.0</version>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>javax.servlet</groupId>
			<artifactId>jstl</artifactId>
			<version>1.2</version>
			<scope>provided</scope>
		</dependency>
	</dependencies>
	<build>
		<finalName>jsf-component</finalName>
	</build>
	<modules>
		<module>jsf-component-webapp</module>
		<module>jsf-component-impl</module>
	</modules>
</project>

As you can see, we have added the JSF dependencies to the top-level pom.xml, so that the child modules inherit them. As we will use the JBoss Application Server to test our web application, we set the scope of these dependencies to provided, so that neither our war file nor our component jar ships them.
The implementation of our component will reside in jsf-component-impl, so we choose jar as the packaging type for this module:

<?xml version="1.0"?>
<project
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
	xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>martins-developer-world</groupId>
		<artifactId>jsf-component</artifactId>
		<version>0.0.1-SNAPSHOT</version>
	</parent>
	<artifactId>jsf-component-impl</artifactId>
	<name>jsf-component-impl</name>
	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
	</properties>
	<dependencies>
	</dependencies>
</project>

Now let’s implement a Java class that extends UIOutput. I have chosen UIOutput because as a first step I just want to implement a simple helloWorld tag that prints the first and last name, given as attributes, within a span element. As this component doesn’t receive any input, UIOutput is appropriate:

package martins.developer.world.jsf.component.impl;

import java.io.IOException;

import javax.faces.application.ResourceDependencies;
import javax.faces.application.ResourceDependency;
import javax.faces.component.FacesComponent;
import javax.faces.component.UIOutput;
import javax.faces.context.FacesContext;
import javax.faces.context.ResponseWriter;

@ResourceDependencies({ @ResourceDependency(name = "css/jsf-component.css", target = "head") })
@FacesComponent("HelloWorld")
public class HelloWorldComponent extends UIOutput {
	private static final String COMPONENT_FAMILY = "martins.developer.world.jsf.component.helloWorld";

	private enum PropertyKeys {
		firstName, lastName
	};

	@Override
	public String getFamily() {
		return COMPONENT_FAMILY;
	}

	@Override
	public void encodeBegin(FacesContext context) throws IOException {
		ResponseWriter writer = context.getResponseWriter();
		writer.startElement("span", this);
		writer.writeAttribute("class", "helloWorldClass", "");
		writer.writeText(String.format("Hello %s %s!", getFirstName(), getLastName()), "");
		writer.endElement("span");
	}

	public String getFirstName() {
		return (String) getStateHelper().eval(PropertyKeys.firstName, "???firstName???");
	}

	public void setFirstName(String firstName) {
		getStateHelper().put(PropertyKeys.firstName, firstName);
	}

	public String getLastName() {
		return (String) getStateHelper().eval(PropertyKeys.lastName, "???lastName???");
	}

	public void setLastName(String lastName) {
		getStateHelper().put(PropertyKeys.lastName, lastName);
	}
}

The getFamily() method is the only method that we are forced to implement. The interesting method here is encodeBegin(): this is the place where we render our span tag. As it should carry a CSS class attribute, we add one with the writeAttribute() method of the ResponseWriter. The two attributes of the resulting JSF tag are modelled as simple properties with getter and setter methods, implemented on top of the StateHelper available since JSF 2.0. In encodeBegin() we use the getters to retrieve the values given by the user.
Also interesting is the annotation @ResourceDependencies. With this annotation we tell the JSF framework that our component depends on some resources; in this case it is a CSS file that resides in the folder src/main/resources/META-INF/resources/css.
The annotation @FacesComponent registers this component with the JSF framework at startup. The given name is used in the taglib file to reference this class:

<?xml version="1.0"?>
<facelet-taglib xmlns="http://java.sun.com/xml/ns/javaee">
	<namespace>https://martinsdeveloperworld.wordpress.com</namespace>
	<tag>
		<tag-name>helloWorld</tag-name>
		<component>
			<component-type>HelloWorld</component-type>
		</component>
	</tag>
</facelet-taglib>

In this taglib file, which resides under src/main/resources/META-INF, we define the available components, here only our helloWorld tag. The attributes of the tag are derived from the properties of the Java class.
Finally we want to test our newly created component. To do this, we set up a simple JSF 2 webapp project and add the following snippet to the web.xml in order to declare our custom facelet library:

	<context-param>
		<param-name>facelets.FACELETS_LIBRARIES</param-name>
		<param-value>/META-INF/jsf-component.taglib.xml</param-value>
	</context-param>

Now we can write a simple JSF page that references our new tag:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
   "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml"
	xmlns:f="http://java.sun.com/jsf/core"
	xmlns:h="http://java.sun.com/jsf/html"
	xmlns:mdw="https://martinsdeveloperworld.wordpress.com">
<h:head>
<title>Hello JSF 2!</title>
</h:head>
<h:body>
	<h2>Hello World!</h2>
	<mdw:helloWorld firstName="Martin" lastName="Developer"/>
</h:body>
</html>

When we deploy this application to the JBoss Application Server and call the corresponding URL, we get the following HTML output:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
	<title>Hello JSF 2!</title>
	<link type="text/css" rel="stylesheet" href="/jsf-component-webapp/faces/javax.faces.resource/css/jsf-component.css" />
</head>
<body>
	<h2>Hello World!</h2>
	<span class="helloWorldClass">Hello Martin Developer!</span>
</body>
</html>

Here we can clearly see the span tag with the CSS class and the rendered output. The CSS file is referenced in the head of the HTML document.

The sources of the whole project can be found on GitHub: https://github.com/siom79/jsf-component.

Benchmarking the famous refactoring of GeneratePrimes in Robert C. Martin’s ‘Clean Code’

Inspired by Tomasz Nurkiewicz’s blog post about how aggressive the inlining capability of the Java Virtual Machine is (original blog post), I asked myself what impact the refactorings of the famous GeneratePrimes class in Robert C. Martin’s book ‘Clean Code’ (see page 71) have. Therefore I set up a small benchmark project that can be found here on GitHub: https://github.com/siom79/generate-primes-cleancode-benchmark.

Under src/main/java you will find the two classes as they are published in the book, and under src/test/java a unit test that runs the generatePrimes() method of both classes with different values for the argument maxValue, ranging from 10 to 100.000.000. The unit test prints the measured time in milliseconds like this:

OneMethod : 1083,00 ms for 10
PlentyMethods: 490,17 ms for 10
OneMethod/PlentyMethods: 2,21 (< 1 means OneMethod is faster)
OneMethod : 12,40 ms for 100
PlentyMethods: 19,68 ms for 100
OneMethod/PlentyMethods: 0,63 (< 1 means OneMethod is faster)
OneMethod : 125,75 ms for 1000
PlentyMethods: 179,22 ms for 1000
OneMethod/PlentyMethods: 0,70 (< 1 means OneMethod is faster)
OneMethod : 1386,68 ms for 10000
PlentyMethods: 2069,76 ms for 10000
OneMethod/PlentyMethods: 0,67 (< 1 means OneMethod is faster)
OneMethod : 10096,85 ms for 100000
PlentyMethods: 10365,03 ms for 100000
OneMethod/PlentyMethods: 0,97 (< 1 means OneMethod is faster)
OneMethod : 26232,72 ms for 1000000
PlentyMethods: 8215,29 ms for 1000000
OneMethod/PlentyMethods: 3,19 (< 1 means OneMethod is faster)
OneMethod : 184577,74 ms for 10000000
PlentyMethods: 184149,59 ms for 10000000
OneMethod/PlentyMethods: 1,00 (< 1 means OneMethod is faster)
OneMethod : 2100707,11 ms for 100000000
PlentyMethods: 2103006,56 ms for 100000000
OneMethod/PlentyMethods: 1,00 (< 1 means OneMethod is faster)

The logging lines beginning with OneMethod belong to the original implementation, which computes all prime numbers up to the given maximum value in a single method, whereas the lines beginning with PlentyMethods belong to the refactored version, which uses plenty of small methods to implement the same algorithm. As you will notice, the ratio between both implementations converges to 1.00. This means that after some time all private methods of the refactored implementation have been inlined and do not cause any runtime overhead. From 100 up to 100.000 the one-method implementation is faster; beyond that (interestingly, except for the 1.000.000 measurement) the runtime ratio is 1.00. Surprisingly, the very first measurement shows that the plenty-methods implementation is even faster than the original one.
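
For reference, the core of such a measurement is nothing more than a System.nanoTime() harness like the following sketch (class and method names are made up; the real benchmark code lives in the linked repository):

```java
public class NaiveBenchmark {

	// Runs the task once and returns the elapsed wall-clock time in milliseconds.
	// Note: a serious benchmark needs warm-up iterations so the JIT compiler
	// (and its inlining) can kick in before the measurement starts.
	public static double timeMillis(Runnable task) {
		long start = System.nanoTime();
		task.run();
		return (System.nanoTime() - start) / 1_000_000.0;
	}
}
```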

PS: The measurements were taken on the following setup: Intel Core i5, 2.4 GHz, 4 GB RAM, 64 bit; Windows 7, Java(TM) SE Runtime Environment (build 1.7.0_21-b11).

Analyzing deadlocks in Java applications with thread dumps

Recently I was given the task to analyze a problem with a multi-threaded Java application that got stuck when it was run again and again from a batch script. On some invocations the application just did nothing (no CPU utilization, no exceptions). So how do you analyze what such an application is doing right now?
In cases like this, jstack enables you to create a thread dump of a running Java application and take a look inside. First you look up the id of the running application with jps and then provide its PID as the first argument to jstack:

C:\Users>jps
884 org.eclipse.equinox.launcher_1.3.0.v20120522-1813.jar
2344 Jps
5188 ThreadA

C:\Users>jstack 5188 > stack.txt

When I opened the text file stack.txt, I found something like the following (reconstructed here with a simple example):

"Thread-18":
	at martin.ThreadB.run(ThreadB.java:13)
	- waiting to lock <0x00000000ec102070> (a java.lang.Object)
"Thread-0":
	at martin.ThreadB.run(ThreadB.java:15)
	- waiting to lock <0x00000000ec103b68> (a java.lang.Object)
	- locked <0x00000000ec102070> (a java.lang.Object)
"Thread-15":
	at martin.ThreadA.run(ThreadA.java:15)
	- waiting to lock <0x00000000ec102070> (a java.lang.Object)
	- locked <0x00000000ec103b68> (a java.lang.Object)

As you can see, Thread-18 is waiting to lock 0x00000000ec102070, which is already locked by Thread-0. Thread-0 on the other hand is waiting to lock 0x00000000ec103b68, which is already locked by Thread-15. And Thread-15 in turn is waiting to lock 0x00000000ec102070, which closes the circle. Thus, we have a classic deadlock situation.
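
Such a dump can be reproduced with a small program that acquires two locks in opposite order. This is a simplified reconstruction, not the original application; the JDK's ThreadMXBean is used here to confirm the deadlock programmatically:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class DeadlockDemo {

	private static final Object lockA = new Object();
	private static final Object lockB = new Object();

	// Acquires 'first', waits a moment, then tries to acquire 'second'.
	private static void lockInOrder(Object first, Object second) {
		Thread t = new Thread(() -> {
			synchronized (first) {
				try {
					Thread.sleep(100);
				} catch (InterruptedException ignored) {
				}
				synchronized (second) {
					// never reached once the deadlock has occurred
				}
			}
		});
		t.setDaemon(true); // let the JVM exit although the threads stay stuck
		t.start();
	}

	// Returns true if the JVM detected the deadlock.
	public static boolean demonstrateDeadlock() throws InterruptedException {
		lockInOrder(lockA, lockB);
		lockInOrder(lockB, lockA); // opposite order -> classic deadlock
		Thread.sleep(500); // give both threads time to run into each other
		ThreadMXBean bean = ManagementFactory.getThreadMXBean();
		return bean.findDeadlockedThreads() != null;
	}

	public static void main(String[] args) throws InterruptedException {
		System.out.println(demonstrateDeadlock() ? "deadlock detected" : "no deadlock");
	}
}
```

Taking a jstack dump of this program while it hangs produces exactly the "waiting to lock"/"locked" pairs shown above.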

Using interfaces to subdivide enums into smaller semantic units

Recently I stumbled upon a huge enum with a lot of values. While working with this enum I realized that in some situations only a few of the values were needed, while other situations needed different ones. This clearly indicated that the enum subsumed values from different semantic fields.
But how can you subdivide such an enum into separate units?
A first thought was to establish some kind of inheritance. But Java does not support inheritance for enums. The reason why is shown in the following example:

enum First { One, Two }
enum Second extends First { Three, Four }  // does not compile

First a = Second.Four;  // clearly illegal

Instead of using inheritance, you can create separate enums and define a common interface for them:

interface Os {

}
enum WindowsOs implements Os {
    Windows95,
    Windows98,
    ...
}
enum UnixOs implements Os {
    Linux,
    Solaris,
    ...
}

Now you can write methods that only accept operating systems of type WindowsOs or UnixOs, or any Os:

	private boolean isWindowsOs(Os os) {
		return (os instanceof WindowsOs);
	}

	private void rebootAfterInstallation(WindowsOs windowsOs) {
		System.out.println("Rebooting");
	}

If you later want to refine the UnixOs enum, you can create one more interface (e.g. a UnixOs interface that extends Os, replacing the former UnixOs enum) and create enums like LinuxOs and SolarisOs that implement the new interface:

interface UnixOs extends Os {

}
enum LinuxOs implements UnixOs {
    CentOs,
    RedHat,
    Debian,
    ...
}
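
Such a marker interface does not have to stay empty: it can also declare behavior that every enum constant must implement. A small self-contained sketch (the names DescribedOs and LinuxDistribution are made up for this illustration):

```java
// The common interface can carry behavior that each implementing enum provides.
interface DescribedOs {
	String displayName();
}

enum LinuxDistribution implements DescribedOs {
	CENT_OS("CentOS"), RED_HAT("RedHat"), DEBIAN("Debian");

	private final String name;

	LinuxDistribution(String name) {
		this.name = name;
	}

	@Override
	public String displayName() {
		return name;
	}
}
```

Callers that only know the interface can then work with any of the enums polymorphically.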

JNDI load balancing with jboss-ejb-client on JBoss AS 7

Let’s assume we have a client application that accesses a remote stateful session bean (SFSB) on a JBoss AS 7 server. The SFSB is accessed via a JNDI lookup as described here: EJB invocations via JNDI. The SFSB is clustered via the @Clustered annotation as described here: Clustered EJBs.

@Stateful
@Clustered
@Remote(TestRemote.class)
public class TestBean implements TestRemote {
...
}

Note that the Infinispan cache used to cluster the SFSB is only started when at least one SFSB with the @Clustered annotation is found during deployment. If we want to start two JBoss servers on the same machine, we can do this with the following command line invocations:

standalone.bat -c standalone-ha.xml -Djboss.node.name=nodeA -b 127.0.0.1
standalone.bat -c standalone-ha.xml -Djboss.socket.binding.port-offset=100 -Djboss.node.name=nodeB -b 127.0.0.1

It is important to note that both servers have to be bound to a specific IP address. If you bind both servers with the option “-b 0.0.0.0”, clustering doesn’t start (see here). Both servers also have to have different node names.

The client uses the following properties file to access the SFSB via JNDI:

remote.connectionprovider.create.options.org.xnio.Options.SSL_ENABLED=false
invocation.timeout=10000
remote.connections=default

remote.connection.default.host=127.0.0.1
remote.connection.default.port=4447
remote.connection.default.connect.options.org.xnio.Options.SASL_POLICY_NOANONYMOUS=false
remote.connection.default.connect.options.org.xnio.Options.SASL_POLICY_NOPLAINTEXT=false

remote.clusters=ejb
remote.cluster.ejb.connect.options.org.xnio.Options.SASL_POLICY_NOANONYMOUS=false
remote.cluster.ejb.connect.options.org.xnio.Options.SASL_POLICY_NOPLAINTEXT=false
remote.cluster.ejb.connect.options.org.xnio.Options.SSL_ENABLED=false

Two things are important. First of all, we only have to define one of the two servers (here 127.0.0.1:4447). The other server is detected automatically via a topology message that the client receives after it has connected to the first server. It is also important to mention that this topology information arrives with some latency; if you look up all your SFSBs directly after the first lookup, your client program may be too fast to integrate the information about the second server, and therefore all SFSBs end up on the first server. The name of the cluster also has to be defined (here with the property remote.clusters). Then, for each defined cluster (here ejb), the SASL policy as well as the SSL configuration is given.

If you now look up the remote bean with the following code, all invocations are load balanced to both server instances:

	private TestRemote lookupRemoteBean() throws NamingException {
		logger.info("Using jboss-ejb-client.");
		final Hashtable<String, String> jndiProperties = new Hashtable<String, String>();
		jndiProperties.put(Context.URL_PKG_PREFIXES, "org.jboss.ejb.client.naming");
		final Context context = new InitialContext(jndiProperties);
		final String appName = "jboss-ejb-client";
		final String moduleName = "server-ejb";
		final String distinctName = "";
		final String beanName = TestBean.class.getSimpleName();
		final String viewClassName = TestRemote.class.getName();
		String lookupString = "ejb:" + appName + "/" + moduleName + "/" + distinctName + "/" + beanName + "!"
				+ viewClassName + "?stateful";
		logger.debug(String.format("Looking up: %s", lookupString));
		return (TestRemote) context.lookup(lookupString);
	}

The appName and moduleName are chosen as described here.
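
For clarity, the structure of the lookup string built above can be expressed as a small helper (EjbLookup is a hypothetical name, used here only to make the format explicit):

```java
public class EjbLookup {

	// Builds the jboss-ejb-client lookup string in the form
	// ejb:<appName>/<moduleName>/<distinctName>/<beanName>!<viewClassName>[?stateful]
	public static String lookupString(String appName, String moduleName, String distinctName,
			String beanName, String viewClassName, boolean stateful) {
		return "ejb:" + appName + "/" + moduleName + "/" + distinctName + "/" + beanName
				+ "!" + viewClassName + (stateful ? "?stateful" : "");
	}
}
```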

There is one more caveat: you can also set the selector for the EJB client context programmatically:

final EJBClientConfiguration ejbClientConfiguration = new PropertiesBasedEJBClientConfiguration(
				clientConfigProps);
final ContextSelector<EJBClientContext> ejbClientContextSelector = new ConfigBasedEJBClientContextSelector(
				ejbClientConfiguration);
EJBClientContext.setSelector(ejbClientContextSelector);

Here the properties object clientConfigProps is created dynamically at runtime and contains in our example the same information as the properties file above. If you set this selector directly before each lookup, the topology information is requested again and arrives too late, due to the latency mentioned before. Therefore, again, all invocations end up on the first server.

Executing JavaScript on a Java VM

These days I stumbled upon an interesting feature of the Java Virtual Machine. Since version 1.6 you can obtain a ScriptEngineManager from your Java code. This factory allows you to request an execution engine for JavaScript, which can be used to execute arbitrary JavaScript code within your Java program.

The following snippet demonstrates how to execute a string containing JavaScript code (script is the string to evaluate; JsExecutionException is a custom exception of the sample project):

ScriptEngineManager factory = new ScriptEngineManager();
ScriptEngine engine = factory.getEngineByName("JavaScript");
if (engine != null) {
  try {
    engine.eval(script);
  } catch (ScriptException e) {
    throw new JsExecutionException(e);
  }
} else {
  throw new JsExecutionException("No JavaScript engine available.");
}

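A self-contained variant that also returns the result of the evaluation could look like this (note: which JavaScript engine is available depends on the JDK — Rhino ships with Java 6/7, Nashorn with Java 8; newer JDKs may bundle no JavaScript engine at all, so the missing-engine case has to be handled; JsEval is just a name for this sketch):

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

public class JsEval {

	// Evaluates the expression and returns the result,
	// or null if the JDK ships no JavaScript engine.
	public static Object evalOrNull(String expression) {
		ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");
		if (engine == null) {
			return null;
		}
		try {
			return engine.eval(expression);
		} catch (ScriptException e) {
			throw new IllegalArgumentException("Invalid script: " + expression, e);
		}
	}
}
```
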
To try this JVM feature out, I implemented a small Java application that executes a piece of JavaScript given as a command line parameter. You can find the sources on GitHub: https://github.com/siom79/js-executor.git.

Move all files in the current directory to an archive folder via batch script

Sometimes you work on some files within a directory and want to make a backup from time to time. Of course, if you have a version control system like git or svn at hand, you might use that. But if you have no versioning tool, a simple technique often used is to copy all files to an archive folder in the current directory and append a version string to the filename. If you do this a few times in a row, you might want to write a small batch script that does this stupid job for you.

Today I found myself in this situation, so I wrote a small batch script that copies all files except the batch file itself to an archive folder, appending a timestamp to the filename:


set archivefoldername=Archiv

set hour=%time:~0,2%
if "%hour:~0,1%" == " " set hour=0%hour:~1,1%
echo hour=%hour%
set min=%time:~3,2%
if "%min:~0,1%" == " " set min=0%min:~1,1%
echo min=%min%
set secs=%time:~6,2%
if "%secs:~0,1%" == " " set secs=0%secs:~1,1%
echo secs=%secs%

set year=%date:~-4%
echo year=%year%
set month=%date:~3,2%
if "%month:~0,1%" == " " set month=0%month:~1,1%
echo month=%month%
set day=%date:~0,2%
if "%day:~0,1%" == " " set day=0%day:~1,1%
echo day=%day%
set datetimef=%year%-%month%-%day%_%hour%%min%%secs%

if not exist %archivefoldername% mkdir %archivefoldername%

for %%i in (*) do (
 if not %~n0 == %%~ni (
  copy %%i %archivefoldername%\%%~ni_%datetimef%%%~xi
 )
)

Apache ActiveMQ and Tomcat

Today I want to investigate how to integrate JMS functionality into a web application running within a Tomcat servlet container (7.x). As we do not use a fully fledged application server, we have to take care of registering the ConnectionFactory and Queue in the JNDI context ourselves. In this example I chose Apache ActiveMQ as the JMS provider. As a simple example, I want to have two servlets: the first servlet pushes messages into a queue, whereas the second servlet pulls them out of the queue and displays the messages as an HTML page.

Let’s add the necessary maven dependencies to our pom.xml:

		<dependency>
			<groupId>org.apache.activemq</groupId>
			<artifactId>activemq-core</artifactId>
			<version>5.7.0</version>
		</dependency>
		<dependency>
			<groupId>javax</groupId>
			<artifactId>javaee-web-api</artifactId>
			<version>6.0</version>
			<scope>provided</scope>
		</dependency>

As Tomcat only provides a read-only JNDI context, we add the ConnectionFactory and the Queue via the context.xml:

<?xml version='1.0' encoding='utf-8'?>
<Context>
    <Resource
            name="jms/ConnectionFactory"
            auth="Container"
            type="org.apache.activemq.ActiveMQConnectionFactory"
            description="JMS Connection Factory"
            factory="org.apache.activemq.jndi.JNDIReferenceFactory"
            brokerURL="vm://localhost"
            brokerName="LocalActiveMQBroker"
            useEmbeddedBroker="true"/>

    <Resource name="jms/queue/MyQueue"
              auth="Container"
              type="org.apache.activemq.command.ActiveMQQueue"
              factory="org.apache.activemq.jndi.JNDIReferenceFactory"
              physicalName="MY.TEST.FOO.QUEUE"/>
</Context>

Let’s setup a simple Servlet that listens to the URL /sendMessage:

@WebServlet(name = "SendMessageServlet", urlPatterns = "/sendMessage")
public class SendMessageServlet extends HttpServlet {

    private static final Logger logger = LoggerFactory.getLogger(SendMessageServlet.class);

    @Override
    protected void doGet(HttpServletRequest httpServletRequest, HttpServletResponse httpServletResponse) throws ServletException, IOException {
    	logger.info("doGet() called");
    	String parameter = getTextParameter(httpServletRequest);
        sendMessage(parameter);
        writeResponse(httpServletResponse, parameter);
    }

[...]

}

The text to send is given as a GET parameter in the URL. The method sendMessage() contains the relevant code. Here we access the ConnectionFactory through our JNDI context:

            InitialContext initCtx = new InitialContext();
            ConnectionFactory connectionFactory = (ConnectionFactory) initCtx.lookup("java:comp/env/jms/ConnectionFactory");
            Connection connection = connectionFactory.createConnection();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer((Destination) initCtx.lookup("java:comp/env/jms/queue/MyQueue"));

Thus, sending a text message to the queue is quite simple:

            TextMessage testMessage = session.createTextMessage();
            testMessage.setText(text);
            testMessage.setStringProperty("aKey", "someRandomTestValue");
            producer.send(testMessage);

Now it is time to set up a second servlet that enables us to read the messages from the queue. The skeleton is quite similar to the first servlet:

@WebServlet(name = "ReceiveMessageServlet", urlPatterns = "/receiveMessage")
public class ReceiveMessageServlet extends HttpServlet {

	private static final Logger logger = LoggerFactory.getLogger(ReceiveMessageServlet.class);

	@Override
	protected void doGet(HttpServletRequest httpServletRequest, HttpServletResponse httpServletResponse)
			throws ServletException, IOException {
		logger.info("doGet() called");
		Optional<String> text = receiveMessages();
		writeResponse(httpServletResponse, text);
	}

[...]
}

In our receiving Servlet we use the following code to create a QueueReceiver:

InitialContext initCtx = new InitialContext();
QueueConnectionFactory connectionFactory = (QueueConnectionFactory) initCtx.lookup("java:comp/env/jms/ConnectionFactory");
QueueConnection queueConnection = connectionFactory.createQueueConnection();
QueueSession queueSession = queueConnection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
Queue queue = (Queue) initCtx.lookup("java:comp/env/jms/queue/MyQueue");
QueueReceiver receiver = queueSession.createReceiver(queue);

Finally, we can receive the text message we sent before:

queueConnection.start();
try {
	Message m = receiver.receive(1000);
	if (m != null && m instanceof TextMessage) {
		TextMessage tm = (TextMessage) m;
		text = Optional.of(tm.getText());
		logger.debug(String.format("Received TextMessage with text '%s'.", text));
	} else {
		logger.debug(String.format("No TextMessage received: '%s'", m));
	}
} finally {
	queueSession.close();
	queueConnection.close();
}

The whole source code can be found on GitHub: https://github.com/siom79/jms-and-tomcat.
