Seam provides jBPM for handling workflows, but so far has nothing dedicated to batch processing. While jBPM can handle complex conditional business steps, some steps are repetitive rather than complex and are better suited to a batch framework such as Spring Batch. To use Spring Batch with Seam, you first need to follow the steps to configure Seam to start up the Spring container and make the required Spring beans accessible as Seam components.
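For reference, the Seam–Spring integration starts the Spring container from components.xml. A sketch of what that looks like (the schema versions and config location are assumptions — adjust them to your Seam release):

```xml
<components xmlns="http://jboss.com/products/seam/components"
            xmlns:spring="http://jboss.com/products/seam/spring"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="
                http://jboss.com/products/seam/components http://jboss.com/products/seam/components-2.1.xsd
                http://jboss.com/products/seam/spring http://jboss.com/products/seam/spring-2.1.xsd">

    <!-- Starts the Spring container when Seam boots -->
    <spring:context-loader config-locations="/WEB-INF/applicationContext.xml"/>

</components>
```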

These steps describe adding a very simple batch job and calling it from a Seam class. To keep the example simple, the database tables usually used by Spring Batch will be held in memory and not persisted to a database.

The first step in the batch process is to upload a CSV file and extract the list of people it contains. The test file is:

Mr,John,Smith,1949,United States

You need to add the Spring Batch jar files to your ear file (in addition to the ones required for the Seam–Spring integration described above).
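If you build with Maven, the core jar can be pulled in like this (the version number is an assumption — use whichever release matches your Spring version; spring-batch-core brings spring-batch-infrastructure in transitively):

```xml
<dependency>
    <groupId>org.springframework.batch</groupId>
    <artifactId>spring-batch-core</artifactId>
    <version>2.1.9.RELEASE</version>
</dependency>
```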


You should create a new folder on the classpath to hold the configuration files for your batch jobs, e.g. WEB-INF/classes/exampleJob.xml. You can import these files into your applicationContext.xml.
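With the file at the root of the classpath as above, the import in applicationContext.xml is a one-liner:

```xml
<import resource="classpath:exampleJob.xml"/>
```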

Rather than persisting information about the batch job to a database, you can define an in-memory data source to hold the information in applicationContext.xml. You need a job repository and a transaction manager defined.
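One way to sketch this is with Spring Batch's map-based job repository, which keeps all job metadata in memory without touching a database at all (an embedded in-memory database would be an alternative):

```xml
<bean id="transactionManager"
      class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/>

<bean id="jobRepository"
      class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
    <property name="transactionManager" ref="transactionManager"/>
</bean>
```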

The configuration for the batch job itself lives in exampleJob.xml.

In this file we define the exampleJob bean, a Java class called ExampleJob. This is a Spring bean that is wrapped as a Seam component. We also define a job launcher that uses the job repository already defined in applicationContext.xml.

The batch job, personUpload, contains the step personload, which defines a reader and a writer for the data. To read the CSV file we use Spring Batch's FlatFileItemReader and tell it the names of the fields contained in the file. It uses the ExampleFieldSetMapper class to do the mapping. We also define an exampleWriter bean that will be used to display the uploaded data in the logs.
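Putting that description together, exampleJob.xml might look like the sketch below. The package names, field names, CSV location, and commit interval are all assumptions; the `<seam:component/>` tag (from the spring-seam namespace) is what exposes the bean as a Seam component:

```xml
<bean id="exampleJob" class="org.example.batch.ExampleJob">
    <seam:component/>  <!-- expose this Spring bean as a Seam component -->
    <property name="jobLauncher" ref="jobLauncher"/>
    <property name="job" ref="personUpload"/>
</bean>

<bean id="jobLauncher"
      class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
</bean>

<batch:job id="personUpload">
    <batch:step id="personload">
        <batch:tasklet>
            <batch:chunk reader="personReader" writer="exampleWriter" commit-interval="10"/>
        </batch:tasklet>
    </batch:step>
</batch:job>

<bean id="personReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="file:/tmp/people.csv"/>
    <property name="lineMapper">
        <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
            <property name="lineTokenizer">
                <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                    <property name="names" value="title,firstName,surname,yearOfBirth,country"/>
                </bean>
            </property>
            <property name="fieldSetMapper">
                <bean class="org.example.batch.ExampleFieldSetMapper"/>
            </property>
        </bean>
    </property>
</bean>

<bean id="exampleWriter" class="org.example.batch.ExampleWriter"/>
```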

The ExampleJob class simply launches the batch job:

public class ExampleJob {

    private JobLauncher jobLauncher;
    private Job job;

    public void setJobLauncher(JobLauncher jobLauncher) { this.jobLauncher = jobLauncher; }
    public void setJob(Job job) { this.job = job; }

    public void launchJob() throws Exception {
        jobLauncher.run(job, new JobParameters());
    }
}

The ExampleFieldSetMapper takes the data read from each line in the CSV file and stores it as an UploadedPerson (a simple class containing the required properties and getters/setters).
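UploadedPerson itself is not shown in the original; a minimal sketch, with field names inferred from the sample CSV line (the names are an assumption):

```java
// Sketch only: field names inferred from "Mr,John,Smith,1949,United States".
public class UploadedPerson {

    private String title;
    private String firstName;
    private String surname;
    private int yearOfBirth;
    private String country;

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }

    public String getSurname() { return surname; }
    public void setSurname(String surname) { this.surname = surname; }

    public int getYearOfBirth() { return yearOfBirth; }
    public void setYearOfBirth(int yearOfBirth) { this.yearOfBirth = yearOfBirth; }

    public String getCountry() { return country; }
    public void setCountry(String country) { this.country = country; }
}
```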

public class ExampleFieldSetMapper implements FieldSetMapper {

    public UploadedPerson mapFieldSet(FieldSet fs) {
        if (fs == null) {
            return null;
        }
        UploadedPerson person = new UploadedPerson();
        // populate person from the FieldSet, e.g. person.setFirstName(fs.readString("firstName"))
        return person;
    }
}

The ExampleWriter class just logs the list of UploadedPerson objects to the log files:

public class ExampleWriter implements ItemWriter {

    private static final Log log = LogFactory.getLog(ExampleWriter.class);

    /**
     * @see ItemWriter#write(List)
     */
    public void write(List data) throws Exception {
        log.info("Uploaded: " + data);
    }
}

Now, in the Seam component that calls the job, we inject the ExampleJob and call its launchJob() method:

@In("#{exampleJob}")
private ExampleJob exampleJob;

// example action method
public void startUpload() {
    try {
        exampleJob.launchJob();
    } catch (Exception e) {
        log.error("There was an error: " + e);
    }
}