Seam provides jBPM for handling workflows, but so far has nothing dedicated to batch processing. While jBPM handles complex, conditional business steps, some steps are repetitive rather than complex and are better suited to a batch framework such as Spring Batch. To use Spring Batch with Seam, you first need to follow the usual steps to configure Seam to start up the Spring container and to expose the required Spring beans as Seam components.
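
As a quick reminder, the Seam side of that integration typically comes down to a spring:context-loader entry in components.xml, roughly like the sketch below; the schema versions and the location of the Spring configuration file are assumptions that depend on your Seam release. The individual Spring beans are then exposed with a <seam:component/> tag, as shown in the exampleJob.xml sketch later on.

<components xmlns="http://jboss.com/products/seam/components"
            xmlns:spring="http://jboss.com/products/seam/spring"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="
                http://jboss.com/products/seam/components
                http://jboss.com/products/seam/components-2.2.xsd
                http://jboss.com/products/seam/spring
                http://jboss.com/products/seam/spring-2.2.xsd">

    <!-- Start the Spring container when Seam starts up -->
    <spring:context-loader config-locations="/WEB-INF/applicationContext.xml"/>

</components>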

These steps describe adding a very simple batch job and calling it from a Seam class. To keep the example simple, the database tables usually used by Spring Batch are held in memory rather than persisted to a database.

The first step in the batch process is to upload a CSV file and extract the list of people contained in it. The test file is:

title,firstname,surname,dateOfBirth,country
Mr,John,Smith,1949,United States
Ms,Jane,Smith,03/09/1994,England
Mr,David,Moore,13/04/1972,France
Mr,Robert,Fisher,26/06/1946,Brazil

You need to add the following jar files to your EAR file (in addition to the ones already required for the basic Seam–Spring integration):

spring/org.springframework.batch.core-2.0.0.RELEASE.jar
spring/org.springframework.batch.infrastructure-2.0.0.RELEASE.jar
spring/aopalliance-1.0.jar
spring/org.springframework.aop-3.0.0.RELEASE.jar
spring/org.springframework.jdbc-3.0.0.RELEASE.jar
spring/org.springframework.transaction-3.0.0.RELEASE.jar

You should create a new folder on the classpath to hold the configuration files for your batch jobs, e.g. WEB-INF/classes/exampleJob.xml. You can import these files into your applicationContext.xml.
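
For example, assuming exampleJob.xml sits directly under WEB-INF/classes, the import in applicationContext.xml would look something like this:

<import resource="classpath:exampleJob.xml"/>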


Rather than persisting information about the batch job to a database, you can define an in-memory data source in applicationContext.xml to hold it. You need both a job repository and a transaction manager defined.
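
A minimal sketch of those two beans, assuming you use Spring Batch's MapJobRepositoryFactoryBean and ResourcelessTransactionManager to keep everything in memory (not necessarily the exact configuration used here). Only the bean definitions are shown; the surrounding <beans> element and the context namespace declaration are omitted, and <context:annotation-config/> is only needed because ExampleJob below relies on @Autowired:

<!-- Enables @Autowired injection used by ExampleJob below -->
<context:annotation-config/>

<!-- A no-op transaction manager; sufficient when nothing is persisted -->
<bean id="transactionManager"
      class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/>

<!-- Holds job and step execution data in memory instead of database tables -->
<bean id="jobRepository"
      class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean"/>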


The exampleJob.xml contains the configuration for the batch job itself:
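
Based on the description below, the configuration would look roughly like the following sketch using Spring Batch's XML namespace; the package names, the input file location, the commit interval, and the schema versions are all assumptions:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xmlns:seam="http://jboss.com/products/seam/spring-seam"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="
           http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
           http://www.springframework.org/schema/batch
           http://www.springframework.org/schema/batch/spring-batch-2.0.xsd
           http://jboss.com/products/seam/spring-seam
           http://jboss.com/products/seam/spring-seam-2.2.xsd">

    <!-- Launches jobs against the jobRepository defined in applicationContext.xml -->
    <bean id="jobLauncher"
          class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository"/>
    </bean>

    <!-- The personUpload job with its single personload step -->
    <batch:job id="personUpload">
        <batch:step id="personload">
            <batch:tasklet>
                <batch:chunk reader="personReader" writer="exampleWriter" commit-interval="10"/>
            </batch:tasklet>
        </batch:step>
    </batch:job>

    <!-- Reads the uploaded CSV and maps each line to an UploadedPerson -->
    <bean id="personReader" class="org.springframework.batch.item.file.FlatFileItemReader">
        <property name="resource" value="file:/tmp/people.csv"/>
        <property name="linesToSkip" value="1"/>
        <property name="lineMapper">
            <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <property name="lineTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                        <property name="names" value="title,firstname,surname,dateOfBirth,country"/>
                    </bean>
                </property>
                <property name="fieldSetMapper">
                    <bean class="example.batch.ExampleFieldSetMapper"/>
                </property>
            </bean>
        </property>
    </bean>

    <!-- Writes each chunk of UploadedPerson objects to the log -->
    <bean id="exampleWriter" class="example.batch.ExampleWriter"/>

    <!-- Spring bean that launches the job, exposed to Seam as the "exampleJob" component -->
    <bean id="exampleJob" class="example.batch.ExampleJob">
        <seam:component/>
    </bean>
</beans>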

In this file we define the exampleJob bean, which is a Java class called ExampleJob. This is a Spring bean that is wrapped as a Seam component. We also define a job launcher that uses the job repository already defined in applicationContext.xml.

The batch job, called personUpload, contains the step personload, which defines a reader and a writer for the data. To read the CSV file we use Spring Batch's FlatFileItemReader and tell it the names of the fields contained in the file; it uses the ExampleFieldSetMapper class to do the mapping. We also define an exampleWriter bean that will be used to write the uploaded data to the logs.

The ExampleJob.java class simply launches the batch job.

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;

public class ExampleJob {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    // Runs the job with an empty set of parameters
    public void launchJob() throws Exception {
        jobLauncher.run(job, new JobParameters());
    }
}

The ExampleFieldSetMapper.java class takes the data read in from each line of the CSV file and stores it as an UploadedPerson (a simple class containing the required properties and getters/setters).

import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.FieldSet;

public class ExampleFieldSetMapper implements FieldSetMapper<UploadedPerson> {

    // Maps one tokenized CSV line onto an UploadedPerson
    public UploadedPerson mapFieldSet(FieldSet fs) {
        if (fs == null) {
            return null;
        }

        UploadedPerson person = new UploadedPerson();
        person.setTitle(fs.readString("title"));
        person.setFirstname(fs.readString("firstname"));
        person.setSurname(fs.readString("surname"));
        person.setDateOfBirth(fs.readString("dateOfBirth"));
        person.setCountry(fs.readString("country"));
        return person;
    }
}

ExampleWriter.java just logs the list of UploadedPerson objects to the log files:

import java.util.List;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.batch.item.ItemWriter;

public class ExampleWriter implements ItemWriter<UploadedPerson> {

    private static final Log log = LogFactory.getLog(ExampleWriter.class);

    /**
     * @see ItemWriter#write(List)
     */
    public void write(List<? extends UploadedPerson> data) throws Exception {
        log.info(data);
    }
}

Now, in the Seam component that calls the job, we inject the ExampleJob component and call launchJob():

@In(create=true)
private ExampleJob exampleJob;
...
try {
    exampleJob.launchJob();
} catch (Exception e) {
    log.info("There was an error: " + e);
}