Every night our legacy systems process anywhere from 500 to 2,000 policies, so each night we synchronize our database using Hibernate batch processing. Below is the general approach we are taking:
Code:
Session session = getSessionFactory().openSession();
Transaction tx = session.beginTransaction();

// Stream the policies so we never hold the whole result set in memory.
ScrollableResults policyMasters = session.createQuery(" ... ")
        .setCacheMode(CacheMode.IGNORE)
        .scroll(ScrollMode.FORWARD_ONLY);

int count = 0;
while (policyMasters.next()) {
    PolicyMaster pm = (PolicyMaster) policyMasters.get(0);
    result.addAttemptedBatchPolicy(pm.getInsurerPolicyNumber());
    try {
        pm = updatePolicy(pm, input);
    } catch (Exception e) {
        // Record the failure and evict the policy so its changes are never flushed.
        result.addFailedPolicy(new PolicyBatchFailureInfo(
                pm.getInsurerPolicyNumber(), count, FailureType.TRANSFORMATION));
        session.evict(pm);
    }
    if (++count % 25 == 0) {
        // Push pending updates to the database and release the session-level cache.
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
I currently have one try/catch wrapped around the portion of code where I update the policy in memory.
Code:
try {
    pm = updatePolicy(pm, input);
} catch (Exception e) {
    result.addFailedPolicy(new PolicyBatchFailureInfo(
            pm.getInsurerPolicyNumber(), count, FailureType.TRANSFORMATION));
    session.evict(pm);
}
I catch the exception, make a note of it, and then evict the policy from the session so its changes are not flushed. I then continue processing the remaining policies in the block. After all blocks have been processed, I commit the transaction and close the session.
My question is: how do I handle exceptions thrown from session.flush() or session.clear() without having to roll back everything that has already processed successfully? Currently I flush every 25 policies and then commit a single transaction covering all of them. Should I consider doing 25 policies per transaction instead? For a batch of 100 policies, instead of one transaction from start to finish, I would have four transactions. Advice?
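Something along these lines is what I have in mind (just a rough, untested sketch: fetchPolicyIds() is a made-up helper that loads all the policy ids up front, and everything else mirrors the code above):
Code:
List<Long> ids = fetchPolicyIds();   // made-up helper: ids of all policies to sync
int chunkSize = 25;
for (int start = 0; start < ids.size(); start += chunkSize) {
    List<Long> chunk = ids.subList(start, Math.min(start + chunkSize, ids.size()));
    Session session = getSessionFactory().openSession();
    Transaction tx = session.beginTransaction();
    try {
        for (Long id : chunk) {
            PolicyMaster pm = (PolicyMaster) session.get(PolicyMaster.class, id);
            result.addAttemptedBatchPolicy(pm.getInsurerPolicyNumber());
            try {
                pm = updatePolicy(pm, input);
            } catch (Exception e) {
                result.addFailedPolicy(new PolicyBatchFailureInfo(
                        pm.getInsurerPolicyNumber(), start, FailureType.TRANSFORMATION));
                session.evict(pm);
            }
        }
        tx.commit();      // only this chunk of 25 is at risk
    } catch (Exception e) {
        tx.rollback();    // flush/commit failed: we lose 25 policies, not all of them
        // record the whole chunk as failed here, e.g. with a persistence failure type
    } finally {
        session.close();
    }
}
Would that be the right direction, or is there a better way to isolate flush failures?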