Hi there,
we are working on an application with about 150 entities and a class hierarchy in which a number of different "Dokument" types are mapped using a "table per class" DB mapping strategy.
In one special case we have to do a batch update like the code below.
I set hibernate.jdbc.batch_size to 50 to take advantage of JDBC batching.
My question is: does this configuration have any side effects on the rest of the application, or could side effects appear somewhere else?
Do you have any experience with this kind of batch update?
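For completeness, this is how we enable the batching (assuming XML configuration; a hibernate.properties entry with the same key would work the same way):

```xml
<!-- hibernate.cfg.xml: JDBC batch size, matching the flush/clear interval in the code below -->
<property name="hibernate.jdbc.batch_size">50</property>
```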
Hibernate version: 3.0
Code between sessionFactory.openSession() and session.close():
Session session = getHibernateSession();
Transaction transaction = null;
try {
    transaction = session.beginTransaction();
    // reuse the same session here instead of calling getHibernateSession() again
    Criteria criteria = session.createCriteria(getDaoType());
    addRestriction4Serie(criteria, serieId);
    if (suchtext != null) {
        criteria.add(Restrictions.ilike("titel", suchtext));
    }
    criteria.add(Restrictions.ne("boostFaktor", new Float(boostFaktor)));

    long startTime = System.currentTimeMillis();
    ScrollableResults results = criteria
            .setCacheMode(CacheMode.IGNORE)
            .scroll(ScrollMode.FORWARD_ONLY);
    long count = 0;
    while (results.next()) {
        Dok dok = (Dok) results.get(0);
        dok.setBoostFaktor(boostFaktor);
        session.saveOrUpdate(dok);
        // flush and clear every 50 entities to match hibernate.jdbc.batch_size
        if ((++count % 50) == 0) {
            session.flush();
            session.clear();
        }
    }
    transaction.commit();
    if (LOG.isInfoEnabled()) {
        LOG.info("Updated " + count + " rows; ran batch update in "
                + ((System.currentTimeMillis() - startTime) / 1000) + " seconds");
    }
} catch (Throwable e) {
    if (transaction != null) {
        transaction.rollback();
    }
    DAOException ex = new DAOException("Exception in "
            + getClass().getName() + ".setBoostFaktor(): " + e.getMessage());
    ex.setStackTrace(e.getStackTrace());
    throw ex;
} finally {
    session.close();
}