I am running:
Hibernate 3.1.2
Postgresql 8.1.4
Tomcat 5.5.15
Using the latest PostgreSQL JDBC3 driver
I have a table with roughly 100M records. I am attempting to insert roughly 3M new records per day into this table, and would like to check for duplicates using a PostgreSQL PL/pgSQL BEFORE INSERT trigger that returns NULL if the record already exists, or NEW (the record to be inserted) if it doesn't.
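For reference, the trigger is along these lines — a sketch only, since the table and column names here (`email_record`, `email_address`) are simplified stand-ins for my actual schema:

```sql
-- Sketch of the before-insert dedup trigger described above.
-- Table/column names are illustrative, not my real schema.
CREATE OR REPLACE FUNCTION skip_duplicate_email() RETURNS trigger AS $$
BEGIN
    IF EXISTS (SELECT 1 FROM email_record
               WHERE email_address = NEW.email_address) THEN
        RETURN NULL;  -- suppress the insert; the statement affects 0 rows
    END IF;
    RETURN NEW;       -- no duplicate found, let the insert proceed
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER email_record_dedup
    BEFORE INSERT ON email_record
    FOR EACH ROW EXECUTE PROCEDURE skip_duplicate_email();
```

Returning NULL from a BEFORE trigger is what makes PostgreSQL report a row count of 0 for the INSERT, which is exactly what Hibernate then complains about.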
This works well, except that Hibernate throws a StaleStateException when the trigger doesn't insert a record (because a duplicate was found):
Quote:
Location: org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:300) Message: Could not synchronize database state with session
StackTrace:
org.hibernate.StaleStateException: Unexpected row count: 0 expected: 1
    at org.hibernate.jdbc.NonBatchingBatcher.addToBatch(NonBatchingBatcher.java:27)
    at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2062)
    at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2427)
    at org.hibernate.action.EntityInsertAction.execute(EntityInsertAction.java:51)
    at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:248)
    at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:232)
    at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:139)
    at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:297)
    at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:27)
    at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:985)
    at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:333)
    at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:106)
How can I avoid this exception? Ideally I would like to batch up inserts as explained in the documentation, flushing and clearing every 50 records, but until I can stop this exception from being thrown I am doomed to a dismal 280 inserts per second :(
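For the record, the batched version I'd like to get to looks roughly like this — a sketch of the loop from the batch-processing chapter of the docs, written against the plain Hibernate Session API (`sessionFactory` and `emailRecords` are placeholders for my own setup):

```java
// Sketch of the batch-insert pattern from the Hibernate docs:
// flush and clear the session every 50 records so the first-level
// cache doesn't grow without bound.
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
int count = 0;
for (EmailRecord record : emailRecords) {
    session.save(record);
    if (++count % 50 == 0) {
        session.flush();   // push the pending inserts to the database
        session.clear();   // detach the saved records to keep memory flat
    }
}
tx.commit();
session.close();
```

The problem is that the StaleStateException surfaces inside that flush() whenever any record in the batch was suppressed by the trigger, which takes the whole batch down with it.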
Anyone have experience with this? I can catch the exception in my DAO by flushing explicitly, and everything appears to run OK, but that prevents me from doing batched inserts and has become quite the performance hit. It's a versioned class; here is the relevant mapping:
Code:
<hibernate-mapping package="com.listmanager.data.model.email">
    <class name="EmailRecord"
           table="EMAIL_RECORD">
        <id name="id"
            column="EMAIL_RECORD_ID"
            type="java.lang.Long">
            <generator class="sequence">
                <param name="sequence">email_record_id_seq</param>
            </generator>
        </id>
        <version name="version"
                 column="version"
                 type="java.lang.Integer" />
    </class>
</hibernate-mapping>
And here is the relevant DAO code:
Code:
public void saveEmailRecord( EmailRecord emailRecord )
{
    getHibernateTemplate().save( emailRecord );
    try {
        getHibernateTemplate().flush();
        getHibernateTemplate().clear();
    }
    catch ( Exception ex ) {
        log.info( "flush failed for " + emailRecord.getEmailAddress() );
    }
}
In addition, I have tried the seqhilo generator as well, and the same exception is thrown. How can I trick Hibernate into not throwing an exception when the trigger catches a duplicate and doesn't insert it?