Hello everybody,
First, thanks for the great project. I have used it for years with very few problems, but now I have upgraded my JDBC driver and Hibernate version (PostgreSQL 8.4, Hibernate 3.6.3) and there is some trouble.
I used to be able to insert Hibernate-created
java.sql.Blob objects into oid columns and stream data from a file into them.
Something like this:
Code:
import java.sql.Blob;

public class BlobWrap {
    private Long dbID; // primary key, generated from a sequence
    private Blob data; // mapped to an oid column in PostgreSQL
    // getters and setters omitted
}
with a BlobWrap.hbm.xml like
Code:
<class name="gr.ntua.ivml.mint.persistent.BlobWrap" table="blob_wrap">
    <id name="dbID" column="blob_wrap_id">
        <generator class="sequence">
            <param name="sequence">seq_blob_wrap_id</param>
        </generator>
    </id>
    <property name="data" />
</class>
Typically, I would use it like this:
Code:
// data is the java.io.File to be stored
FileInputStream fis = new FileInputStream( data );
someObject.blobWrap = new BlobWrap();
LobHelper lh = DB.getSession().getLobHelper();
someObject.blobWrap.setData( lh.createBlob( fis, data.length() ) );
But with the updated Hibernate and JDBC driver I get the following exception when flushing my Session:
Code:
java.sql.BatchUpdateException: Batch entry 0 insert into blob_wrap (data, blob_wrap_id) values ('<stream of 4682819 bytes>', '1016') was aborted. Call getNextException to see the cause.
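I have not yet dug out the underlying cause. I assume something like the following sketch would surface it, since the real error is hidden behind getNextException() (flushAndReport is just an illustrative name, not my actual code):
Code:
import java.sql.SQLException;

import org.hibernate.JDBCException;
import org.hibernate.Session;

public class FlushDebugSketch {
    // Flushes and prints the whole SQLException chain, because the real
    // error is hidden behind BatchUpdateException.getNextException().
    static void flushAndReport(Session session) {
        try {
            session.flush();
        } catch (JDBCException e) {
            for (SQLException sql = e.getSQLException(); sql != null; sql = sql.getNextException()) {
                sql.printStackTrace();
            }
            throw e;
        }
    }
}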
I suspect that the new PostgreSQL JDBC driver no longer allows streaming access to oid columns and instead requires some
org.postgresql.largeobject.LargeObject magic to work. Or maybe I'm just doing it wrong?
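For clarity, this is roughly the kind of "magic" I mean; an untested sketch, assuming a plain connection that can be cast to PGConnection (storeAsLargeObject is a name I made up):
Code:
import java.io.FileInputStream;
import java.sql.Connection;

import org.postgresql.PGConnection;
import org.postgresql.largeobject.LargeObject;
import org.postgresql.largeobject.LargeObjectManager;

public class LargeObjectSketch {
    // Writes the file contents into a new large object and returns its oid,
    // which would then have to be stored in the oid column by hand.
    static long storeAsLargeObject(Connection conn, FileInputStream fis) throws Exception {
        conn.setAutoCommit(false); // large object calls must run inside a transaction
        LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();
        int oid = lom.create(LargeObjectManager.READ | LargeObjectManager.WRITE);
        LargeObject lo = lom.open(oid, LargeObjectManager.WRITE);
        byte[] buf = new byte[8192];
        int n;
        while ((n = fis.read(buf)) > 0) {
            lo.write(buf, 0, n);
        }
        lo.close();
        return oid; // caller is responsible for closing fis and committing
    }
}

But I would much rather keep letting Hibernate's LobHelper handle all of this, as it did before.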
Any help would be greatly welcome,
Arne