Hibernate version: 2.1.8 and 3.0 release
Mapping documents:
Code between sessionFactory.openSession() and session.close():
Transaction tx = session.beginTransaction();
for (int i = 0; i < 256; i++) // the bound 256 is actually variable in the real code
{
    BlobTest blobTest = new BlobTest();
    // baos holds ~500k of data; a fresh 500k array is produced on every iteration
    blobTest.setBlob(Hibernate.createBlob(baos.toByteArray()));
    session.save(blobTest);
    session.evict(blobTest);
    if (i % 20 == 0)
    {
        // batch solution as per Gavin's blog
        session.flush();
        session.clear();
    }
}
tx.commit();
Full stack trace of any exception that occurs:
java.lang.OutOfMemoryError
Name and version of the database you are using:
MS SQL Server 2000, using jTDS 1.0.2
Hibernate does not seem to release the evicted BlobTest instances, or possibly it keeps references to the underlying Blobs. As a workaround I asked Hibernate for its JDBC connection and fed the database the binary data through a plain PreparedStatement instead, and that version does not leak memory.
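For reference, the JDBC workaround looks roughly like the sketch below. The table and column names (BLOB_TEST, BLOB_DATA) are placeholders for whatever the mapping generates, I'm borrowing the connection via Session.connection() (the 2.x/3.0 API), and exception handling is trimmed down:

// needed imports: java.io.ByteArrayInputStream, java.sql.Connection, java.sql.PreparedStatement

Connection con = session.connection(); // borrow the session's JDBC connection
PreparedStatement ps = con.prepareStatement(
    "insert into BLOB_TEST (BLOB_DATA) values (?)"); // placeholder table/column names
try {
    byte[] data = baos.toByteArray(); // same ~500k payload as in the loop above
    ps.setBinaryStream(1, new ByteArrayInputStream(data), data.length);
    ps.executeUpdate(); // row is written; nothing is retained by the session
} finally {
    ps.close();
}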
Increasing the Java heap helps in this particular case, but as soon as the total blob volume nears or exceeds the heap limit the loop goes out of memory again. So a bigger heap cannot be the solution, since it only works for one user at a time.
Is there a different workaround? RTFM is fine (a pointer to the relevant section, please). Or is the DIY approach via the raw connection the Accepted Solution (tm)?
Thanks for suggestions, criticisms, comments, flames, whatever :)