colins wrote:
Yvan wrote:
Hi,
Our applications use Hibernate as an abstraction over the database (Oracle, Informix, MySQL, ...). We use CLOBs and BLOBs, and with Oracle we run into problems when the size of a LOB is greater than 4k. I know there is a solution on the Hibernate website that maps String to CLOB and byte[] to BLOB, but our CLOBs and BLOBs are really big, so reading them entirely into memory is not acceptable; we need to read and pass them as InputStreams.
How can I solve my problem?
Thanks for your answer.
Yvan
You could use some helper JDBC code which actually does the reading/writing for you. You can get the actual connection from your Hibernate Session, so it would share in any existing Hibernate transaction, etc.
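Something along these lines shows the idea (just a rough, untested sketch; the table and column names are made up, and session.connection() is the Hibernate 2.x/3.x way to get at the underlying JDBC connection):

Code:
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.hibernate.Session;

public class LobJdbcHelper {

    /**
     * Copies the BLOB column of the given row to the supplied OutputStream,
     * using the JDBC connection behind the Hibernate Session so the read
     * happens inside the same transaction. Table/column names are illustrative.
     */
    public void streamDocumentContent(Session session, long documentId, OutputStream out)
            throws Exception {
        Connection con = session.connection(); // shares the Hibernate transaction
        PreparedStatement ps = con.prepareStatement(
                "select content from document where id = ?");
        try {
            ps.setLong(1, documentId);
            ResultSet rs = ps.executeQuery();
            try {
                if (rs.next()) {
                    InputStream in = rs.getBinaryStream("content");
                    if (in != null) {
                        byte[] buffer = new byte[8192];
                        for (int n = in.read(buffer); n != -1; n = in.read(buffer)) {
                            out.write(buffer, 0, n);
                        }
                        in.close();
                    }
                }
            } finally {
                rs.close();
            }
        } finally {
            ps.close();
        }
    }
}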
To clarify my earlier post a bit: it is quite possible to use Hibernate's built-in BLOB/CLOB support with Oracle, as per the wiki pages (as long as you have some Oracle-specific code). The problem you then hit is that Hibernate's disconnected functionality, i.e. shipping an object out to the client layer and later reattaching it to another session with saveOrUpdate(), does not work for the BLOB/CLOB fields. Since we use a Mapper/DAO-style approach with Spring and Hibernate, we have found it easier not to include the LOB fields in the Hibernate object at all, and to use some supporting JDBC code in the mapper object to read/write the LOBs. Because that code lives in a base class shared by all the mappers, the approach stays simple.
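For the write side, the supporting code in our mapper base class boils down to something like the sketch below (the names are invented for the example; also, whether a plain setCharacterStream works for LOBs over 4k depends on your Oracle driver version, so you may still need the empty_clob()/SELECT FOR UPDATE approach or Spring's OracleLobHandler instead):

Code:
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;

import org.hibernate.Session;

/**
 * Hypothetical base class for the Mapper/DAO objects. The LOB columns are not
 * mapped on the Hibernate objects at all; subclasses call these helpers to
 * read/write them directly over the Session's JDBC connection.
 */
public abstract class AbstractLobMapper {

    /**
     * Writes a CLOB column from a Reader without materializing it in memory.
     * The SQL, table and column names are illustrative only.
     */
    protected void updateClob(Session session, String table, String column,
                              long id, Reader content, int length) throws Exception {
        Connection con = session.connection(); // shares the Hibernate transaction
        PreparedStatement ps = con.prepareStatement(
                "update " + table + " set " + column + " = ? where id = ?");
        try {
            ps.setCharacterStream(1, content, length);
            ps.setLong(2, id);
            ps.executeUpdate();
        } finally {
            ps.close();
        }
    }
}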
Personally, what I think would make LOB usage with Hibernate much more useful is if Hibernate supported lazy loading of individual fields. Then the LOB fields could be mapped with a custom user type that reads/writes the LOB as a String or byte array transparently, but the read would not actually be triggered unless you really needed the field.
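For what it's worth, the user type itself is straightforward; here is a rough, untested sketch against the Hibernate 3 org.hibernate.usertype.UserType interface. The lazy loading part is exactly what is missing today: as it stands, this type still reads the whole CLOB into a String whenever the entity is loaded.

Code:
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;

import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;

/** Maps a CLOB column to a plain String by streaming it in nullSafeGet/Set. */
public class StringClobType implements UserType {

    public int[] sqlTypes() { return new int[] { Types.CLOB }; }

    public Class returnedClass() { return String.class; }

    public boolean equals(Object x, Object y) {
        return x == y || (x != null && x.equals(y));
    }

    public int hashCode(Object x) { return x == null ? 0 : x.hashCode(); }

    public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
            throws HibernateException, SQLException {
        // Materializes the whole CLOB as a String; with per-field lazy loading
        // this would only run when the property is actually accessed.
        java.io.Reader reader = rs.getCharacterStream(names[0]);
        if (reader == null) {
            return null;
        }
        try {
            StringBuffer sb = new StringBuffer();
            char[] buffer = new char[4096];
            for (int n = reader.read(buffer); n != -1; n = reader.read(buffer)) {
                sb.append(buffer, 0, n);
            }
            return sb.toString();
        } catch (java.io.IOException e) {
            throw new HibernateException("could not read CLOB", e);
        }
    }

    public void nullSafeSet(PreparedStatement st, Object value, int index)
            throws HibernateException, SQLException {
        if (value == null) {
            st.setNull(index, Types.CLOB);
        } else {
            String s = (String) value;
            st.setCharacterStream(index, new java.io.StringReader(s), s.length());
        }
    }

    public Object deepCopy(Object value) { return value; } // String is immutable

    public boolean isMutable() { return false; }

    public Serializable disassemble(Object value) { return (Serializable) value; }

    public Object assemble(Serializable cached, Object owner) { return cached; }

    public Object replace(Object original, Object target, Object owner) { return original; }
}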