Quote:
hibernate.jdbc.use_streams_for_binary=true
hibernate.jdbc.batch_size=0
This does not work for me: all the data in the CLOB column gets deleted if I try to insert text longer than 4000 characters, and I get no error message from Hibernate. I am using Oracle 8i; this is the mapping:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 2.0//EN" "http://hibernate.sourceforge.net/hibernate-mapping-2.0.dtd">
<hibernate-mapping>
    <class name="bacrule1.Organism" table="organism">
        <id column="oscode" name="oscode" type="string" unsaved-value="null">
            <generator class="assigned"/>
        </id>
        <property column="description" name="description" type="text"/>
        <property column="internal" name="internal" type="string"/>
    </class>
</hibernate-mapping>
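For reference, the mapped class is just a plain bean along these lines (a minimal sketch with only the fields named in the mapping and trivial accessors):

package bacrule1;

public class Organism {
    private String oscode;       // assigned id, column "oscode"
    private String description;  // column "description", mapped as type="text" (the CLOB)
    private String internal;     // column "internal"

    public String getOscode() { return oscode; }
    public void setOscode(String oscode) { this.oscode = oscode; }

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    public String getInternal() { return internal; }
    public void setInternal(String internal) { this.internal = internal; }
}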
Has anybody found a workaround?
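One workaround that is often mentioned for the Oracle thin driver is to bypass Hibernate for the CLOB write: insert an empty_clob() first, then re-select the locator FOR UPDATE and stream the text into it. A rough, untested sketch of what I mean (plain JDBC, assuming an open java.sql.Connection called conn and the Oracle thin driver; it only writes oscode and description):

import java.io.Writer;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ClobWorkaround {

    public static void insertOrganism(Connection conn, String oscode, String description)
            throws Exception {
        boolean oldAutoCommit = conn.getAutoCommit();
        conn.setAutoCommit(false); // SELECT ... FOR UPDATE needs an open transaction
        try {
            // Step 1: insert the row with an empty CLOB locator.
            PreparedStatement insert = conn.prepareStatement(
                    "insert into organism (oscode, description) values (?, empty_clob())");
            insert.setString(1, oscode);
            insert.executeUpdate();
            insert.close();

            // Step 2: re-select the locator FOR UPDATE and stream the text into it,
            // which avoids the 4000-character limit of a plain setString bind.
            PreparedStatement select = conn.prepareStatement(
                    "select description from organism where oscode = ? for update");
            select.setString(1, oscode);
            ResultSet rs = select.executeQuery();
            if (rs.next()) {
                oracle.sql.CLOB clob = (oracle.sql.CLOB) rs.getClob(1);
                Writer writer = clob.getCharacterOutputStream();
                writer.write(description);
                writer.close();
            }
            rs.close();
            select.close();
            conn.commit();
        } finally {
            conn.setAutoCommit(oldAutoCommit);
        }
    }
}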
And what about this batch-size attribute?
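For reference, this is roughly where I set the quoted properties (a sketch of hibernate.cfg.xml; the DOCTYPE, dialect, and connection settings are omitted, and the mapping resource name is assumed):

<hibernate-configuration>
    <session-factory>
        <!-- the two properties quoted above -->
        <property name="hibernate.jdbc.use_streams_for_binary">true</property>
        <property name="hibernate.jdbc.batch_size">0</property>
        <mapping resource="bacrule1/Organism.hbm.xml"/>
    </session-factory>
</hibernate-configuration>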