Greetings!
I posted this previously in the newbie forum, but thought it would be more appropriate here.
I am using Hibernate 2.1.2 and Oracle9i.
I have a program that creates and inserts a large collection of objects during a single transaction. Each persisted object includes a bi-directional relationship with another object that is also persisted. This relationship breaks Hibernate's ability to perform batched insertions. For example:
Classes:
Code:
public class Foo extends Persistent
{
    private Bar bar;

    public Bar getBar() { return bar; }
    public void setBar(Bar bar) { this.bar = bar; }
}

public class Bar extends Persistent
{
    private Foo foo;

    public Foo getFoo() { return foo; }
    public void setFoo(Foo foo) { this.foo = foo; }
}
Hibernate Config:
Code:
<hibernate-mapping>
    <class name="Foo" table="Foo">
        <id name="id" column="id">
            <generator class="myUUIDGenerator"/>
        </id>
        ...
        <many-to-one
            name="bar"
            class="Bar"
            cascade="all"
            column="bar"
        />
    </class>
    <class name="Bar" table="Bar">
        <id name="id" column="id">
            <generator class="myUUIDGenerator"/>
        </id>
        ...
        <many-to-one
            name="foo"
            class="Foo"
            cascade="all"
            column="foo"
        />
    </class>
</hibernate-mapping>
Client (pseudocode):
Code:
public class Client
{
    public static void main(String[] args)
    {
        Collection objects = new ArrayList();
        for (int i = 0; i < 5000; i++)
        {
            Foo foo = new Foo();
            Bar bar = new Bar();
            foo.setBar(bar);
            bar.setFoo(foo);
            objects.add(foo);
        }
        myService.create(objects);
    }
}
Service (pseudocode):
Code:
public class MyService
{
    public void create(Collection objects)
    {
        // start hibernate transaction
        for (Iterator iter = objects.iterator(); iter.hasNext(); )
        {
            hibernateSession.saveOrUpdate(iter.next());
        }
        // commit hibernate transaction
    }
}
When I execute this code, Hibernate (bless it) ignores my batch size. This is because the save/update cascade setting causes two different INSERT commands to be issued for each iteration of the loop (INSERT INTO FOO and INSERT INTO BAR):
Hibernate: BatcherImpl::prepareBatchStatement()
Code:
public PreparedStatement prepareBatchStatement(String sql) throws SQLException, HibernateException {
    if ( !sql.equals(batchUpdateSQL) ) {
        batchUpdate=prepareStatement(sql); // calls executeBatch()
        batchUpdateSQL=sql;
    }
    return batchUpdate;
}
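For completeness, the batch size I refer to is configured via the hibernate.jdbc.batch_size property; the value 50 below is just an illustrative example, not necessarily the value I use:
Code:
# hibernate.properties -- enable JDBC batching (example value)
hibernate.jdbc.batch_size=50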
The (perceived?) inability to use batching leads to an order-of-magnitude degradation in performance when a network is involved.
I can force Hibernate to use batching if I change my cascade settings from 'all' to 'delete' and make the following modification to my service class:
Code:
public class MyService
{
    public void create(Collection objects)
    {
        // start hibernate transaction
        for (Iterator iter = objects.iterator(); iter.hasNext(); )
        {
            Foo foo = (Foo) iter.next();
            hibernateSession.saveOrUpdate(foo);
        }
        for (Iterator iter = objects.iterator(); iter.hasNext(); )
        {
            Foo foo = (Foo) iter.next();
            Bar bar = foo.getBar();
            hibernateSession.saveOrUpdate(bar);
        }
        // commit hibernate transaction
    }
}
Execution time for my test class drops from 45 seconds to 3 seconds.
Now, would it be possible (if it has not already been done) to optimize Hibernate for the bulk insertion of related/joined POJOs/tables? For example, the Hibernate Session object could be extended to provide a saveOrUpdate(Collection objects) method. The implementation of this method could persist the objects type by type within a single transaction (i.e. persist all of the Foo's, then persist all of the Bar's). This would allow the use of batching and would substantially improve performance; a rough sketch of the idea follows.
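Just to illustrate the idea, here is a sketch in application code (not the actual Session API; the BatchSaver class and saveOrUpdateAll method names are made up). It assumes the caller passes in every object to be persisted (both the Foo's and the Bar's) and that the save/update cascade is disabled, so the cascade cannot re-interleave the INSERTs:
Code:
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

import net.sf.hibernate.HibernateException;
import net.sf.hibernate.Session;

public class BatchSaver
{
    /**
     * Saves the given objects grouped by concrete class, so that all of the
     * INSERTs for one table are issued together and can be batched.
     */
    public static void saveOrUpdateAll(Session session, Collection objects)
        throws HibernateException
    {
        // group the objects by their concrete class
        Map byClass = new HashMap();
        for (Iterator iter = objects.iterator(); iter.hasNext(); )
        {
            Object o = iter.next();
            List group = (List) byClass.get(o.getClass());
            if (group == null)
            {
                group = new ArrayList();
                byClass.put(o.getClass(), group);
            }
            group.add(o);
        }

        // save one class at a time so the batcher keeps seeing the same SQL
        for (Iterator groups = byClass.values().iterator(); groups.hasNext(); )
        {
            List group = (List) groups.next();
            for (Iterator iter = group.iterator(); iter.hasNext(); )
            {
                session.saveOrUpdate(iter.next());
            }
        }
    }
}
My service's create() method could then simply call BatchSaver.saveOrUpdateAll(hibernateSession, objects) instead of the two hand-written loops above.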
I am a newbie to Hibernate, and it may offer such a feature already. If so, I apologize.
Brad
P.S. I think the Hibernate product is outstanding.