Yes, flush() and clear() are both called, the JDBC batch size is set, and the 2nd level cache is off. Performance is still pretty bad. This is the old "N INSERT statements for a one-to-many association" problem, and I can't find a way around it even though apparently a bug fix is in:
http://opensource.atlassian.com/projects/hibernate/browse/HHH-1
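For reference, here is roughly what I mean by "batch size is set" and "2nd level cache is off". The persistence-unit name and the values are just placeholders; the property keys are Hibernate's standard ones.
Code:
import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class Bootstrap {
    public static EntityManagerFactory createFactory() {
        Map<String, String> props = new HashMap<String, String>();
        props.put("hibernate.jdbc.batch_size", "50");                 // send inserts to JDBC in batches
        props.put("hibernate.order_inserts", "true");                 // group inserts by entity so they can batch
        props.put("hibernate.cache.use_second_level_cache", "false"); // 2nd level cache off
        return Persistence.createEntityManagerFactory("my-unit", props);
    }
}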
For example, if you have this:
Code:
@Entity
public class Parent {
    @Id @GeneratedValue
    private Long id;

    @OneToMany(mappedBy = "parent", cascade = {CascadeType.PERSIST})
    private Collection<Child> children = new ArrayList<Child>();

    public void setChildren(Collection<Child> children) { this.children = children; }
}
And you do something like this:
Code:
Parent parent = new Parent();
Collection<Child> children = new ArrayList<Child>();
for (int x = 0; x < 1000; x++) {
    Child child = new Child();
    child.setParent(parent); // mappedBy="parent" means the Child side owns the foreign key
    children.add(child);
}
parent.setChildren(children);

EntityTransaction tx = entityManager.getTransaction();
tx.begin();
entityManager.persist(parent);
entityManager.flush(); // flush() and clear() live on the EntityManager, not the transaction
entityManager.clear();
tx.commit();
The SQL that gets executed is a single-row
INSERT INTO CHILD ...etc...
repeated 1000 times. I want it to do a bulk insert instead.
I'm thinking the best way out of this is to write a stored procedure and have Hibernate execute that... anyone have any better ideas?
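For what it's worth, the stored procedure route I have in mind is just a native query, something like the sketch below. The procedure name bulk_insert_children and its parameters are made up, and whether a plain CALL statement works through createNativeQuery depends on the database and driver.
Code:
import javax.persistence.EntityManager;
import javax.persistence.EntityTransaction;

public class BulkInsert {
    // Sketch only: hand the whole bulk insert to a stored procedure instead of persisting
    // 1000 Child entities. bulk_insert_children is a hypothetical procedure.
    public void bulkInsertChildren(EntityManager entityManager, Long parentId, int count) {
        EntityTransaction tx = entityManager.getTransaction();
        tx.begin();
        entityManager.createNativeQuery("CALL bulk_insert_children(?1, ?2)")
                     .setParameter(1, parentId)
                     .setParameter(2, count)
                     .executeUpdate();
        tx.commit();
    }
}
The obvious downside is that this bypasses the persistence context entirely, so the in-memory Parent and Child objects won't reflect the rows the procedure inserts.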