Post subject: How does Hibernate handle large inserts?
Posted: Wed Aug 31, 2005 8:22 pm
Newbie

Joined: Wed Aug 31, 2005 8:06 pm
Posts: 5
Hi,

This may seem like a dumb question; I'm fairly new to this stuff.

We are designing a database and will be using Hibernate to manage persistence. At certain points within the application we will persist an object model to the database: a three-level parent/child model of around 500 records. It has been suggested to us that persisting 500 new records in one transaction is risky from a database perspective and it would be better to batch the data into smaller bundles. To do this we would need an extra column on the database to allow rollback: while inserting the rows, each row would be given a certain state (e.g. 'created'), and once all rows were persisted they would all be updated to indicate they had been saved (e.g. a state of 'saved'). If something went wrong during persistence, all rows still in the 'created' state would be deleted, which would be equivalent to a rollback if this were a single transaction.
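
(To make the comparison concrete: the single-transaction alternative we are weighing this against would be something like the sketch below, where the method name, entity list, and session setup are placeholders rather than our real model.)

Code:
import java.util.Iterator;
import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

// Persist ~500 objects in a single database transaction.
// 'parents' stands in for our top-level mapped objects; children are
// saved via cascade="save-update" (or "all") in the mapping files.
public void saveAll(SessionFactory sessionFactory, List parents) {
    Session session = sessionFactory.openSession();
    Transaction tx = null;
    try {
        tx = session.beginTransaction();
        for (Iterator it = parents.iterator(); it.hasNext();) {
            session.save(it.next());   // INSERTs the parent and, via cascade, its children
        }
        tx.commit();                   // all rows become visible atomically
    } catch (RuntimeException e) {
        if (tx != null) tx.rollback(); // the database discards every uncommitted INSERT
        throw e;
    } finally {
        session.close();
    }
}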

My questions are:

1) Is this necessary, or is the single transaction fine?
2) Does Hibernate do any of this type of management for us? i.e. does it split up large inserts into batches? If it does (or can), how does it handle rollback?
3) Does anyone have any suggestions on how best to handle this?

Thanks. Any help is much appreciated.

Regards.
Doyley.


Posted: Wed Aug 31, 2005 9:15 pm
Hibernate Team

Joined: Tue Aug 26, 2003 12:50 pm
Posts: 5130
Location: Melbourne, Australia
Quote:
It has been suggested to us that persisting 500 new records in one transaction is risky from a database perspective and it would be better to batch the data into smaller bundles.


Who has suggested this and what is their rationale?

On the face of it, it sounds completely nutty.

You realize that SQL databases are regularly used to process millions of records, right?
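
If the concern is actually round trips or session memory rather than transaction size, note that Hibernate can batch the generated JDBC INSERTs inside that one transaction. A rough sketch follows; the method and variable names are illustrative, and hibernate.jdbc.batch_size is the relevant configuration property.

Code:
import java.util.Iterator;
import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

// Assumes hibernate.jdbc.batch_size=50 is set in hibernate.cfg.xml
// or hibernate.properties, so Hibernate groups the INSERTs into
// JDBC batches instead of issuing one round trip per row.
public void saveAllBatched(SessionFactory sessionFactory, List parents) {
    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    int i = 0;
    for (Iterator it = parents.iterator(); it.hasNext(); i++) {
        session.save(it.next());
        if (i % 50 == 0) {
            session.flush();  // sends the pending batch of INSERTs to the database
            session.clear();  // evicts flushed objects; the transaction stays open
        }
    }
    tx.commit();              // still one atomic commit for every row
    session.close();
}

One caveat: insert batching is disabled for identity-style id generators, but it works fine with Oracle sequences.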


Posted: Wed Aug 31, 2005 9:22 pm
Newbie

Joined: Wed Aug 31, 2005 8:06 pm
Posts: 5
It was suggested by a consultant; I shouldn't say where from. Their rationale is that persisting that many records in one commit can be risky from a DB perspective, though I'm not sure why.

I agree it doesn't seem that big an issue; I'm sure I've done similar things in the past without any problems. To be clear, it is 500 inserts in one transaction, so I guess that is quite a few, but I wouldn't think it would introduce too much risk for the DB. It's an Oracle database.

So are you suggesting that breaking it up into smaller batches is a waste of time?


Posted: Wed Aug 31, 2005 9:29 pm
Hibernate Team

Joined: Tue Aug 26, 2003 12:50 pm
Posts: 5130
Location: Melbourne, Australia
I'm suggesting that you should probably consider sacking your consultant.

(Unless there is some other key fact you are not telling us, like each insert requiring 60 seconds of other processing or something.)

