
All times are UTC - 5 hours [ DST ]



Forum locked This topic is locked, you cannot edit posts or make further replies.  [ 3 posts ] 
Author Message
 Post subject: Session cache (1st level) tuning
PostPosted: Thu Jun 05, 2008 5:02 am 
Newbie

Joined: Thu Jun 05, 2008 4:56 am
Posts: 2
Hello,

I want to configure the session-level cache. Is it possible to do this in the Hibernate configuration file?

What I want to do is reduce the number of entities stored in this cache, because my batch makes a lot of read accesses (SELECTs) in the same session. After a while the cache fills up and I must clear it explicitly (Session.clear()) to recover acceptable performance.
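The workaround described above, clearing the first-level cache periodically during a long read-only pass, might be sketched like this (the entity name "Item", the query, and the batch size of 500 are hypothetical, not from the original post):

```java
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;

public class ReportBatch {

    // Sketch only: iterate a large read-only result set and clear the
    // session every N rows so the persistence context never grows large.
    static void run(Session session) {
        ScrollableResults rows = session.createQuery("from Item")
                .setReadOnly(true)
                .scroll(ScrollMode.FORWARD_ONLY);
        int count = 0;
        while (rows.next()) {
            Object item = rows.get(0);
            // ... read values from the entity into the report ...
            if (++count % 500 == 0) {
                // Detach everything accumulated so far from the
                // first-level cache (StatefulPersistenceContext).
                session.clear();
            }
        }
        rows.close();
    }
}
```

With entities loaded read-only and the session cleared in fixed-size chunks, memory and lookup cost stay roughly constant regardless of how many rows the batch processes.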

Any ideas?

Ektor.


 Post subject:
PostPosted: Thu Jun 05, 2008 11:45 am 
Expert
Expert

Joined: Tue May 13, 2008 3:42 pm
Posts: 919
Location: Toronto & Ajax Ontario www.hibernatemadeeasy.com
Are you using executeUpdate or executeDelete for the batches? What method are you using on the session to do these batch processes? How many records are we talking about?

Sometimes ETL tools are a better batch solution than Hibernate.

_________________
Cameron McKenzie - Author of "Hibernate Made Easy" and "What is WebSphere?"
http://www.TheBookOnHibernate.com Check out my 'easy to follow' Hibernate & JPA Tutorials


 Post subject:
PostPosted: Thu Jun 05, 2008 12:11 pm 
Newbie

Joined: Thu Jun 05, 2008 4:56 am
Posts: 2
Cameron McKenzie wrote:
Are you using executeUpdate or executeDelete for the batches? What method are you using on the session to do these batch processes? How many records are we talking about?

Sometimes ETL tools are a better batch solution than Hibernate.


Hello Cameron,

My batch opens a session and iterates over a collection of 10,000 elements to generate an Excel report. Each object is linked to a few others, but the graph isn't very deep (I don't have memory issues).
I don't update the database; I just read object values (all these elements are retrieved at the very beginning of the batch process).
There is no problem with fewer than 1,000 entities, but beyond that the performance degrades in steps (around 1,400, 1,900, 2,300, ...) and gets exponentially worse.
My analysis is that the performance loss comes from the implementation of the org.hibernate.engine.StatefulPersistenceContext class (and my misuse of it...):
- The huge number of objects stored in this class's Maps slows down the check of whether an object is already in the cache.
- When the initial capacity of these collections is reached, they must be resized, and this operation is expensive when they already contain many objects.

So the solution is to clear this cache explicitly, either wholesale with the clear() method or more selectively with session.evict().

One last detail: I know it isn't really clean, but the batch doesn't work with the POJOs but with DTOs, which is why I can call session.clear() without any risk: all the data has already been transferred from the POJOs to the DTOs.
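Since the batch is purely read-only and copies everything into DTOs anyway, a StatelessSession is another option worth noting: it keeps no persistence context at all, so nothing accumulates and no clear()/evict() is needed. A sketch under that assumption (the entity name "Item" is hypothetical):

```java
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.SessionFactory;
import org.hibernate.StatelessSession;

public class StatelessReportBatch {

    // Sketch only: a StatelessSession bypasses the first-level cache
    // (no StatefulPersistenceContext), so entities are never tracked.
    static void run(SessionFactory factory) {
        StatelessSession session = factory.openStatelessSession();
        try {
            ScrollableResults rows = session.createQuery("from Item")
                    .scroll(ScrollMode.FORWARD_ONLY);
            while (rows.next()) {
                Object item = rows.get(0);
                // ... copy values from the entity into the DTO ...
            }
            rows.close();
        } finally {
            session.close();
        }
    }
}
```

One caveat: a StatelessSession does not perform lazy loading of associations, so this only fits if the linked objects the report needs can be fetched eagerly in the query.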

Ektor.





© Copyright 2014, Red Hat Inc. All rights reserved. JBoss and Hibernate are registered trademarks and servicemarks of Red Hat, Inc.