
All times are UTC - 5 hours [ DST ]



Forum locked This topic is locked, you cannot edit posts or make further replies.  [ 7 posts ] 
Author Message
 Post subject: Out of memory error
PostPosted: Thu Jul 01, 2004 3:35 am 
Newbie

Joined: Mon Feb 09, 2004 6:45 pm
Posts: 10
Location: Feltre, BL Italy
Hi all.
After days of unsuccessful trying and newsgroup reading, I eventually
have to ask for help.

My application is stand-alone, with an SWT/JFace-based GUI, then Spring, then Hibernate, then the DB (MySQL). No web, no Tomcat or anything like that. For the purposes of what follows no GUI is even needed (console-based fits). The object model was modeled using Poseidon, and the classes, factories and hbm.xml files were built using AndroMDA. The object model counts 20 classes and 20 relationships.
Everything works well for normal program use (users read items, change then save/update, add relationships, ...)

The new task is to convert a legacy database (DBF files) to the DB.
What I do is: in a business method (attributed as "transactionAttributeSource" and "PROPAGATION_REQUIRED" in the applicationContext.xml file)
I read 1000 item records, instantiate the item objects via
the AndroMDA-generated ItemFactory, set the attributes for the relationships,
and then save the item object. The other sides of the relationships are read via queries from the DB and cached in HashMaps (just a detail).
After processing 1000 records this way, Spring commits the transaction to the DB and returns control to my application; this business method is then called again until all DBF records are processed (each time in a brand new transaction, in a brand new session).
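[A pattern that often helps with exactly this symptom: every object saved or loaded stays attached to the Session's first-level cache until the Session is closed, so memory grows with the total number of objects that pass through it. A minimal sketch of the batch loop with a periodic flush/clear, against the Hibernate 2.x API — the item list and batch size here are placeholders, not code from the post:]

```java
import java.util.Iterator;
import java.util.List;

import net.sf.hibernate.HibernateException;
import net.sf.hibernate.Session;

public class BatchImport {

    /**
     * Saves every object in the list, flushing the pending INSERTs and
     * clearing the first-level cache every batchSize objects, so memory
     * use stays flat no matter how many records are processed.
     */
    public static void saveAll(Session session, List items, int batchSize)
            throws HibernateException {
        int count = 0;
        for (Iterator it = items.iterator(); it.hasNext();) {
            session.save(it.next());
            if (++count % batchSize == 0) {
                session.flush();  // execute the queued INSERT statements
                session.clear();  // detach the saved objects from the session
            }
        }
        session.flush(); // flush whatever remains before the commit
    }
}
```

[Note that the HashMaps caching the "other side" objects would also keep their entries attached across batches; re-reading or evicting those between batches may be needed as well.]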

Unfortunately, after processing about 20,000 items (the whole DB is about 80,000), I get an "out of memory" error (the Java process uses about 85 MB of memory).

I read in the newsgroup that Hibernate is not a good fit for "massive update/delete",
but all I ask is to process a batch of items at a time (now 1000, but it could be 100 or 10). Another thing I read in the newsgroup that could cause the memory problem is "holding the DB in memory": this does not apply to my case, since what I strive to do is to process n (1000) items at a time, no more, then process another 1000, and so on.
I really do not know where I'm going wrong.
Should I do it with direct SQL coding? I hope not, because I like Hibernate precisely because it spares me from dealing with UUID primary keys, low-level relationship management, hand-written (error-prone) SQL statements and so on.

Sorry for the long posting, but I'm stuck (and desperate). I know this topic is boring for the many top-notch programmers writing Hibernate and applications, but I hope they can point out where I'm wrong or show
me some piece of code to study/adapt.

Thanks in advance, best regards
davide

Hibernate 2.1
Spring 1.0.2
eclipse 3.0m9
windows 2k
I can send the *.hbm.xml files if needed (but they are trivial: all relationships are cascade="none", outer-join="false", update="true", insert="true"; on the other side: lazy="true", inverse="false", cascade="none", sort="unsorted", i.e. the default AndroMDA generation style).

error is:
2004-07-01 01:52:23,421 INFO org.springframework.transaction.interceptor.TransactionInterceptor - Invoking rollback for transaction on method 'importItemDBF' in class [item.ItemBusinessLogic] due to throwable [java.lang.OutOfMemoryError]


 Post subject:
PostPosted: Thu Jul 01, 2004 6:04 am 
CGLIB Developer
CGLIB Developer

Joined: Thu Aug 28, 2003 1:44 pm
Posts: 1217
Location: Vilnius, Lithuania
Use plain JDBC for imports, or better, a command-line utility:
dbf2csv | awk -f csv2sql | dbclient. An sqlloader-type utility can help too.
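[For the MySQL case specifically, the "no programming" route could be as simple as loading a CSV export of the DBF file directly; the file, table and column names below are placeholders, not from the post:]

```sql
-- load a CSV export of the DBF file straight into MySQL
LOAD DATA LOCAL INFILE 'items.csv'
INTO TABLE item
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(item_id, name, description);
```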


 Post subject:
PostPosted: Thu Jul 01, 2004 12:13 pm 
Newbie

Joined: Mon Feb 09, 2004 6:45 pm
Posts: 10
Location: Feltre, BL Italy
Thanks, baliukas, for considering my question.
I can write the SQL, but I do not know how to generate
the UUID primary keys in order to build tables that are later usable by Hibernate.
Any suggestions? Any other users experiencing these problems?

davide


 Post subject:
PostPosted: Thu Jul 01, 2004 1:08 pm 
CGLIB Developer
CGLIB Developer

Joined: Thu Aug 28, 2003 1:44 pm
Posts: 1217
Location: Vilnius, Lithuania
Use a sequence; Hibernate can use it too.
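[For what it's worth, the generator is a one-line change in the hbm.xml id mapping. A hedged sketch — property, column and sequence names are placeholders; note that MySQL itself has no sequences, so on MySQL class="native" (which falls back to auto_increment) would be the closest equivalent:]

```xml
<!-- on a sequence-capable database (Oracle, PostgreSQL, ...): -->
<id name="id" type="long" column="ITEM_ID">
    <generator class="sequence">
        <param name="sequence">item_id_seq</param>
    </generator>
</id>

<!-- on MySQL, "native" picks auto_increment instead: -->
<id name="id" type="long" column="ITEM_ID">
    <generator class="native"/>
</id>
```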


 Post subject:
PostPosted: Thu Jul 01, 2004 3:54 pm 
Beginner
Beginner

Joined: Mon Jun 07, 2004 4:21 pm
Posts: 44
Location: Boston
Just in case you haven't seen this:

http://forum.hibernate.org/viewtopic.php?t=930835


 Post subject:
PostPosted: Fri Jul 02, 2004 2:24 am 
Newbie

Joined: Mon Feb 09, 2004 6:45 pm
Posts: 10
Location: Feltre, BL Italy
Hi guys. Yes, I saw the link you provided. Things got a little better: I can now process as many as 50,000 records, but then the "out of memory" error arises again.
So this is not the solution, since my legacy DB is 80,000+ records big.
Changing the primary keys to sequences (thanks, baliukas) is possible,
but consider this:
I moved to AndroMDA+Hibernate in order to generate the DDL and hbm.xml files automatically and not to worry about
hand-writing SQL (DDL/DML), primary-key hassles and so on; you declare it in the UML model and you get it. I kicked these out of the door, and they are back in through the window.

But what is disturbing me is that there's no way I can understand what's happening: why on earth, once I close the session and the transaction (well, Spring does these for me), do I still have all that garbage around?
It is well possible that a user uses the procedure to process, let's say, 200 items at a time (the procedure maybe implemented via the Query or Criteria interfaces), then repeats it again and again and again, getting out of memory at the end of the day!
Where am I going wrong?

Thank you guys
davide


 Post subject:
PostPosted: Fri Jul 02, 2004 4:35 am 
CGLIB Developer
CGLIB Developer

Joined: Thu Aug 28, 2003 1:44 pm
Posts: 1217
Location: Vilnius, Lithuania
It must be some problem in your code. Hibernate is used in web applications mostly, and any server application would fail with a memory leak after a few queries if this were a Hibernate problem; I do not believe it is.

You can start the JVM with the -verbose:gc parameter to monitor garbage collection, and use a "small" heap for the test (as small as possible while still starting the application). I think it must be possible to test a stand-alone application with -Xmx64M.
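[Concretely, that suggestion amounts to launching the importer with something like the following; the jar and main class names are placeholders:]

```sh
java -verbose:gc -Xmx64m -cp app.jar item.ImportMain
```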

And do not use O/R mapping to import files; try to do it without any programming first. It is better to automate the import in a script: you can execute it from the command line or from the application, or put an icon on the desktop, and it will be possible to run the same script as a scheduled task too.






© Copyright 2014, Red Hat Inc. All rights reserved. JBoss and Hibernate are registered trademarks and servicemarks of Red Hat, Inc.