Hi all,
I have a CSV file with 216,680 lines. After a bit of processing, I load each record into the database.
I am not loading the whole file into memory; I read it record-by-record.
After processing about 100,000 records, I see a big drop in the performance of my program. I can see this because I print the number of records processed every 5 seconds.
Up to roughly the 100,000th record, about 480 records are processed every 5 seconds. After 100,000 records, only 1 record is processed every 5 seconds.
I have read the documentation, and I call
Session.evict(Object) after each Session.save(Object).
I also disabled caching. I ran a profiler and found that most of the time is spent in Session.flush(). I don't know what to do now.
I am using Hibernate 2.1.2.
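For what it's worth, here is a runnable sketch of the batched flush-and-clear loop I am considering instead of per-record evict (the StubSession below is just a stand-in for net.sf.hibernate.Session so the snippet compiles on its own; the batch size of 50 and all names are placeholders, not my real code):

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for net.sf.hibernate.Session: it only tracks how many
// objects are pending and how many flushes were issued.
class StubSession {
    final List<Object> pending = new ArrayList<Object>();
    int flushCount = 0;

    void save(Object o) { pending.add(o); }   // queue an insert
    void flush()        { flushCount++; }     // would push inserts to the DB
    void clear()        { pending.clear(); }  // would empty the session cache
}

public class BatchLoader {
    static final int BATCH_SIZE = 50;  // placeholder batch size

    // Save `total` records, flushing and clearing every BATCH_SIZE saves
    // so the session's internal collections never grow unbounded.
    static int load(StubSession session, int total) {
        for (int i = 1; i <= total; i++) {
            session.save("record-" + i);
            if (i % BATCH_SIZE == 0) {
                session.flush();   // write the current batch
                session.clear();   // detach everything just written
            }
        }
        session.flush();           // flush the final partial batch
        session.clear();
        return session.flushCount;
    }

    public static void main(String[] args) {
        StubSession s = new StubSession();
        int flushes = load(s, 216680);
        // 216680 / 50 = 4333 full batches, plus one final partial flush
        System.out.println("flushes=" + flushes + " pending=" + s.pending.size());
    }
}
```

The point of clearing in batches rather than evicting one object at a time is that flush() never has to walk a session holding 100,000+ entities, which is where my profiler says the time goes.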
Any suggestions, please?
-Vid.