Hi,
I have a DB of XML files. Each folder holds approx 12K files, and each file holds around 50 entities (each item is an occurrence of a class), which translates to approx 0.5M insert rows per folder. I'm getting Java heap size errors on some of the folders. I've trimmed my code everywhere, and quite frankly it's very simple code using JDOM.
Code:
try {
    // close any session left over from the previous folder
    if ((session != null) && (session.isOpen())) session.close();
    session = openSession();
    transaction = session.beginTransaction();   // beginTransaction() already starts the transaction

    File file = new File(path);
    if (file.isDirectory()) {
        files = file.listFiles();
        for (i = 0; i < files.length; i++) {
            saveToEntity(files[i] + "");         // parses one XML file and saves each of its entities
        }
    }

    transaction.commit();
    HibernateUtil.closeResources(session, transaction);   // closes the session as well
}
catch (Exception e) {
    if (files != null && i < files.length) System.out.println(files[i]);   // the file being processed when it failed
    e.printStackTrace();
}
saveToEntity does a session.save(stock) for each entity in the file; it should really be called saveToEntities, since each file holds many entities.

How can this be solved?
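For reference, saveToEntity looks roughly like the sketch below. This is simplified: the "stock"/"symbol"/"price" element names and the Stock properties are just placeholders, not my real mapping, and it assumes JDOM 1.x.

Code:
import java.io.File;
import java.util.Iterator;
import org.jdom.Document;
import org.jdom.Element;
import org.jdom.input.SAXBuilder;

// Simplified sketch of saveToEntity -- element and property names are placeholders.
// "session" is the same Hibernate Session field used in the loop above.
private void saveToEntity(String path) throws Exception {
    SAXBuilder builder = new SAXBuilder();
    Document doc = builder.build(new File(path));   // parse one XML file with JDOM
    Element root = doc.getRootElement();

    // each file holds ~50 entity occurrences
    for (Iterator it = root.getChildren("stock").iterator(); it.hasNext();) {
        Element e = (Element) it.next();
        Stock stock = new Stock();                  // Stock is my mapped entity class
        stock.setSymbol(e.getChildText("symbol"));
        stock.setPrice(Double.parseDouble(e.getChildText("price")));
        session.save(stock);                        // one insert row per entity
    }
}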