Hello everyone,
I have a simple program that basically dumps data from a DB (Oracle 9) to a file. The table is quite big (around 800k rows) and I get an OutOfMemoryError.
Below is the sample code that triggers the error; the error does not occur if I uncomment the line "emO.clear()".
It seems that the persistent objects don't get garbage collected unless I explicitly call clear() on the EntityManager every 1000 (or so) iterations.
This seems a little strange to me, because I would expect the MerceologiaArticoloSic objects to become eligible for garbage collection once the flow exits the while loop.
Am I wrong about this? If so, why?
Thanks in advance to everyone,
Riccardo
Hibernate version: 3.2 GA
Mapping documents: using Hibernate Annotations
Code between sessionFactory.openSession() and session.close():
EntityManager emO = emfOracle.createEntityManager();
PrintWriter out = new PrintWriter(new FileWriter("D:/temp/out.log"));
int maxResult = 1000;
int startResult = 0;
Query queryMerceologia = emO.createQuery("FROM MerceologiaArticoloSic");
queryMerceologia.setMaxResults(maxResult);
boolean over = false;
try {
    while (!over) {
        queryMerceologia.setFirstResult(startResult);
        List<MerceologiaArticoloSic> res = queryMerceologia.getResultList();
        for (MerceologiaArticoloSic merc : res) {
            out.println(merc.toString());
        }
        over = (res.size() == 0);
        startResult += maxResult;
        log.info("processed " + startResult);
        //emO.clear();
    }
} catch (Exception e) {
    log.error("Error: " + e.getMessage(), e);
    throw e;
} finally {
    emO.close();
    out.close();
}
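For context on why the entities are not collected: every entity returned by getResultList() stays managed, i.e. strongly referenced by the EntityManager's first-level cache, until clear() or close() is called. The GC cannot reclaim objects that are still reachable, even after the local variables go out of scope. The effect can be sketched with plain collections (this is an illustrative analogy, not Hibernate code; the class and variable names are made up):

```java
import java.util.ArrayList;
import java.util.List;

public class ClearDemo {
    public static void main(String[] args) {
        int totalRows = 10_000, pageSize = 1_000;
        List<Object> context = new ArrayList<>();   // stands in for the first-level cache

        int peakWithoutClear = run(totalRows, pageSize, context, false);
        context.clear();
        int peakWithClear = run(totalRows, pageSize, context, true);

        System.out.println("peak without clear: " + peakWithoutClear); // grows to totalRows
        System.out.println("peak with clear: " + peakWithClear);       // stays at pageSize
    }

    static int run(int totalRows, int pageSize, List<Object> context, boolean clearEachPage) {
        int peak = 0;
        for (int start = 0; start < totalRows; start += pageSize) {
            // "loading" one page keeps a reference to every row in the context
            for (int i = 0; i < pageSize; i++) {
                context.add(new Object());
            }
            peak = Math.max(peak, context.size());
            if (clearEachPage) {
                context.clear();                    // analogous to emO.clear()
            }
        }
        return peak;
    }
}
```

With 800k rows the uncleared context holds 800k managed entities at once, which is what exhausts the heap; clearing per page bounds the live set to one page.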
Oracle 9