Hi,
I have written an embedded EJB application which extracts data from a series of CSV files and writes it to a MySQL 4.0.27 database.
There are ~25,000 records that need to be imported into the database (spread across a number of different CSV files). I want to write this data in a single transaction so that if there is a problem processing any of the files, I can just stop and none of the data will be committed to the database.
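In case it helps, here is the shape of the logic I'm after, minus the EJB/persistence plumbing (class names, the record width of 3, and the parsing rules below are all illustrative, not my actual code): parse everything, fail fast on any bad record, and only persist/commit if the whole run succeeds.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AllOrNothingImport {

    /** Parse one CSV line into trimmed fields (simple case: no quoted commas). */
    static List<String> parseLine(String line) {
        List<String> fields = new ArrayList<String>();
        for (String f : line.split(",", -1)) {
            fields.add(f.trim());
        }
        return fields;
    }

    /**
     * Walk every line of every file; if any line is malformed, throw before
     * anything reaches the database, so the surrounding transaction can
     * simply roll back and nothing is committed.
     */
    static List<List<String>> importAll(List<List<String>> files) {
        List<List<String>> records = new ArrayList<List<String>>();
        for (List<String> file : files) {
            for (String line : file) {
                List<String> fields = parseLine(line);
                if (fields.size() != 3) { // assumed record width for illustration
                    throw new IllegalArgumentException("Bad record: " + line);
                }
                records.add(fields);
            }
        }
        return records; // real code: persist each record here, commit at the end
    }
}
```

In the real application each parsed record becomes a persisted entity inside the one transaction, which is exactly why they all pile up in the first-level cache.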
The problem I am having is effectively a memory leak. As I understand it, wrapping the insertion of the data in a transaction like this causes the objects to be held in the first-level cache until they are committed to the database. Since I am processing a large number of files, this results in an out-of-memory exception before I have processed all of the data - and although I say large, I don't see this number of records as extraordinarily large, just large enough to cause me problems :). I have come across ways of configuring the second-level cache, some of which cache the data to disk. Is it possible to do something similar for the first-level cache? The closest I have got is articles such as
http://www.informit.com/articles/article.asp?p=353736&seqNum=5&rl=1 which says something along the lines of "Nobody ever configures first-level cache". Or maybe there is some other way round my problem.
Thanks.
Simon