Hello,
I was wondering whether Hibernate can be used for a scenario that involves reading a huge number of records from two tables, comparing the two types of data (one from each table) based on some business rules (e.g. table1.column2 = table2.column5, table1.column3 BETWEEN table2.column6 AND table2.column10), categorizing each pair as matched or unmatched, and writing the result back to the database. To give you an idea of the scale, the requirement is to process over 75 million records per day; each table has 30+ columns, and the data types are mostly numeric (int).
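To make the rules concrete, here is a minimal sketch of the comparison in plain Java, assuming each row has been mapped to a simple object (as Hibernate would give us). The Row1/Row2 classes and field names are placeholders that mirror the columns mentioned above:

```java
public class MatchRules {
    // Hypothetical mapped objects; field names mirror the columns in the rules.
    static class Row1 { int column2; int column3;
        Row1(int c2, int c3) { column2 = c2; column3 = c3; } }
    static class Row2 { int column5; int column6; int column10;
        Row2(int c5, int c6, int c10) { column5 = c5; column6 = c6; column10 = c10; } }

    // Business rules from the question:
    //   table1.column2 = table2.column5
    //   table1.column3 BETWEEN table2.column6 AND table2.column10
    static boolean matches(Row1 a, Row2 b) {
        return a.column2 == b.column5
            && a.column3 >= b.column6
            && a.column3 <= b.column10;
    }

    public static void main(String[] args) {
        Row1 a = new Row1(7, 50);
        System.out.println(matches(a, new Row2(7, 10, 100)));  // true: key equal, 50 in [10, 100]
        System.out.println(matches(a, new Row2(7, 60, 100)));  // false: 50 below the range
        System.out.println(matches(a, new Row2(8, 10, 100)));  // false: join key differs
    }
}
```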
The comparison would be much easier to write with objects (Hibernate) than with raw JDBC. However, the job doesn't update existing data, so transactions might be pure overhead. Which would be more efficient: reading 100k records in one shot (query.list()) or reading records in batches of 1k? (The server is a high-end machine, the database is on the local LAN, and bandwidth is not a problem.) Also, how efficiently could Hibernate be combined with J2SE 5.0's java.util.concurrent, say, reading the two types of data via two threads running in parallel and using a ConcurrentHashMap to hold the loaded objects?
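The two-reader idea could be sketched roughly as below. This is only an outline under my assumptions: the loadTable*Keys methods are stand-ins for the real Hibernate queries (which would page through each table in batches), and for brevity only the join keys are compared, with "matched" taken as the intersection of the two key sets:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelLoad {
    // Placeholders for the two Hibernate reads; in practice each would
    // page through its table (e.g. setFirstResult/setMaxResults) in batches.
    static List<Integer> loadTable1Keys() { return Arrays.asList(1, 2, 3); }
    static List<Integer> loadTable2Keys() { return Arrays.asList(2, 3, 4); }

    static Set<Integer> matchedKeys() throws InterruptedException {
        // One concurrent map per table, filled by its own reader thread.
        final ConcurrentMap<Integer, Boolean> keys1 = new ConcurrentHashMap<Integer, Boolean>();
        final ConcurrentMap<Integer, Boolean> keys2 = new ConcurrentHashMap<Integer, Boolean>();
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.execute(new Runnable() { public void run() {
            for (Integer k : loadTable1Keys()) keys1.put(k, Boolean.TRUE);
        }});
        pool.execute(new Runnable() { public void run() {
            for (Integer k : loadTable2Keys()) keys2.put(k, Boolean.TRUE);
        }});
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);

        // Once both loads have finished, "matched" is the key intersection;
        // anything left over in either map would be "unmatched".
        Set<Integer> matched = new HashSet<Integer>(keys1.keySet());
        matched.retainAll(keys2.keySet());
        return matched;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(matchedKeys());
    }
}
```

With the placeholder data above, the matched set is {2, 3}. A real version would of course store whole mapped objects (not just keys) and apply the full business rules rather than a set intersection.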
What do you think?
Thanks,
katyan.