Hi,
I just finished a project doing exactly that: 50 tables and 5 GB of data each day, dispatched to 2 different systems. Crash-proof, generic and configurable (Spring), with a configurable thread pool...
First, I don't think Hibernate is designed for this kind of data transfer. That doesn't mean you can't build your application's persistence layer with it, but you may need something else for the transfer itself; Hibernate will be too slow. The kind of process you are talking about is called ETL: Extract, Transform and Load.
http://www.computerworld.com/databasetopics/businessintelligence/datawarehouse/story/0,10801,89534,00.html
Plain JDBC with batch inserts is probably an OK way to do this, if your JDBC driver/database supports batch updates. For the JDBC and transaction handling, I would use the Spring JdbcTemplate (+ RowCallbackHandler) to load your data.
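Here is roughly what that looks like (just a sketch: the customer table, columns and batch size are invented, and batchUpdate(String, List) assumes a reasonably recent Spring version):

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowCallbackHandler;

public class CustomerTransfer {

    private static final int BATCH_SIZE = 1000; // tune per driver/database

    private final JdbcTemplate source;
    private final JdbcTemplate target;
    private final List<Object[]> buffer = new ArrayList<Object[]>();

    public CustomerTransfer(DataSource sourceDs, DataSource targetDs) {
        this.source = new JdbcTemplate(sourceDs);
        this.target = new JdbcTemplate(targetDs);
    }

    public void transfer() {
        // RowCallbackHandler streams rows one at a time, so the rows are
        // never all in memory at once.
        source.query("SELECT id, name FROM customer", new RowCallbackHandler() {
            public void processRow(ResultSet rs) throws SQLException {
                buffer.add(new Object[] { rs.getLong("id"), rs.getString("name") });
                if (buffer.size() >= BATCH_SIZE) {
                    flush();
                }
            }
        });
        flush(); // write the last partial batch
    }

    private void flush() {
        if (buffer.isEmpty()) return;
        // One round trip per batch instead of one per row.
        target.batchUpdate("INSERT INTO customer (id, name) VALUES (?, ?)", buffer);
        buffer.clear();
    }
}
```

The key point is that rows stream through a small buffer instead of being loaded as mapped entities, which is exactly where Hibernate would hurt you.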
But for 10M records, you will need speed.
If you are using Sybase, you can use the BCP-IN bulk loader to upload your data for more speed. This is what we chose. Also check whether dropping the indexes before inserting buys you even more speed.
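If you drive BCP from the Java side, shelling out to it works. A sketch (the database, table and file names are made up; -c is character mode, -U/-P/-S are the usual login and server options):

```java
import java.io.IOException;

public class BcpLoad {

    public static void main(String[] args) throws IOException, InterruptedException {
        // Runs: bcp warehouse..customer in /data/customer.dat -c -U etl_user -P secret -S SYBASE_PROD
        Process p = new ProcessBuilder(
                "bcp", "warehouse..customer", "in", "/data/customer.dat",
                "-c", "-U", "etl_user", "-P", "secret", "-S", "SYBASE_PROD")
            .inheritIO() // show bcp's own progress output
            .start();

        if (p.waitFor() != 0) {
            // A non-zero exit means the load failed; don't let apps see this copy.
            throw new IllegalStateException("bcp failed");
        }
    }
}
```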
You seem to be saying that your apps cannot read the table while it is loading... Do you plan to transfer all the data each time? Then you need to drop the connections, drop the indexes, empty the table, load, recreate the indexes... and reconnect. You also have to drop/recreate the foreign keys... If you load the data partially, you need a strategy for loading only the part you need (with SQL). Also, don't forget to validate your data, with some kind of SQL query, before letting the apps reconnect...
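The full-reload sequence, in one place (just a sketch: the table, index and constraint names are invented, and the exact DDL syntax depends on your database):

```java
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class FullReload {

    public void reload(DataSource targetDs, Runnable bulkLoad) {
        JdbcTemplate jdbc = new JdbcTemplate(targetDs);

        // Foreign keys first, then indexes, then the data.
        jdbc.execute("ALTER TABLE customer DROP CONSTRAINT fk_customer_account");
        jdbc.execute("DROP INDEX customer_name_idx");
        jdbc.execute("TRUNCATE TABLE customer");

        bulkLoad.run(); // batch inserts or BCP-IN, as above

        jdbc.execute("CREATE INDEX customer_name_idx ON customer (name)");
        jdbc.execute("ALTER TABLE customer ADD CONSTRAINT fk_customer_account "
                + "FOREIGN KEY (account_id) REFERENCES account (id)");

        // Validate before letting the apps back in.
        int rows = jdbc.queryForObject("SELECT COUNT(*) FROM customer", Integer.class);
        if (rows == 0) {
            throw new IllegalStateException("Empty load; keep apps off this copy");
        }
    }
}
```

The COUNT(*) at the end is the kind of validation query I mean.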
We used 2 databases, and the Hibernate layer knows whether to get its connection from data source A or data source B by checking a control database/table.
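Spring's AbstractRoutingDataSource is one way to wire that up. A rough sketch (the etl_control table and its columns are invented, and the two real data sources would be registered under the keys "A" and "B" via setTargetDataSources):

```java
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class ActiveCopyRoutingDataSource extends AbstractRoutingDataSource {

    private final JdbcTemplate control; // points at the small control database

    public ActiveCopyRoutingDataSource(JdbcTemplate control) {
        this.control = control;
    }

    @Override
    protected Object determineCurrentLookupKey() {
        // Returns "A" or "B"; the loader flips this flag only after a
        // successful, validated load of the offline copy.
        return control.queryForObject(
                "SELECT active_copy FROM etl_control WHERE job = 'nightly'",
                String.class);
    }
}
```

This way the apps keep reading one copy while you reload the other, and the switch is a one-row update.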
Also, your app needs to be crash-proof. When a load fails (and it will), you will probably want to restart the loading where it crashed... And you will probably need to extract and load in a multi-threaded way if you have many tables to load...
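For example (a minimal sketch; TableLoader is hypothetical, the idea being that it records a checkpoint, say the last committed key, in the control table so a rerun resumes instead of starting over):

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class EtlRunner {

    public static void main(String[] args) throws InterruptedException {
        List<String> tables = Arrays.asList("customer", "account", "order_line");
        ExecutorService pool = Executors.newFixedThreadPool(4); // pool size configurable

        for (final String table : tables) {
            pool.submit(() -> new TableLoader(table).loadFromCheckpoint());
        }

        pool.shutdown();                          // no new tasks
        pool.awaitTermination(2, TimeUnit.HOURS); // wait for in-flight loads
    }
}

class TableLoader {
    private final String table;

    TableLoader(String table) { this.table = table; }

    void loadFromCheckpoint() {
        // Read the last committed key for this table from the control table,
        // then resume the extract/load loop from there (details omitted).
    }
}
```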
Good luck.

Regards,
Etienne.