Well, I still don't have a clear idea of what you are trying to do, but let's see...
To work on massive updates/insertions/etc., older Hibernate versions suggested using plain JDBC, and I think that suggestion still holds for newer versions.
Anyway, from your explanation I can't see how the session cache may harm you. You could clear the session cache at the end of every transaction, but I don't think your situation needs it (take a look at session.evict(object) and CacheMode).
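As a minimal sketch (Hibernate 3.x-style API; sessionFactory, MyEntity and someId are just placeholders, not from your code), clearing the first-level cache once a unit of work is done looks roughly like this:

```java
import org.hibernate.CacheMode;
import org.hibernate.Session;
import org.hibernate.Transaction;

Session session = sessionFactory.openSession();
session.setCacheMode(CacheMode.IGNORE); // optional: don't interact with the second-level cache
Transaction tx = session.beginTransaction();

MyEntity e = (MyEntity) session.get(MyEntity.class, someId);
// ... work with e ...

session.evict(e);   // drop just this instance from the session cache
// session.clear(); // or drop everything the session is tracking

tx.commit();
session.close();
```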
Moreover, if you are retrieving records one by one, you should activate the batch-size settings to let Hibernate manage that, and you should try to work on subsets of 20, 50, 100 or more records at a time to let Hibernate optimize the querying.
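Working in subsets could look roughly like this (again a Hibernate 3.x-style sketch; it assumes hibernate.jdbc.batch_size is set in your configuration, and the Record entity and the HQL are just placeholders for your own model):

```java
import org.hibernate.CacheMode;
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.Transaction;

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

ScrollableResults results = session
        .createQuery("from Record")
        .setCacheMode(CacheMode.IGNORE)
        .scroll(ScrollMode.FORWARD_ONLY);

int count = 0;
while (results.next()) {
    Record r = (Record) results.get(0);
    // ... update r ...
    if (++count % 50 == 0) {   // work in subsets of 50
        session.flush();       // push the pending updates as a JDBC batch
        session.clear();       // and empty the session cache
    }
}

tx.commit();
session.close();
```

Note that the batch-size attribute on your mappings covers the fetching side, while hibernate.jdbc.batch_size covers the writes.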
Here is an example of batch-size use:
http://blogs.warwick.ac.uk/kieranshaw/entry/garbage_collection_and/
You should also look at the documentation for the other tags and settings related to batching.
If you are worried about filling your cache with every one of the 300,000 records you are going to work on, you can evict them from the session cache and let the GC do its work, or set the cache size so the oldest instances fall out.
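Evicting inside the processing loop could be as simple as this (batchOfRecords and process(...) are placeholders for your own chunk and logic):

```java
for (Record r : batchOfRecords) {
    process(r);        // your own processing logic
    session.evict(r);  // detach the instance so it can be garbage collected
}
```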
I hope this helps.
Emilio