sball wrote:
Most likely you'd overflow the heap size of your JVM with this many rows all being cached simultaneously. You'd probably also want to create an ehcache.xml configuration specifying how many you want to keep alive at a time, or if you want to overflow to disk.
Thanks for the reply, Pal.
Yes, my JVM will crash if I load the entire result set into memory. I am trying to figure out whether I can load the results into the second-level cache and use its disk caching feature. Following the documentation, I managed to get the entities into the query cache, but even though I specify a CacheProvider for the hibernate.cache.provider_class property, I can see that the entities are not being cached. Any pointers on this?
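For what it is worth, this is the sort of setup I understand is needed from the Hibernate 3.x docs (the Order class and mapping below are just examples on my side) - please tell me if I have misread something:

    hibernate.cache.provider_class=org.hibernate.cache.EhCacheProvider
    hibernate.cache.use_second_level_cache=true
    hibernate.cache.use_query_cache=true

    <!-- the entity mapping also needs a cache element, otherwise nothing goes into the second-level cache -->
    <class name="com.example.Order" table="ORDERS">
        <cache usage="read-only"/>
        <!-- id and property mappings omitted -->
    </class>

    // queries only populate the query cache when they are explicitly marked cacheable
    Query query = session.createQuery("from Order");
    query.setCacheable(true);
    List orders = query.list();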
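And for the ehcache.xml that you mentioned, this is the kind of configuration I was going to try for the disk overflow (the cache names, sizes and timeouts below are only guesses on my part):

    <ehcache>
        <!-- overflowed elements are written here -->
        <diskStore path="java.io.tmpdir"/>

        <!-- fallback settings for regions that are not listed explicitly -->
        <defaultCache maxElementsInMemory="10000"
                      eternal="false"
                      timeToIdleSeconds="300"
                      timeToLiveSeconds="600"
                      overflowToDisk="true"/>

        <!-- keep a bounded number of entities in memory, spill the rest to disk -->
        <cache name="com.example.Order"
               maxElementsInMemory="10000"
               eternal="false"
               timeToIdleSeconds="300"
               timeToLiveSeconds="600"
               overflowToDisk="true"
               diskPersistent="false"/>

        <!-- regions used by the Hibernate query cache -->
        <cache name="org.hibernate.cache.StandardQueryCache"
               maxElementsInMemory="1000"
               eternal="false"
               timeToLiveSeconds="600"
               overflowToDisk="true"/>
        <cache name="org.hibernate.cache.UpdateTimestampsCache"
               maxElementsInMemory="5000"
               eternal="true"
               overflowToDisk="false"/>
    </ehcache>

Would a setup like this be enough to keep the JVM from running out of heap while the full result set is processed, or does the query cache still hold everything in memory?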