fatal2 wrote:
Hi fregas,
I'm not sure if it does have this ability, but I would certainly welcome it. It would definitely be useful to be able to stream objects through memory so that you can process large batches of objects without having to load them all into memory first.
cheers,
Paul.
Honestly, I think you guys may be on the wrong track. If your goal is performance, you may run into some big issues with the given approach: making 20,000 database calls to load one object at a time will certainly hurt performance.
First, I would say that 20,000 records really isn't that many. If you need to load all 20,000, let's be pessimistic and say your objects are 1 KB each; that's about 20 MB of data, which shouldn't be a problem for most systems. Working against objects that are already in memory is far cheaper than going back to the database for every single one, so loading all 20,000 up front should be much faster.
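To make that concrete, here is a minimal sketch of loading the whole batch in one query. It assumes a hypothetical Customer entity, an already-configured sessionFactory, and your own Process() routine; none of these names come from the earlier posts, so adjust them to your mapping.

// Assumptions: sessionFactory is an ISessionFactory you built elsewhere,
// Customer is a hypothetical mapped entity, Process() is your own per-object work.
using (ISession session = sessionFactory.OpenSession())
{
    // One round trip to the database; the whole result set sits in memory at once.
    IList customers = session.CreateQuery("from Customer").List();

    foreach (Customer customer in customers)
    {
        Process(customer);
    }
}

Compare that single query with Enumerable(), which issues a separate load for each object as you iterate.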
One pitfall to this argument is that NHibernate does not seem to scale linearly when it comes to flushing dirty objects: dirty checking appears to grow worse than linearly with the number of objects in the session.
That being said, the Enumerable approach has the same issue. The objects do not go out of memory when you loop to the next one; they remain attached to the session, which keeps a reference to them, so the garbage collector cannot free them and each one still has to be dirty-checked on flush. You would not be saving any memory anyway. So in either case, once you are finished with an object you need to evict it from the session to avoid the dirty-check cost and release the memory.
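Something along these lines, with the same hypothetical Customer and Process() names as above:

using (ISession session = sessionFactory.OpenSession())
{
    // Enumerable() loads each object from the database as you iterate.
    foreach (Customer customer in session.CreateQuery("from Customer").Enumerable())
    {
        Process(customer);

        // Detach the object once you are done with it so it no longer takes part
        // in dirty checking and can be garbage collected. If Process() changed it,
        // flush (or save) before evicting, or the change will be lost.
        session.Evict(customer);
    }
}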
Also, take a look at some of the other query features in NHibernate, such as SetFirstResult() and SetMaxResults(), which let you specify which range of the query results you want returned. That lets you run fewer queries while still processing your input in batches. There are some things to watch out for, but this will perform far better than the Enumerable method.
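A rough sketch of that paging pattern, again with the hypothetical Customer entity and an arbitrary page size of 500. The order-by on Id is one of those things to watch out for: paging only works reliably if the result order is stable between queries.

// Assumes Customer has an Id property; pick whatever stable ordering fits your mapping.
const int pageSize = 500;
int firstResult = 0;

using (ISession session = sessionFactory.OpenSession())
{
    while (true)
    {
        // Fetch one page of results.
        IList page = session.CreateQuery("from Customer c order by c.Id")
                            .SetFirstResult(firstResult)
                            .SetMaxResults(pageSize)
                            .List();

        if (page.Count == 0)
            break;

        foreach (Customer customer in page)
        {
            Process(customer);
        }

        // Clear the session so the finished page can be collected and won't be
        // dirty-checked again (flush first if you modified anything).
        session.Clear();
        firstResult += pageSize;
    }
}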