Regarding the second-level cache -- we tried it, but it didn't seem to help: nearly all of our data retrieval goes through HQL rather than loading individual entities by ID, and query results don't come out of the second-level cache unless the query cache is enabled on top of it.
Anyway, our main issue is the conflict that arises when you want to reattach a detached object while a different instance with the same ID is already in the session. We can't seem to completely avoid that, and to handle the general case, our own Attach() front-end method has to take the entity by reference -- if the session already holds another instance with the same ID, the caller is forced to switch all of its references over to that instance. And since a ref parameter's type has to match exactly, passing any entity by reference means the parameter has to be of type object, which in turn means the code calling Attach() has to copy the entity into an object variable first. So reattaching just one object ends up being a mess like this:
Code:
object entity = typedVarWithRealEntity;
EntitySession.Attach(ref entity);
if (!object.ReferenceEquals(entity, typedVarWithRealEntity))
{
    // The session already held another instance with this ID;
    // switch over to it (cast back to the variable's real type).
    typedVarWithRealEntity = (MyEntity)entity;
    // ... and reset any additional references to the entity
}
// Now after all that crap we can finally touch a property
// without fearing "LazyInitializationException: No session"
As you can see, in methods where I have to drill five levels deep through entity/collection properties, and every single entity I need to touch has to be reattached this way, about 80% of the code becomes this clunky reattachment plumbing. What should be 10 or 20 lines of clear code balloons to at least five times that amount of confusing code ...
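The one way I can see to shrink the caller-side boilerplate, if we're allowed to change the Attach() signature, is a generic wrapper over the object-based method. This is just a sketch (EntitySession and Attach(ref object) stand in for our real front-end, and it assumes the resolved instance is always castable back to T):
Code:
// Generic front-end over the object-based Attach() -- a sketch, not our actual API.
public static class EntitySession
{
    // Existing method (simplified): replaces 'entity' with the session's
    // instance if one with the same ID is already attached.
    public static void Attach(ref object entity) { /* ... */ }

    // Generic wrapper: the caller keeps a typed variable and skips the
    // box-into-object / ReferenceEquals / cast-back dance.
    public static void Attach<T>(ref T entity) where T : class
    {
        object boxed = entity;
        Attach(ref boxed);
        entity = (T)boxed; // same instance, or the one already in the session
    }
}
Then the call site collapses to EntitySession.Attach(ref typedVarWithRealEntity); -- though if the instance was swapped, the caller still has to reset any *other* references it holds to the old one, so this only removes the plumbing, not the underlying problem.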
The only thing I can think of to avoid it entirely is to write our own cache provider that raises an event whenever a cached object is requested. Every service that keeps detached objects would listen for that event and hand the cache provider its detached copy of the object, if it has one. If nothing responds, the provider falls back on one of the standard cache providers ...
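Roughly what I have in mind, sketched against NHibernate's ICache interface (the event type and class names are made up, only the Get/fallback shape is real -- and leaving aside that the second-level cache normally stores dehydrated entity state rather than live instances, which this idea would have to deal with):
Code:
// Sketch: a cache that first asks registered services for their detached
// copy of the requested object, then falls back to a standard inner cache.
public class DetachedObjectRequestedArgs : System.EventArgs
{
    public object Key { get; set; }
    public object Result { get; set; } // a listener sets this if it has the object
}

public class DetachedFirstCache : NHibernate.Cache.ICache
{
    private readonly NHibernate.Cache.ICache inner;

    // Services holding detached objects subscribe here.
    public static event System.EventHandler<DetachedObjectRequestedArgs> DetachedObjectRequested;

    public DetachedFirstCache(NHibernate.Cache.ICache inner) { this.inner = inner; }

    public object Get(object key)
    {
        var args = new DetachedObjectRequestedArgs { Key = key };
        DetachedObjectRequested?.Invoke(this, args);
        // Fall back to the standard provider if no service had the object.
        return args.Result ?? inner.Get(key);
    }

    // Put, Remove, Clear, Destroy, Lock, Unlock, NextTimestamp, Timeout,
    // RegionName ... all just delegate straight to 'inner'.
}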