Hibernate version: 2.1
I discovered some strange behavior in the way Hibernate remembers which objects don't exist in the current session when joined-subclass mappings are involved. I'd like to hear whether this is in fact a bug or whether the behavior is by design.
Suppose you have a type hierarchy where B extends A and C extends A. Further suppose there is a persistent instance of B with an ID of 1234. If you try to get C first using 1234 and (rightly) fail to load the object, Hibernate remembers that the object doesn't exist for the entire A type hierarchy, not just for C. So if you then later try to load B in the same session, it reports that the object doesn't exist. Here is an example in code:
Code:
Session s = ...;
// c is correctly null: no C with id 1234 exists.
C c = (C) s.get(C.class, new Long(1234));
// b is incorrectly null, because Hibernate has remembered that
// 1234 does not exist for the entire A hierarchy.
B b = (B) s.get(B.class, new Long(1234));
One workaround is to always load the object as the root class of the persistent hierarchy whenever you are unsure of its concrete type, but this leads to inefficient reads and potentially unnecessary object creation.
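For reference, a minimal sketch of that workaround, using the A/B classes and id from the example above (assuming A is the mapped root class):
Code:
Session s = ...;
// Load through the root class, so a miss is only recorded against A.
A a = (A) s.get(A.class, new Long(1234));
if (a instanceof B) {
    B b = (B) a;
    // safe to work with b here
}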
Is this behavior by design?