I'm not sure whether this behavior is a problem or not, but it does seem questionable to me. I've checked the forums and docs but haven't found an adequate explanation of why it behaves this way. So, if anyone can shed some light, I would greatly appreciate it.
import java.util.Set;

class Trust {
    private Set trustees;
    public Set getTrustees() { return this.trustees; }
    public void setTrustees(Set trustees) { this.trustees = trustees; }
}
Assuming the above class, doing the following (pseudo-code) fails:
Trust trust = Hibernate.find(key);
trust.setTrustees(new HashSet());
Hibernate.save(trust);
Trust found = Hibernate.find(key);
assertTrue(trust.getTrustees().size()==found.getTrustees().size());
However, this works:
Trust trust = Hibernate.find(key);
trust.getTrustees().clear();
Hibernate.save(trust);
Trust found = Hibernate.find(key);
assertTrue(trust.getTrustees().size()==found.getTrustees().size());
I understand that the collections in objects retrieved from Hibernate are specialized Hibernate collection wrappers, but this still seems faulty to me: the first case is a valid representation of the object state we want saved, yet Hibernate appears to ignore it.
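For what it's worth, the workaround I've settled on is to mutate the persistent collection in place rather than replace the reference, which (as far as I can tell) keeps Hibernate's wrapper and its dirty-tracking intact. A minimal sketch against the real Hibernate 2.x Session API (my snippets above were pseudo-code, and the class name and the injected SessionFactory here are just placeholders):

import java.io.Serializable;
import java.util.Set;

import net.sf.hibernate.HibernateException;
import net.sf.hibernate.Session;
import net.sf.hibernate.SessionFactory;
import net.sf.hibernate.Transaction;

public class TrusteeUpdater {
    private final SessionFactory sessionFactory; // configured elsewhere

    public TrusteeUpdater(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // Replaces the trustees without discarding Hibernate's collection
    // wrapper, so the session can still track adds and removes.
    public void replaceTrustees(Serializable key, Set newTrustees)
            throws HibernateException {
        Session session = sessionFactory.openSession();
        try {
            Transaction tx = session.beginTransaction();
            Trust trust = (Trust) session.load(Trust.class, key);
            trust.getTrustees().clear();             // mutate in place...
            trust.getTrustees().addAll(newTrustees); // ...instead of setTrustees(new HashSet())
            tx.commit();                             // changes are flushed on commit
        } finally {
            session.close();
        }
    }
}

It works, but having to know this distinction at every call site is exactly what feels wrong to me.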
If anyone can explain why this is happening - whether it's a configuration issue on my part, an architectural decision, or an actual defect - I would greatly appreciate it.
btw, I believe this is still Hibernate 2.1.16.
Thanks