Hi All,
I am facing something that I suspect to be a bug. I have mapped a collection as follows:
Code:
@CollectionOfElements(fetch = FetchType.EAGER)
@Cascade(value = CascadeType.ALL)
@JoinTable(name = "person_address", joinColumns = @JoinColumn(name = "person_id", nullable = false))
@SequenceGenerator(name = "MySequence", sequenceName = "person_address_seq", allocationSize = 1)
@CollectionId(columns = @Column(name = "collection_id"), type = @org.hibernate.annotations.Type(type = "long"), generator = "MySequence")
private Collection<Address> addresses = new ArrayList<Address>();
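For completeness, Address here is a simple embeddable value class, roughly like this (simplified; the real class has more fields and carries the @Embeddable annotation):

```java
import java.io.Serializable;

// Simplified stand-in for the collection element type; in the real mapping
// this class is annotated @Embeddable and has more fields.
public class Address implements Serializable {
    private String street;
    private String city;

    public Address() {}

    public Address(String street, String city) {
        this.street = street;
        this.city = city;
    }

    public String getStreet() { return street; }
    public String getCity() { return city; }
}
```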
My queries return duplicate addresses. It seems that for each row returned in the result set, the address is added to the collection even if it has the same collectionId as an address that was already added.
Looking at the source code, we can see that the elements read are indeed added to the field PersistentIdentifierBag.values without any check for duplicates:
Code:
public class PersistentIdentifierBag {
    protected List values; //element
    protected Map identifiers; //index -> id
    ...
    public Object readFrom(
            ResultSet rs,
            CollectionPersister persister,
            CollectionAliases descriptor,
            Object owner)
            throws HibernateException, SQLException {
        Object element = persister.readElement( rs, owner, descriptor.getSuffixedElementAliases(), getSession() );
        Object old = identifiers.put(
                new Integer( values.size() ),
                persister.readIdentifier( rs, descriptor.getSuffixedIdentifierAlias(), getSession() )
        );
        if ( old==null ) values.add(element); //maintain correct duplication if loaded in a cartesian product
        return element;
    }
}
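To see the effect without Hibernate, here is a minimal plain-Java simulation of that readFrom logic (class and method names are mine, not Hibernate's). Because the map key is the current list index, a repeated row from a cartesian product never collides with an earlier one, so the element is added twice even though its collection id is identical:

```java
import java.util.*;

public class IdentifierBagDemo {
    static List<Object> values = new ArrayList<>();
    static Map<Integer, Object> identifiers = new HashMap<>();

    // Mirrors PersistentIdentifierBag.readFrom: the map key is the current
    // list index, so a repeated (id, element) row never produces a collision.
    static void readRow(Object rowId, Object element) {
        Object old = identifiers.put(values.size(), rowId);
        if (old == null) values.add(element);
    }

    public static void main(String[] args) {
        readRow(42L, "address-1");
        readRow(42L, "address-1"); // same collection id, same element
        System.out.println(values); // the element appears twice
    }
}
```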
I am wondering whether it is normal that elements with the same collectionId can be contained in a PersistentIdentifierBag. I would be grateful if anybody could enlighten me.
On reflection, I really get the impression that it is a bug, because if we consider this to be normal behavior, it means that an identifier bag has the same semantics as a plain bag. Therefore it should have the same limitations: we should not be able to fetch multiple bags simultaneously in one query (which is what I am doing), and an exception should be thrown.
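In the meantime I am working around it by collapsing duplicates after loading; a minimal sketch (assuming the element class implements equals/hashCode consistently):

```java
import java.util.*;

public class DedupeWorkaround {
    // Collapse duplicate elements after loading, preserving iteration order.
    // This only helps when equal elements really are the same logical entry.
    static <T> List<T> dedupe(Collection<T> loaded) {
        return new ArrayList<>(new LinkedHashSet<>(loaded));
    }

    public static void main(String[] args) {
        List<String> loaded = Arrays.asList("a", "b", "a");
        System.out.println(dedupe(loaded)); // [a, b]
    }
}
```

Of course this defeats the point of an idbag, which is supposed to allow genuine duplicates, so it is only a stopgap.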
Thx,
Tiggy