
All times are UTC - 5 hours [ DST ]



Forum locked This topic is locked, you cannot edit posts or make further replies.  [ 4 posts ] 
Author Message
 Post subject: collection problem --- please help!
PostPosted: Thu Nov 04, 2004 9:27 pm 
Beginner

Joined: Fri Feb 27, 2004 3:49 am
Posts: 25
I have a problem persisting a parent and its children. Below are my mapping files and the error message I get.

Mapping documents:

<hibernate-mapping>
    <class name="scratch.Child" table="child" dynamic-update="false"
           dynamic-insert="false">

        <id name="id" column="ID" type="long" unsaved-value="0">
            <generator class="native"/>
        </id>

        <version name="version" type="integer" column="VERSION"
                 access="property" unsaved-value="null"/>

        <many-to-one name="receivePack" class="scratch.Parent" column="RECEIVE_PACK"
                     cascade="none" outer-join="auto" update="true" insert="true"
                     access="property" not-null="true"/>
    </class>
</hibernate-mapping>

<hibernate-mapping>
    <class name="scratch.Parent" table="parent" dynamic-update="false"
           dynamic-insert="false">

        <id name="id" column="id" type="java.lang.Integer">
            <generator class="assigned"/>
        </id>

        <version name="version" type="integer" column="version"
                 access="property" unsaved-value="null"/>

        <list name="packItems" lazy="false" inverse="true"
              cascade="all-delete-orphan">
            <key column="RECEIVE_PACK"/>
            <index column="NO" type="integer"/>
            <one-to-many class="scratch.Child"/>
        </list>
    </class>
</hibernate-mapping>

Session Code
ses = ThreadLocalSession.currentSession();
ses.save(parent);
ses.flush();

Error Message:
2004-11-05 09:25:42,946 DEBUG [net.sf.hibernate.impl.SessionImpl] saving [scratch.Parent#04-114]
2004-11-05 09:25:42,946 DEBUG [net.sf.hibernate.engine.Cascades] processing cascades for: scratch.Parent
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.engine.Cascades] done processing cascades for: scratch.Parent
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.engine.Cascades] processing cascades for: scratch.Parent
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.engine.Cascades] cascading to collection: scratch.Child
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.engine.Cascades] cascading to saveOrUpdate()
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.engine.Cascades] version unsaved-value strategy NULL
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.impl.SessionImpl] saveOrUpdate() unsaved instance
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.impl.SessionImpl] saving [scratch.Child#<null>]
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.engine.Versioning] Seeding: 0
2004-11-05 09:25:42,962 DEBUG [net.sf.hibernate.impl.SessionImpl] closing session
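The log stops right after Hibernate starts saving the Child, which often means the child side of the association was never set: with inverse="true", the Child owns the RECEIVE_PACK foreign key, so each Child must point back at its Parent before save()/flush(), or the not-null="true" constraint fails. A minimal sketch of the wiring in plain Java (class and property names are taken from the mappings above; the addPackItem() helper is an assumption, not from the original post):

```java
import java.util.ArrayList;
import java.util.List;

class Child {
    private Long id;                // maps to ID, generator "native"
    private Integer version;        // maps to VERSION, unsaved-value="null"
    private Parent receivePack;     // maps to RECEIVE_PACK, not-null="true"

    public Parent getReceivePack() { return receivePack; }
    public void setReceivePack(Parent p) { this.receivePack = p; }
}

class Parent {
    private Integer id;             // generator "assigned"
    private Integer version;
    private List<Child> packItems = new ArrayList<>();

    public List<Child> getPackItems() { return packItems; }

    // Keeps both sides of the association in sync. Because the list is
    // mapped inverse="true", Hibernate writes RECEIVE_PACK from the
    // child's receivePack property, so it must be non-null before save.
    public void addPackItem(Child c) {
        c.setReceivePack(this);
        packItems.add(c);
    }
}
```

With the association wired this way, ses.save(parent) can cascade to children that already carry a non-null back-reference.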


 Post subject:
PostPosted: Sat Nov 06, 2004 5:24 pm 
Newbie

Joined: Thu Oct 21, 2004 3:02 pm
Posts: 2
With list references you need to specify an index! This means you need an extra column that specifies the order of the items in the list.

<list name="packItems" lazy="false" inverse="true"
      cascade="all-delete-orphan">
    <index ... />
    <key column="RECEIVE_PACK"/>
    ...
</list>

Try to use a set mapping instead of a list mapping.
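For reference, a set version of the same collection might look like this (a sketch only; column and class names are copied from the original mapping, and a set needs no index column):

```xml
<set name="packItems" lazy="false" inverse="true"
     cascade="all-delete-orphan">
    <key column="RECEIVE_PACK"/>
    <one-to-many class="scratch.Child"/>
</set>
```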


 Post subject:
PostPosted: Sat Nov 06, 2004 5:25 pm 
Newbie

Joined: Thu Oct 21, 2004 3:02 pm
Posts: 2
Sorry, I hadn't seen your index.


 Post subject:
PostPosted: Sat Nov 06, 2004 6:04 pm 
Regular

Joined: Mon Oct 06, 2003 7:17 am
Posts: 58
Location: Switzerland
http://www.hibernate.org/193.html

Reto


© Copyright 2014, Red Hat Inc. All rights reserved. JBoss and Hibernate are registered trademarks and servicemarks of Red Hat, Inc.