 Post subject: 'Better Practices'?
PostPosted: Thu May 03, 2007 7:34 pm 
Regular

Joined: Sun Jan 21, 2007 4:33 pm
Posts: 65
Currently, my app works a bit like this.

The user fills out a form with Client, Child, Employer, Wages, and Spouse information.

Then session.Save() is called on the Client, and that object is reloaded from the database. Child, Employer, Wages, and Spouse are then saved to the DB as related to that object.

There's a better way of doing this, I'm sure. How do you save multiple objects at once? Do you save the parent, then save the children?
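
Loosely sketched in code, it looks something like this (names are placeholders rather than the real classes):

Code:
// Loose sketch of the current flow: save the parent, then save each related
// object by hand after wiring up its reference to the Client.
// ("sessionFactory", "client", "child", etc. are placeholders.)
using (ISession session = sessionFactory.OpenSession())
{
    ITransaction tx = session.BeginTransaction();

    session.Save(client);              // INSERT the Client row

    child.Client = client;             // point each related object at the Client
    employer.Client = client;
    wages.Client = client;
    spouse.Client = client;

    session.Save(child);               // one explicit Save() per object
    session.Save(employer);
    session.Save(wages);
    session.Save(spouse);

    tx.Commit();
}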


 Post subject: Re: 'Better Practices'?
PostPosted: Fri May 04, 2007 7:34 am 
Senior

Joined: Thu Feb 09, 2006 1:30 pm
Posts: 172
Gortok wrote:
There's a better way of doing this, I'm sure. How do you save multiple objects at once? Do you save the parent, then save the children?


The real question would be: how are these objects related to each other? I'm going to assume there is something like a children collection on the Client object? If so, simply setting cascade="save-update" on that collection in your mapping file means that when you save the Client, it will also save the collection of children. A cascading save-update works like this: if an object is new, it is saved; if it is an existing object, it is updated; and if it is already part of the NHibernate session, nothing more needs to happen, since it is already associated anyway.
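
To make that concrete, a minimal sketch using the Client/Child example from the original post (the exact mapping element and column names are guesses). The collection mapping would look something like:

Code:
<!-- Client.hbm.xml fragment: cascade saves and updates from Client to its children -->
<bag name="Children" cascade="save-update">
    <key column="ClientId" />
    <one-to-many class="Child" />
</bag>

and then a single save of the parent persists the whole graph:

Code:
// Build the object graph in memory, then let the cascade do the rest.
client.Children.Add(child);
session.SaveOrUpdate(client);    // also saves/updates every Child in the collection
tx.Commit();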

I hope that helps, if you were looking for a different answer let me know.


 Post subject:
PostPosted: Tue May 08, 2007 5:25 pm 
Expert

Joined: Fri Oct 28, 2005 5:38 pm
Posts: 390
Location: Cedarburg, WI
We ended up writing a class that represents a "row" from an HQL query, which keeps the entities backing whatever properties you select. We massage the HQL to select the entities backing the properties the developer says they want, and then dynamically generate a class with strongly typed, named properties for the properties requested, to flatten the result set and make binding easy.

We also parse the joins in the HQL and keep track of how the entities are related. This allows us to know what entities to insert/update in what order, since we also have web pages with multiple forms, and we have to perform multiple inserts/updates during a single postback.

It was quite a bit of work, but it works smoothly for us, and simplifies the development of web pages. The developer just needs to paint forms, associate an HQL query with each form, and tie entities selected in different forms to each other to define the master/slave relationships.
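
To give a rough idea of the shape (heavily simplified, with invented names -- the real generated classes carry more metadata than this):

Code:
// Simplified illustration of a generated "row" class for an HQL query such as
//   select c.Name, ch.BirthDate from Client c join c.Children ch
// Client and Child stand in for whatever entities the query actually selects from.
using System;
using System.Collections;

public class ClientChildRow
{
    private readonly IDictionary entities;   // backing entities keyed by alias ("c", "ch")

    public ClientChildRow(IDictionary entities)
    {
        this.entities = entities;
    }

    // Flattened, strongly typed properties that delegate to the backing entities,
    // which is what makes ASP.NET binding painless.
    public string ClientName
    {
        get { return ((Client)entities["c"]).Name; }
        set { ((Client)entities["c"]).Name = value; }
    }

    public DateTime ChildBirthDate
    {
        get { return ((Child)entities["ch"]).BirthDate; }
        set { ((Child)entities["ch"]).BirthDate = value; }
    }

    // The entities themselves stay reachable by their query alias.
    public IDictionary Entities
    {
        get { return entities; }
    }
}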


 Post subject:
PostPosted: Mon May 14, 2007 12:36 pm 
Senior

Joined: Thu Feb 09, 2006 1:30 pm
Posts: 172
Nels_P_Olsen wrote:
and then dynamically generate a class with strongly typed, named properties for the properties requested, to flatten the result set and make binding easy


Dynamically generate a class with strongly typed properties? I'm assuming there is some miscommunication going on here, either in how I'm reading this or in how it was written. Does "dynamically generate" mean generated at run time (which is how I interpret it), or do you mean you use a code generator based on the HQL or some other mapping file? If it's generated at run time, then the term "strongly typed" could either cause confusion or call into question the advantage over just using the array that NHibernate supports (an approach I don't really like).
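
To be clear, by "the array" I mean the plain projection result, something like this (entity and property names are only illustrative):

Code:
// Multi-column HQL projections come back from NHibernate as one object[] per row,
// so everything is index-based rather than strongly typed.
// ("session" is an open ISession.)
IList rows = session.CreateQuery(
    "select c.Name, ch.BirthDate from Client c join c.Children ch").List();

foreach (object[] row in rows)
{
    string clientName = (string)row[0];
    DateTime birthDate = (DateTime)row[1];
}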

Also, how is this class different from the "row" you mentioned? We refer to these "rows" as reporting objects, but I suppose everyone can have their own terms.

Nels_P_Olsen wrote:
We also parse the joins in the HQL and keep track of how the entities are related. This allows us to know what entities to insert/update in what order

Nels_P_Olsen wrote:
It was quite a bit of work, but it works smoothly for us, and simplifies the development of web pages.


Why create your own tool to parse HQL strings for ordering when the tool is already built to do it? NHibernate will take care of inserting records in the correct order. I think people in general have misconceptions about what the Save(), Update() and SaveOrUpdate() methods do. Let NHibernate do the work; why develop an elaborate framework around an already elaborate tool?

I'm not saying what you and your team did is wrong, but I am curious as to why you made the decisions that you did.


 Post subject:
PostPosted: Tue May 15, 2007 12:59 pm 
Expert

Joined: Fri Oct 28, 2005 5:38 pm
Posts: 390
Location: Cedarburg, WI
Regarding the dynamically generated "row" classes -- we created the infrastructure to do this because the options NHibernate offers -- either a raw array of selected property values, or the "new" keyword to create an instance of a special class intended to hold the result rows of a specific query -- were unacceptable to us. We had these requirements:

1. Developers need to write HQL queries that select properties from multiple entities. This query should be part of an "entity data source" bound to ASP grids and forms. All forms should support inserting/updating data through them; the HQL query itself should provide sufficient information on how to do this successfully.

2. Developers need access to the entities backing the properties they select, without fuss. That is, both the selected properties and the entities backing them should be available in each "row" of the query's result set, and the entities need to be accessible by the aliases given to them in the query.

3. Developers want to bind the selected values from their HQL query with a minimum of fuss. This means that they all need to be strongly typed properties on some class. Yet, the developers are not expected to write such a class themselves.

4. Developers will be creating web pages whose forms contain arbitrary combinations of entities, any of which may need either inserting or updating. The developers will not be expected to specify the order in which the entities are inserted/updated. It should just happen correctly, based on information from the HQL query itself.

5. Often developers will select identifier properties from related entities. Their web forms will have dropdowns that let the user pick which entity to relate, and during postback the related entity's ID is sent back. If a persistent entity with that ID is found, it should be loaded; otherwise a new entity with that ID is created. In either case the related entity should get automatically attached to the entity that refers to it, based on the HQL query. The developer shouldn't have to write any code for this to happen (see the sketch after this list).
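
The lookup-or-create step in #5 boils down to something like this per related entity (a sketch with invented names; the real infrastructure does this generically from the query metadata):

Code:
// The ID posted back from the dropdown either loads an existing entity or
// creates a transient one with that ID, and either way it gets attached to
// the entity that refers to it.
int employerId = int.Parse(Request.Form["employerId"]);

Employer employer = (Employer)session.Get(typeof(Employer), employerId);
if (employer == null)
{
    employer = new Employer();
    employer.Id = employerId;      // new, still-transient entity with the chosen ID
}

client.Employer = employer;        // attached based on the join in the HQL query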

To achieve all these things, we came up with an infrastructure that generates the code for and compiles "query row" classes when the queries are executed (and also at design time, so the properties can be selected and bound to textboxes or grid columns). These generated classes expose all the selected values in the query as strongly typed properties, provide an IDictionary of the selected entities keyed by their query alias, and also provide metadata on the entities, selected properties, and the query joins. These "query row" classes can be rehydrated from raw ASP postback values, and all rehydrated entities are attached to each other as the HQL query joins define. We also created an "entity data source" in which the developer defines their HQL query, and from which the "query row" objects are exposed.

Achieving #4 in particular requires careful determination of what entities to insert in what order, to avoid NHibernate exceptions about transient objects being referenced when you try to save/update other ones. The entities cannot be inserted in any random order. To achieve this within the infrastructure so that the developer does not need to do it for the specific web form/page they are working on, it is necessary to parse the JOIN clauses of the HQL and then examine the NHibernate class mappings of the entities involved, to determine which entities must be inserted first. In particular, this solves situations where a single web form (and single HQL query) joins two entities in a many-to-many relationship using an explicit third entity. In this case, the entities being related must be inserted first, then the entity that references each.
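
Spelled out by hand for that many-to-many case (invented entities; the infrastructure issues these saves in this order for you):

Code:
// Membership explicitly links a Client and a Group, so both ends must be
// persistent before the linking entity is saved.
session.Save(client);              // 1. save the entities being related first
session.Save(group);

membership.Client = client;        // 2. attach the now-persistent entities
membership.Group = group;

session.Save(membership);          // 3. only then save the entity that references each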


 Post subject:
PostPosted: Tue May 15, 2007 6:52 pm 
Newbie

Joined: Mon Apr 02, 2007 12:31 pm
Posts: 19
Nels,

When you say you're dynamically generating "row" classes, do you mean you're using Reflection.Emit to create classes with the strongly-typed properties you require? Are you then lazy-loading collections within those classes? I'd like to know if you had any problems with either getting proxies to work, or getting NHibernate to use the dynamically-generated classes. I considered this approach for something a few months ago, but it's really overkill for our needs so it never seemed worth the time to do it.


 Post subject:
PostPosted: Tue May 15, 2007 7:00 pm 
Expert

Joined: Fri Oct 28, 2005 5:38 pm
Posts: 390
Location: Cedarburg, WI
We don't use Reflection.Emit, although we probably should -- we use a legacy VSA engine we developed back when .NET 1.0 came out, just before Microsoft announced they were abandoning VSA for .NET. Essentially, we generate the source code, compile it using the C# compiler available in the .NET Framework itself, and then load the resulting assembly.
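
For anyone who wants to do the same thing without a VSA engine, the generate-compile-load cycle looks roughly like this with plain CodeDom (not our actual code):

Code:
using System;
using System.CodeDom.Compiler;
using System.Reflection;
using Microsoft.CSharp;

public static class RowClassCompiler
{
    // Compiles generated C# source in memory and returns the resulting assembly.
    public static Assembly Compile(string generatedSource)
    {
        CompilerParameters options = new CompilerParameters();
        options.GenerateInMemory = true;

        using (CSharpCodeProvider provider = new CSharpCodeProvider())
        {
            CompilerResults results =
                provider.CompileAssemblyFromSource(options, generatedSource);

            if (results.Errors.HasErrors)
                throw new InvalidOperationException("Row class generation failed.");

            return results.CompiledAssembly;
        }
    }
}

// Usage: build the source text from the HQL metadata, compile, then instantiate:
//   Assembly asm = RowClassCompiler.Compile(generatedSource);
//   object row = Activator.CreateInstance(asm.GetType("GeneratedQueryRow"));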

These generated "row" classes keep references to the entities selected by the HQL query. The strongly typed properties exposed on the "row" classes use reflection to get and set the underlying entity property. When one of the "row" properties gets set, it not only sets the underlying entity property, but it hooks up the underlying entities as needed, based on the query joins.

The "row" classes don't directly manage any entity collections or otherwise track or force lazy loading.

We haven't encountered any proxy problems using this approach in our ASP.NET application.


 Post subject:
PostPosted: Wed May 16, 2007 8:15 am 
Senior

Joined: Thu Feb 09, 2006 1:30 pm
Posts: 172
Nels_P_Olsen wrote:
To achieve all these things, we came up with an infrastructure that generates the code for and compiles "query row" classes when the queries are executed (and also at design time, so the properties can be selected and bound to textboxes or grid columns). These generated classes expose all the selected values in the query as strongly typed properties, provide an IDictionary of the selected entities keyed by their query alias, and also provide metadata on the entities, selected properties, and the query joins. These "query row" classes can be rehydrated from raw ASP postback values, and all rehydrated entities are attached to each other as the HQL query joins define. We also created an "entity data source" in which the developer defines their HQL query, and from which the "query row" objects are exposed.


Why did you choose to dynamically generate the "row" class, as you call it, instead of doing it statically? I see you state that you do it both at design time and at run time. Why not have a code generator that runs on the file containing the HQL queries whenever that file is saved? I don't know whether that is really elegant or not, but it is interesting that you generate them at design time, to allow the developer to use them easily from the web, and then also generate them at run time. I assume this is to ensure you have the latest query? But wouldn't it be a problem if the query changed since the developer last touched the UI code?

Nels_P_Olsen wrote:
Achieving #4 in particular requires careful determination of what entities to insert in what order, to avoid NHibernate exceptions about transient objects being referenced when you try to save/update other ones. The entities cannot be inserted in any random order. To achieve this within the infrastructure so that the developer does not need to do it for the specific web form/page they are working on, it is necessary to parse the JOIN clauses of the HQL and then examine the NHibernate class mappings of the entities involved, to determine which entities must be inserted first. In particular, this solves situations where a single web form (and single HQL query) joins two entities in a many-to-many relationship using an explicit third entity. In this case, the entities being related must be inserted first, then the entity that references each.


But isn't this really what cascade can be used for? We have a simple rule: if persisting a given object should never create a new object, we don't set the cascade; if persisting that object is allowed to save the other object, we use cascade="save-update". Now we only get the transient-object exceptions when a developer is "misbehaving" and trying to do something we would never want them to do.
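
As hbm.xml fragments, that rule looks like this (invented names):

Code:
<!-- Saving an Order must never create a new Customer: no cascade (the default). -->
<many-to-one name="Customer" class="Customer" column="CustomerId" />

<!-- Saving an Order may create or update its own lines: cascade save-update. -->
<bag name="Lines" cascade="save-update">
    <key column="OrderId" />
    <one-to-many class="OrderLine" />
</bag>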

What you did is certainly an interesting approach. I don't think it would work with the approach we took to our objects, but it is interesting nonetheless.

Thanks for the insight into how some other teams are using the tool.


 Post subject:
PostPosted: Wed May 16, 2007 3:49 pm 
Expert

Joined: Fri Oct 28, 2005 5:38 pm
Posts: 390
Location: Cedarburg, WI
Personally, I pushed for compiling the generated "row" classes for HQL queries at design time, but unfortunately, other team members didn't want to do that. Also, at the time, we were not familiar with coding Visual Studio add-ins, which would be necessary to insert entire generated classes and code files into a project. (We couldn't see any way to insert entire generated source files into a project, or even classes into existing source files, using designers.)

Regarding the insertion (save) order for entities -- NHibernate's cascade behavior can't help here, because explicit calls to ISession.Save are required to make the transient entities persistent. In order to automatically save the transient entities in the correct order, so that the developer of individual ASP pages would not have to do so, we had to examine the joins between the entities. When you have two transient entities where one references the other, you must save the one on the "primary" end of the foreign key first, then attach the now-persistent "primary" entity to the "foreign" one, and finally save the "foreign" one. Normally some developer writes ISession.Save statements in a particular order in their code; we wanted to handle it automatically.
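
For a single pair of joined transient entities, the hand-written version of what our infrastructure does is simply this (invented names):

Code:
// Invoice references Customer, so Customer must become persistent first.
session.Save(customer);            // save the "primary" end of the foreign key
invoice.Customer = customer;       // attach the now-persistent entity
session.Save(invoice);             // then save the referencing entity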

