
All times are UTC - 5 hours [ DST ]



Forum locked This topic is locked, you cannot edit posts or make further replies.  [ 22 posts ]
Author Message
 Post subject: (Very) small 3-tier Winforms application with NHibernate
PostPosted: Tue Aug 07, 2007 12:47 pm 
Regular

Joined: Thu Nov 23, 2006 10:29 am
Posts: 106
Location: Belgium
Hello,

As promised in an earlier post, I created a small Winforms application using NHibernate and also remoting.

The application is just a small example demonstrating a strategy for handling session management in a Winforms application. As a whole it is certainly not a best practice for building enterprise-grade applications (you will notice there is no validation or error-handling code)!

Nevertheless, I think one-session-per-atomic-data-transaction is the best approach for Winforms applications. I hope the sample convinces you. Notice that all classes and collections are lazy-loaded.

The fact that it's a three-tier application using remoting is not relevant to this approach: it is not only suitable for three-tier applications, it works just as well for fat clients.
On the other hand, this approach is the only possible one if you want to build three-tier applications (unless I'm very much mistaken).

Well, you can download it here. You'll find a readme.doc in the .zip file with some installation instructions.

Enjoy,
X.


 Post subject:
PostPosted: Tue Aug 07, 2007 1:02 pm 
Beginner

Joined: Wed Mar 22, 2006 6:59 am
Posts: 30
xasp,

really, really thank you! Downloading right now...


 Post subject:
PostPosted: Tue Aug 07, 2007 2:15 pm 
Regular

Joined: Fri Jan 27, 2006 2:32 pm
Posts: 102
Location: California, USA
Thank you for posting this!

Of course, I feel a bit dumber now and am thinking I don't understand NHibernate as well as I thought I did.

In your DAO class, you are saving with this method:

Code:
        public void SaveUpdateOrder(ref Order order)
        {
            using (ISession session = DAManager.SessionFactory.OpenSession())
            {
                session.SaveOrUpdate(order);
                session.Flush();

                Console.WriteLine(string.Format("Save/Update of '{0}'.", order));
            }
        }


How does NHibernate know what has changed? Would you recommend that the Domain Objects keep track of their "dirty" status? Or, have the form itself try to keep track if changes have been made?


 Post subject:
PostPosted: Tue Aug 07, 2007 3:02 pm 
Newbie

Joined: Wed Jul 04, 2007 5:31 pm
Posts: 17
Please consider simple scenario:

+ You have a form that edits Customer or other domain object.
(this domain object has associations and collections)

+ User made some changes to domain object or its associations in this form.

How do you handle the Cancel button on the form?
(It is supposed to discard the changes
made to the domain object.)

I've tried to use ISession.Refresh, but it doesn't handle the object's associations. I've tried to use ISession.Evict, but it detaches all objects from the session.

Any ideas?


 Post subject:
PostPosted: Tue Aug 07, 2007 3:05 pm 
Regular

Joined: Thu Nov 23, 2006 10:29 am
Posts: 106
Location: Belgium
Hi Pelton,

You are right: NHibernate does not know what has changed, and will create an update statement for all columns, even those that haven't changed.

On the one hand: if you're dealing with small classes and records, it really won't make much of a difference performance-wise.
On the other hand: suppose you have a class with some large members (byte arrays, large strings, ...) that you don't want constantly being sent from your application server to your database server. In that case, this is how I would implement it:

Create a Copy() method for your class that creates a duplicate of the instance.
Create a Merge() method for your class that assigns all public properties of the class to the instance that is passed as a parameter.

On the client-side, when you start editing an instance, create a Copy() of the original before you edit it. Then, send both the original and the copy back to the server, lock the original, merge it with the copy and SaveOrUpdate() the original.

I know it sounds weird, but after all that's exactly what happens with DataTables. That's why you always have the original, the deleted, the inserted, ... rows in a DataTable until you call AcceptChanges().
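A bare-bones version of such a Copy()/Merge() pair might look like the following. This is only a sketch: the Customer class and its properties are illustrative, not code from the sample.

```csharp
// Illustrative domain class with a manual Copy()/Merge() pair.
// The class and property names are hypothetical.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public byte[] Photo { get; set; }   // a 'heavy' property you may not want to resend

    // Returns a duplicate the client can edit while the original stays untouched.
    public Customer Copy()
    {
        return new Customer { Id = this.Id, Name = this.Name, Photo = this.Photo };
    }

    // Assigns all public property values of this instance onto 'target'.
    public void Merge(Customer target)
    {
        target.Id = this.Id;
        target.Name = this.Name;
        target.Photo = this.Photo;
    }
}
```

On the client you would edit the Copy(); on the server you would lock the original, Merge() the edited copy back onto it, and then SaveOrUpdate() the original.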

I posted an update of the sample. Same URL, just re-download the .zip. Check out the updated Customers part: it now works according to the explanation above.

You might argue that it is easier just to re-fetch the object from the database and Merge() that one instead of always sending two (potentially heavy) objects over the network, but if you do that, you are throwing your concurrency control overboard.

Frankly, I find this whole approach too heavy and I just SaveOrUpdate() the object without minding which properties were dirty and which ones were not. Until now, I never had a noticeable performance degradation.

_________________
Please rate this post if it helped.

X.


 Post subject:
PostPosted: Tue Aug 07, 2007 3:13 pm 
Regular

Joined: Thu Nov 23, 2006 10:29 am
Posts: 106
Location: Belgium
(To dmk)

Funny, I was posting my answer to Pelton just as your message arrived. It seems the updated sample might already have a solution to your question.

Re-download the sample, you might find your answer in the Clone() / Merge() mechanism.
In your case: when the user clicks Cancel, just throw away the copy and re-instate the original.

Again: I find this approach too heavy. In my application(s), most of my DAOs implement an InitializeForEdit() that preps my object for editing. When the user cancels, I just refetch the object and re-initialize it. That way, if another user changed the data in the meantime, I'm sure the data shown is the most recent (I don't like keeping data around too long on the client side; too much chance it's stale).
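The cancel-by-refetch idea could be sketched roughly like this. Note that everything except InitializeForEdit() (which is mentioned above) is an assumption: CustomerDao, the signature and DAManager usage are illustrative, not code from the sample.

```csharp
// Sketch of a DAO that re-fetches on cancel instead of tracking undo state.
// Names and signatures are assumptions; only the pattern matters.
public class CustomerDao
{
    public Customer InitializeForEdit(int customerId)
    {
        // One short-lived session per atomic operation, as in the sample;
        // the freshly fetched object reflects any concurrent changes.
        using (ISession session = DAManager.SessionFactory.OpenSession())
        {
            return session.Get<Customer>(customerId);
        }
    }
}

// On the client, Cancel simply throws the edited instance away:
//   currentCustomer = dao.InitializeForEdit(currentCustomer.Id);
```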

_________________
Please rate this post if it helped.

X.


 Post subject:
PostPosted: Tue Aug 07, 2007 11:09 pm 
Newbie

Joined: Wed Jul 04, 2007 5:31 pm
Posts: 17
Thanks. But Clone/Merge requires manual implementation, i.e. for each object we have to implement a cloning procedure. I'm not sure how that would work with complex objects, which may be roots of big graphs.

What I'm looking for is a generic way to discard changes made to an object and its associations.


 Post subject:
PostPosted: Wed Aug 08, 2007 1:08 am 
Regular

Joined: Tue Feb 21, 2006 9:50 am
Posts: 107
A simple way to clone an object in a generic way is this:

Code:
      /// <summary>
      /// Clones the object by using serialization and deserialization.
      /// </summary>
      /// <returns>
      /// A deep copy of the object.
      /// </returns>
      private object cloneBySerialization()
      {
         MemoryStream buffer = new MemoryStream();
         Object result;

         BinaryFormatter formatter = new BinaryFormatter();
         formatter.Serialize(buffer, this);
         buffer.Position = 0;
         result = formatter.Deserialize(buffer);
         buffer.Close();
         return result;
      }


It works fine as long as your object graph only contains standard .NET types and collections (Nullables and Iesi.Collections included).

Regards
Klaus

(Note that the code is .NET 1.1)
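For this to work, the class (and everything reachable from it) must be marked [Serializable]. A small self-contained usage sketch, with an illustrative Project class that is not from the sample:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// Any class cloned this way must be [Serializable],
// as must everything reachable from it.
[Serializable]
public class Project
{
    public string Name;

    public Project DeepClone()
    {
        // Serialize to an in-memory buffer and read it back: the
        // deserialized graph is a completely independent deep copy.
        using (MemoryStream buffer = new MemoryStream())
        {
            BinaryFormatter formatter = new BinaryFormatter();
            formatter.Serialize(buffer, this);
            buffer.Position = 0;
            return (Project)formatter.Deserialize(buffer);
        }
    }
}
```

For an undo scenario: take a DeepClone() before editing; on Cancel, discard the edited instance and keep the backup.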


 Post subject:
PostPosted: Wed Aug 08, 2007 2:30 am 
Regular

Joined: Thu Nov 23, 2006 10:29 am
Posts: 106
Location: Belgium
(to DMK)

You might want to check out the CSLA framework. If I remember correctly, it has an undo feature for domain objects. It works by cloning the objects, much in the same way Klaus proposes: by serialization.

_________________
Please rate this post if it helped.

X.


 Post subject:
PostPosted: Wed Aug 08, 2007 12:08 pm 
Newbie

Joined: Wed Jul 04, 2007 5:31 pm
Posts: 17
Thanks for your advice. CSLA is not an option for me, but I may use luedi's code.

In general, Refresh should be cascaded, just as it is in Hibernate 3.0.
How can I file a bug/feature request with the NHibernate team?


 Post subject:
PostPosted: Wed Aug 08, 2007 5:49 pm 
Beginner

Joined: Sat Jul 21, 2007 3:56 pm
Posts: 27
Hi xasp,

Thanks a lot for this nice example. Being new to .NET, it is really interesting to see what patterns others use to solve their problems.

Looking through the code, I have some questions I would like your advice about:
- I see that you use remoting in a "service-oriented" manner, having your middle tier expose atomic, stateless methods. The only difference I see with a real SOA architecture is that you provide real business entities (i.e. data + behaviour) instead of simple DTOs to your client. Does this mean that your business logic, implemented in your business entities, would run on the client? If not, why send it in the first place? If so, what is then the purpose of the middle tier?

- As I understood from your previous comments, you would stick with the approach of creating a new session for each atomic activity even if your application has only a two-tier architecture. This means you lose a lot of very useful functionality that the session can give you, like caching, dirty tracking, ... all things you will have to implement yourself if you don't use long sessions. While I agree that in a three-tier / SOA architecture the improved scalability comes at the cost of lower interactivity, I don't see the point of it in a two-tier application.

Regards,

Tolomaüs.


 Post subject:
PostPosted: Thu Aug 09, 2007 3:45 am 
Regular

Joined: Thu Nov 23, 2006 10:29 am
Posts: 106
Location: Belgium
Hello Tolomaüs,

I'm very happy to read your feedback.

I agree I'm sending business objects to the client, but for such simple objects I hardly see the point in creating separate DTOs just to send the data across. All the 'heavy' business logic, like validation, calculations and so on, will be handled by the DAOs or by another server-side 'manager' and will not be implemented in the business objects themselves. My aim is to keep business objects as dumb as possible, with all the business logic implemented by some other class(es).

For example: I could implement validation by creating a Validate() method on the Customer object itself. If I did that, you would be right to assume the business logic runs on the client.
But I could also create a CustomerManager class that implements a Validate(Customer customer) method that is exposed by the server and executed on the server. That way, my Customer class doesn't have to implement business logic and is in effect reduced to a simple DTO, so I don't have to bother creating DTOs for each class in my business domain.
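In code, the server-side variant might look like this. It is only a sketch: CustomerManager, the Customer shape and the validation rule are made up for illustration, not taken from the sample.

```csharp
// Illustrative 'dumb' business object: data only, no behaviour.
public class Customer
{
    public string Name;
}

// Server-side manager that owns the business logic, keeping the
// business object itself free of it. Names and the rule are assumptions.
public class CustomerManager
{
    // Exposed by and executed on the server, so the logic never
    // travels to the client along with the Customer data.
    public bool Validate(Customer customer, out string error)
    {
        if (customer.Name == null || customer.Name.Length == 0)
        {
            error = "A customer must have a name.";
            return false;
        }

        error = null;
        return true;
    }
}
```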

But you are right: they're not real DTOs; the facade layer is aware that there's an Order object, and that that Order object contains OrderLine objects.

So I can only answer that this sample application is not meant to be a 'best practice' as a whole, except for the part about keeping your session open for as short a time as possible.
In real life, I don't apply any one fixed pattern to my applications: I tend to choose what works best in a given situation, which mostly comes down to a mix of several known patterns. I like a pragmatic approach mixed with a healthy dose of proven practices. That said, I'm not claiming my way is the best.

About your second comment: I started building my application in a two-tier fashion, but with the basic approach of one session for each atomic data-access part. At the time I had a lot of doubts whether this was the best approach, especially since I wanted the lazy loading of my objects to stay enabled.
But my application kept growing and I needed scalability, and that's when I was truly happy I had stuck to that approach: refactoring the application to make it three-tier still took a lot of work, but at least it was feasible.
Unless you are 100% sure from the start what your requirements are and will be, I believe it's always a good idea to stick with this approach.

Just a detail: you can compensate for the loss of the session's caching by using the second-level cache. In my sample application, I'm using it for the Product class.
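For reference, enabling the second-level cache for a read-mostly class like Product takes a cache provider in the configuration plus a cache element in the mapping. This is a sketch: the provider choice is an assumption (not the sample's actual settings), and the exact property keys vary between NHibernate versions.

```xml
<!-- In the NHibernate configuration: pick a cache provider
     (HashtableCacheProvider is the simple built-in one). -->
<property name="cache.provider_class">NHibernate.Cache.HashtableCacheProvider</property>
<property name="cache.use_second_level_cache">true</property>

<!-- In the class mapping (e.g. Product.hbm.xml): cache the class read-only. -->
<class name="Product" table="Product">
  <cache usage="read-only" />
  <!-- id, properties, ... -->
</class>
```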

I believe every rule has its exceptions (that's why I never stick to one fixed pattern), and this one probably does too.

_________________
Please rate this post if it helped.

X.


 Post subject:
PostPosted: Sun Aug 12, 2007 6:57 am 
Beginner

Joined: Sat Jul 21, 2007 3:56 pm
Posts: 27
Hello again xasp,

Thanks for the clear explanation. I have been thinking about it for a couple of days (especially about the two-tier / three-tier comparison) and I now think you're right: in all but the most complex applications the "service-based" approach is the way to go. It is clean, and it allows evolution towards a three-tier model without having to rebuild everything from the ground up.

I have been thinking about some good examples of "complex" applications where this approach would not work, and I came up with the following:

- highly interactive applications with a fine-grained interface between the presentation layer and the business logic. E.g. the user provides some data -> the business logic manipulates it and gives back an intermediate result -> based on this result, the user provides some other data -> and so on, all without storing the intermediate results in the database.
Performance-wise, the fine-grained interaction would already make a three-tier model unsuitable; moreover, the service-based methods store their results directly in the database, making them visible to others, which is not the desired behaviour in this case.

- applications where multiple atomic activities can be combined into one transaction, in the sense that if one activity fails, the others should fail too. A simple example would be a form with a grid of projects that can be edited inline, with one Save button that saves all of them in one transaction. I guess I would have to add a specific method that accepts multiple "project" objects on my service layer.
But it could also be an arbitrary combination of x times activity A, y times activity B, ... that should be combined into one transaction. E.g. I create a new project and therefore need to create a new projecttype; if the creation of the project fails, I don't need the projecttype either, so both activities must be done in the same transaction.
In a three-tier model this would mean I have to create a specific method for each combination of atomic activities.
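The grid-of-projects case could be sketched as a single service-layer method like this. All the names (SaveProjects, Project) are assumptions for illustration; the point is only that one session and one transaction span all the saves.

```csharp
// Sketch of a service-layer method that saves several objects atomically:
// if any SaveOrUpdate fails, the transaction rolls everything back.
// Names are assumptions, not code from the sample.
public void SaveProjects(IList<Project> projects)
{
    using (ISession session = DAManager.SessionFactory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        foreach (Project project in projects)
            session.SaveOrUpdate(project);

        tx.Commit();   // all or nothing
    }
}
```

The drawback Tolomaüs points out remains: every required combination of atomic activities needs its own such method on the service layer.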

I have to admit that these examples are rather far-sought, but - at least in our company - you are never sure that they will not appear as requirements for the application. And I would have a lot of difficulties explaining to the users that they cannot get this functionality because the application might one day become three-tier.

So, as a conclusion, I agree with you that each problem has its own optimal solution.

Regards,

Tolomaüs.


 Post subject: Re: (Very) small 3-tier Winforms application with NHibernate
PostPosted: Tue Feb 19, 2008 3:21 pm 
Newbie

Joined: Thu Dec 06, 2007 10:14 am
Posts: 6
xasp wrote:
Hello,

Nevertheless, I think the one-session-per-atomic-data-transaction is the best approach in Winforms applications.

Enjoy,
X.


Hello there!

I wrote an article on the subject a few days ago and put it on CodeProject. From what I've seen, one-session-per-transaction can be maintained with little hassle, especially when using a framework such as Spring.NET.
For those of you who are interested, the article also includes a codegen template that takes care of your domain objects, DAL, service and presentation layers (though it works per table, so it can't really be used for big projects, but it can give you some ideas...)


 Post subject:
PostPosted: Sat Jul 26, 2008 6:49 am 
Newbie

Joined: Tue Jul 22, 2008 5:10 am
Posts: 3
xasp wrote:
(To dmk)

Funny, I just posted my answer to Pelton while your message arrived. It seems the updated sample might already have a solution to your question.

Re-download the sample, you might find your answer in the Clone() / Merge() mechanism.
In your case: when the user clicks Cancel, just throw away the copy and re-instate the original.

Again: I find this approach too heavy. In my application(s) most of my DAO's implement a .InitializeForEdit() that preps my object for editing. When the user cancels, I just refetch the object and re-initialize it. That way, if another user changed the data in the meantime, I'm sure the data shown is the most recent (I don't like keeping data too long around on the clientside. Too much chances it's stale data).



Hello,

Could you provide an example of what your .InitializeForEdit() looks like?

I *need* to accomplish some form of one-level undo and I've been struggling for a bit... Evict, Refresh, Lock - nothing seems to work...

edit:
It seems I was having problems with my session lifetime, which is why Refresh wasn't working... but it works now. Is that basically what you have in your "undo" routine?





© Copyright 2014, Red Hat Inc. All rights reserved. JBoss and Hibernate are registered trademarks and servicemarks of Red Hat, Inc.