
All times are UTC - 5 hours [ DST ]



Forum locked This topic is locked, you cannot edit posts or make further replies.  [ 10 posts ] 
 Post subject: Survey: domain model - <bad/good pattern> and <stup
PostPosted: Wed Feb 22, 2006 4:23 am 
Regular

Joined: Tue Mar 15, 2005 12:38 pm
Posts: 73
Location: Bucharest
I would like to know everybody's / anybody's opinion on the following questions:

    1. Domain model - good/bad pattern?
    Are you using a domain model pattern with NH?

    architecture:
    presentation - business layer/facade - domain model - data layer

    Are the domain model classes the same as your persistent classes?
    2. How much are you separating layers in an n-tier architecture?
    - no domain model objects get to the presentation (send/receive dedicated DTOs)
    - domain model objects can go to the presentation layer and return with data
    3. How smart is your domain model?
    (see http://www.martinfowler.com/bliki/AnemicDomainModel.html)

    - no logic at all
    - some logic (e.g. validations, domain logic not related to the entire domain model - just some child entities)
    - smart (e.g. validation, logic for working with the entire model, task coordination, processing, etc.)
    For this last case, how do you handle performance (lazy loading, loading just a specific graph for the processing logic)?


Thank you,
Dragos

PS: I would also be interested in any other approaches (DAO, for example)


PostPosted: Wed Feb 22, 2006 4:56 am 
Senior

Joined: Thu Aug 25, 2005 3:35 am
Posts: 160
Okay, I really hope this thread takes off, because I would be interested in opinions as well. To kick off, here are my views on the subject matter ;-)

1. The domain model is the reason I use NH. Datasets just don't cut it when you have a lot of business logic.
The architecture is layered: a client presentation layer, a service layer where requests come in, and a process object that handles those requests (one process object per domain object): this object encapsulates most of the logic and intelligence.
The domain model classes are my persistent classes, as recommended by the Hibernate team.
Which answers question 2: my domain objects can go to the presentation layer and return with data.

3. The model has some logic: all validation (which then works on both the client and the server) and domain logic.
All real processing is done by the server-based process objects.


And I forgot: all NHibernate access is located inside DAO objects, one DAO per domain object.

The approach works really well for us.
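In code, the one-DAO-per-domain-object arrangement comes down to something like this (a minimal sketch, in Java for brevity; Invoice and the in-memory map are illustrative stand-ins - the real DAO would wrap an NHibernate session):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical domain object; in the real system this would be an
// NHibernate-mapped persistent class.
class Invoice {
    final long id;
    String customer;
    Invoice(long id, String customer) { this.id = id; this.customer = customer; }
}

// One DAO per domain object: the only place that touches the ORM.
// Here an in-memory map stands in for the NHibernate session.
class InvoiceDao {
    private final Map<Long, Invoice> session = new HashMap<>();

    void save(Invoice invoice) { session.put(invoice.id, invoice); }

    Optional<Invoice> findById(long id) { return Optional.ofNullable(session.get(id)); }

    void delete(long id) { session.remove(id); }
}
```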


PostPosted: Wed Feb 22, 2006 5:29 am 
Expert

Joined: Thu Jan 19, 2006 4:29 pm
Posts: 348
Well, my opinions:

1. Yes. Almost the same as TheShark: presentation layer, facade (server interface), command server, domain model, and data access objects (DAO). The command server is built around the command pattern. Command server objects have no direct relationship with domain objects.

The command server object's additional responsibilities are transaction handling and authorization.

2. No. I do not want to make the presentation layer aware of the domain layer's structure. For example, if I make some string field localizable in a domain model object (DMO), the presentation layer does not need to know much about it. (Only the screen where localized names are entered needs to know that fact.) Also, DMOs can use DAOs, but DAO objects are not available on the client side.

3. The domain model contains some logic - mainly validation, but also cascading changes to related objects (no matter if child, parent or associated). Still, command-processing logic is encapsulated in the command layer, and NHibernate-specific logic is in the DAOs. (So the DM does not know anything about the command being executed.)

Edit: "Mainly validation" is misleading: the DM includes validation, cascading changes (all rules to keep the DM state up to date), plus calculations.
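The command-server shape described above can be sketched like so (a minimal Java sketch with illustrative names; the log list stands in for real session/transaction handling):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal command-pattern sketch: the command server owns the cross-cutting
// concerns (transaction handling, authorization) so individual commands and
// the domain model stay unaware of them. All names are illustrative.
interface Command {
    String requiredRole();
    void execute();
}

class CommandServer {
    final List<String> log = new ArrayList<>();
    private final String currentRole;

    CommandServer(String currentRole) { this.currentRole = currentRole; }

    boolean execute(Command command) {
        if (!currentRole.equals(command.requiredRole())) {
            log.add("denied");               // authorization check
            return false;
        }
        log.add("begin-tx");                 // would open session + transaction
        try {
            command.execute();
            log.add("commit");
            return true;
        } catch (RuntimeException e) {
            log.add("rollback");
            return false;
        }
    }
}
```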

Gert


PostPosted: Wed Feb 22, 2006 9:11 am 
Regular

Joined: Tue Mar 15, 2005 12:38 pm
Posts: 73
Location: Bucharest
My comments are:

for TheShark
1. One process for each object does not cover that much. Let's take an invoice that has lines (both are entities). The process should then handle both objects and expose a facade/gateway for working with both.
2. A domain model that ends up in the presentation can perform a lot of background logic (such as cascading and lazy init) and exposes all the functionality implemented in the domain objects. In view of n-tier separation, the layers should communicate through DTOs between physical layers (and not only there).
3. See my comments on exposing logic: if you expose logic and objects between layers you're creating a tightly coupled system, and if you don't, then it's dumb :(.

Also, the presentation should not know how to assemble entities or business rules. A good presentation design, IMHO, is one where, when changing from WebForms to WinForms, we can recreate the code without copy/pasting anything other than single stateless calls (not always possible).


for gert
1. I see a better separation of concerns.
2. I totally agree with DTOs, but this raises a problem:
it's easy to say but costly to do. Creating strongly typed DTOs, assemblers, and/or any other classes just to transfer data takes a long time. If DTOs are not strongly typed (a named dictionary, for example), you're never sure where bugs will appear. So the problem is how to be very productive and still use custom objects for inter-layer exchange.
3. Your domain model


for both TheShark and gert
- How efficient is validation, for example, if it traverses lazy-loaded associations or performs some domain logic on such associations? IMHO this can be totally inefficient unless tuned carefully. But then where is this tuning done? Who should know what data is needed to perform some domain logic, so as to instruct NH to retrieve the graph of objects as optimally as possible?

I can't give solutions, at least not right now, but as soon as I shape them in my mind I will share... I hope I didn't offend anyone; it's just my opinion (or misunderstanding).

Dragos


PostPosted: Wed Feb 22, 2006 9:40 am 
Expert

Joined: Thu Jan 19, 2006 4:29 pm
Posts: 348
dado2003 wrote:
2. I totally agree with DTOs, but this raises a problem:
it's easy to say but costly to do. Creating strongly typed DTOs, assemblers, and/or any other classes just to transfer data takes a long time. If DTOs are not strongly typed (a named dictionary, for example), you're never sure where bugs will appear. So the problem is how to be very productive and still use custom objects for inter-layer exchange.

Unfortunately this is SO true... My solution was to write a custom code generator which handles the DTO definitions and also copying data from DMO to DTO. Copying data back from DTO to DMO is a little bit harder... At the moment I write this code by hand.
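The generated DMO -> DTO copying code does essentially what a runtime-reflection copier would do; here is a sketch of that idea (Java, with illustrative CustomerDmo/CustomerDto classes - a real generator would emit this code instead of reflecting at runtime):

```java
import java.lang.reflect.Field;

// Copies every field whose name and type match between source and target.
// A code generator would produce the equivalent assignments statically.
class CopyUtil {
    static void copyMatchingFields(Object source, Object target) {
        for (Field src : source.getClass().getDeclaredFields()) {
            try {
                Field dst = target.getClass().getDeclaredField(src.getName());
                if (dst.getType().equals(src.getType())) {
                    src.setAccessible(true);
                    dst.setAccessible(true);
                    dst.set(target, src.get(source));
                }
            } catch (NoSuchFieldException ignored) {
                // field not present on the DTO: simply skipped
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
    }
}

class CustomerDmo { String name = "Acme"; int internalScore = 42; }
class CustomerDto { String name; } // deliberately narrower than the DMO
```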
Quote:
- How efficient is validation, for example, if it traverses lazy-loaded associations or performs some domain logic on such associations? IMHO this can be totally inefficient unless tuned carefully. But then where is this tuning done? Who should know what data is needed to perform some domain logic, so as to instruct NH to retrieve the graph of objects as optimally as possible?


I haven't started tuning yet. But I would think that there are many possible places:
1. The place of the validation logic itself. I can instruct the DAO to retrieve as much data as needed for the current validation.
2. If that does not provide good enough performance, I can move (or actually replicate) the tuning code to upper levels, the command server being the highest.

Usually, as one command means one session (and one transaction), lazy loading works just fine. Also, I have been able to avoid the need to store DM objects in a server-side session cache. But, as I said, no serious performance tuning has been done yet...


PostPosted: Wed Feb 22, 2006 10:16 am 
Senior

Joined: Thu Aug 25, 2005 3:35 am
Posts: 160
dado2003 wrote:
for TheShark
1. One process for each object does not cover that much. Let's take an invoice that has lines (both are entities). The process should then handle both objects and expose a facade/gateway for working with both.


My 'process' objects are basically wrappers around a domain object that offer additional functionality. They exist because I do not wish to expose process logic on the client. Therefore, once they reach the server, they are immediately constructed with their domain object.
The example of an invoice: if the process object of the invoice needs to do something to the line object, it would not do it to the object directly, but to a process object it constructs with the line object.
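A minimal sketch of that wrap-then-act idea (Java, hypothetical Order/OrderLine names; a parent process never touches a child entity directly - it wraps the child in its own process object first):

```java
import java.util.ArrayList;
import java.util.List;

// Plain domain objects: no process logic, safe to ship to the client.
class OrderLine {
    double amount;
    OrderLine(double amount) { this.amount = amount; }
}

class Order {
    final List<OrderLine> lines = new ArrayList<>();
}

// Server-side process objects wrapping the entities.
class LineProc {
    private final OrderLine line;
    LineProc(OrderLine line) { this.line = line; }
    void applyDiscount(double fraction) { line.amount *= (1.0 - fraction); }
}

class OrderProc {
    private final Order order;
    OrderProc(Order order) { this.order = order; }

    void discountAllLines(double fraction) {
        for (OrderLine line : order.lines) {
            new LineProc(line).applyDiscount(fraction); // wrap the child, then act
        }
    }
}
```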



dado2003 wrote:
2. A domain model that ends up in the presentation can perform a lot of background logic (such as cascading and lazy init) and exposes all the functionality implemented in the domain objects. In view of n-tier separation, the layers should communicate through DTOs between physical layers (and not only there).

I agree on your points, and if I had been so naive as to implement my domain objects the way you describe, I would indeed have had to use DTOs.
However, like I said, my domain objects do not expose much functionality except acting on their own state and validation. So I don't feel I have much functionality to expose, nor does our situation require such separation.
It will never trigger any lazy inits, since I do not use the NHibernate proxies on the server side. (I'm passing the objects over a webservice and do not have NHibernate references on the clients.)

dado2003 wrote:
3. See my comments on exposing logic: if you expose logic and objects between layers you're creating a tightly coupled system, and if you don't, then it's dumb :(.

You are starting to offend just slightly here. ;-)
But yes, we are building a system where we have both client and server under our control. All endpoints will be used by our own clients. When we start to expose certain webservices for partners, they will not be based on our domain objects...

dado2003 wrote:
Also, the presentation should not know how to assemble entities or business rules. A good presentation design, IMHO, is one where, when changing from WebForms to WinForms, we can recreate the code without copy/pasting anything other than single stateless calls (not always possible).

We are building a big system and need a rapid development experience. Business rules that are bigger than simple type validation are delegated to the process objects on the server.
Although we are not planning on porting to ASP.NET WebForms (the client is Avalon, btw; we are planning on delivering WBAs), the client consists of extremely simple markup, all done in XML. So porting to the web should not be a problem at all.

dado2003 wrote:
2. I totally agree with DTOs, but this raises a problem:
it's easy to say but costly to do. Creating strongly typed DTOs, assemblers, and/or any other classes just to transfer data takes a long time. If DTOs are not strongly typed (a named dictionary, for example), you're never sure where bugs will appear. So the problem is how to be very productive and still use custom objects for inter-layer exchange.

Our load tests pointed to the reflection mechanism in NHibernate as being the slowest part. Now that it uses dynamic emits to fill the objects, it should be much better.
IMHO there are only a few situations where you cannot use your domain objects as DTOs. It all depends on how you set things up and what you let the domain object do. Let's face it, most of the data structure that would define your DTO _is_ in your domain object. So I would always recommend staying away from DTOs unless you really need them.


dado2003 wrote:
for both TheShark and gert
- How efficient is validation, for example, if it traverses lazy-loaded associations or performs some domain logic on such associations? IMHO this can be totally inefficient unless tuned carefully. But then where is this tuning done? Who should know what data is needed to perform some domain logic, so as to instruct NH to retrieve the graph of objects as optimally as possible?


This was answered in previous points. If I were to traverse an association that could be lazy-loaded on the client, I would find an empty (not null) collection. Validation on this end of the spectrum is done on the server.


PostPosted: Wed Feb 22, 2006 11:33 am 
Regular

Joined: Tue Mar 15, 2005 12:38 pm
Posts: 73
Location: Bucharest
First of all, this shows two things:
- first, I jumped to conclusions too soon
- second, you weren't very clear with the explanation the first time
(let's split the guilt :) )



TheShark wrote:
dado2003 wrote:
for TheShark
1. One process for each object does not cover that much. Let's take an invoice that has lines (both are entities). The process should then handle both objects and expose a facade/gateway for working with both.


My 'process' objects are basically wrappers around a domain object that offer additional functionality. They exist because I do not wish to expose process logic on the client. Therefore, once they reach the server, they are immediately constructed with their domain object.
The example of an invoice: if the process object of the invoice needs to do something to the line object, it would not do it to the object directly, but to a process object it constructs with the line object.


OK, so I get it that the union process+entity is in fact an "object" in the domain model, and not the entity by itself...
Part of my first answer was because I use the term process in a slightly different context (internal framework habits).
But doesn't this introduce just too many classes? My approach is similar to yours on this, but I struggle to make the entities smarter, so as to "code less but develop more".


TheShark wrote:
dado2003 wrote:
2. A domain model that ends up in the presentation can perform a lot of background logic (such as cascading and lazy init) and exposes all the functionality implemented in the domain objects. In view of n-tier separation, the layers should communicate through DTOs between physical layers (and not only there).



I agree on your points, and if I had been so naive as to implement my domain objects the way you describe, I would indeed have had to use DTOs.
However, like I said, my domain objects do not expose much functionality except acting on their own state and validation. So I don't feel I have much functionality to expose, nor does our situation require such separation.
It will never trigger any lazy inits, since I do not use the NHibernate proxies on the server side. (I'm passing the objects over a webservice and do not have NHibernate references on the clients.)


I read your answer very carefully; the non-lazy part was not mentioned, and I also figured that the entities exposed more logic. But IMHO laziness is too good a feature to ignore, and such a model (one that uses lazy loading) cannot be exposed in the presentation. Question: if you don't use lazy loading, does this mean that you map FKs as values or as entities? If they are entities that are not lazy, doesn't this trigger just too many joins/selects?
I'm fully using lazy and fetch styles to create a default behavior for the model, and then overriding them if necessary in queries to shape the resultset for the best DB performance.

TheShark wrote:
dado2003 wrote:
3. See my comments on exposing logic: if you expose logic and objects between layers you're creating a tightly coupled system, and if you don't, then it's dumb :(.

You are starting to offend just slightly here. ;-)
But yes, we are building a system where we have both client and server under our control. All endpoints will be used by our own clients. When we start to expose certain webservices for partners, they will not be based on our domain objects...


Sorry for that, I didn't mean to...
For me, keeping entities in the domain model and DTOs for sending/receiving data is better because I can shape results (cut/add properties, collections, etc.). Another reason to use custom DTOs is that we map a full graph (almost all collections are mapped, and *all* many-to-one references). This gives me the liberty to do everything on the middle tier(s) and exchange only specific, dumb (sometimes very dumb) data with the presentation.

TheShark wrote:
dado2003 wrote:
Also, the presentation should not know how to assemble entities or business rules. A good presentation design, IMHO, is one where, when changing from WebForms to WinForms, we can recreate the code without copy/pasting anything other than single stateless calls (not always possible).

We are building a big system and need a rapid development experience. Business rules that are bigger than simple type validation are delegated to the process objects on the server.
Although we are not planning on porting to ASP.NET WebForms (the client is Avalon, btw; we are planning on delivering WBAs), the client consists of extremely simple markup, all done in XML. So porting to the web should not be a problem at all.


The discussion was not specifically about how the presentation is built, but rather about how much the presentation knows about "internal affairs", and whether a method in the presentation usually does more than validating some input and making 1-2 calls to the middle tier.

TheShark wrote:
dado2003 wrote:
2. I totally agree with DTOs, but this raises a problem:
it's easy to say but costly to do. Creating strongly typed DTOs, assemblers, and/or any other classes just to transfer data takes a long time. If DTOs are not strongly typed (a named dictionary, for example), you're never sure where bugs will appear. So the problem is how to be very productive and still use custom objects for inter-layer exchange.

Our load tests pointed to the reflection mechanism in NHibernate as being the slowest part. Now that it uses dynamic emits to fill the objects, it should be much better.
IMHO there are only a few situations where you cannot use your domain objects as DTOs. It all depends on how you set things up and what you let the domain object do. Let's face it, most of the data structure that would define your DTO _is_ in your domain object. So I would always recommend staying away from DTOs unless you really need them.


It was not about the time to fill DTOs with data, but about the time to code DTOs (create classes/structs, properties, maintain them). This is what I want to tackle. Also, for me there are enough cases where editing/viewing is done on a graph involving 3-4 classes, so that exposing such a graph is harder to use/maintain than some simple views with just the needed data.

TheShark wrote:
dado2003 wrote:
for both TheShark and gert
- How efficient is validation, for example, if it traverses lazy-loaded associations or performs some domain logic on such associations? IMHO this can be totally inefficient unless tuned carefully. But then where is this tuning done? Who should know what data is needed to perform some domain logic, so as to instruct NH to retrieve the graph of objects as optimally as possible?

This was answered in previous points. If I were to traverse an association that could be lazy-loaded on the client, I would find an empty (not null) collection. Validation on this end of the spectrum is done on the server.



Yep, we just have different views on this lazy problem.

So, as a conclusion:
- I see more benefits with lazy loading (although it can complicate the problem)
- I didn't understand whether you map associations to entities or just the value of the FK
- I want to "code less, accomplish more" (this is in fact contrary to all the DTOs I'm writing about; this is why they are just structs with public members); your approach is verbose on the process part, mine on the DTOs
- It seems too much to code in order to encapsulate each entity within a process, and I guess that if Order has a collection of Lines then OrderProc has a collection of LineProcs... again, (if it is so) isn't it error-prone if you're not very careful with these?

Sorry again, and don't get me wrong :)


PostPosted: Wed Feb 22, 2006 11:46 am 
Regular

Joined: Tue Mar 15, 2005 12:38 pm
Posts: 73
Location: Bucharest
Gert,

My solution on DTOs (not yet applied because I'm still evaluating it):
- create view DTOs (for grids/lists) using select new, or by performing a transformation on results; the DTOs are just structs with public members
- create edit DTOs (for updating/inserting) either manually for special cases, or using a sort of UI mapper to create some structure that can be easily and automatically translated on the other side of the wire.
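The view-DTO half of that can be sketched minimally (Java; the stream stands in for the HQL query - in HQL this would read roughly `select new InvoiceRow(i.number, i.customerName, i.total) from Invoice i` - and InvoiceRow is the "struct with public members"; all names are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

// The flat row the grid binds to: no graph, no lazy anything.
class InvoiceRow {
    public final String number;
    public final String customerName;
    public final double total;
    InvoiceRow(String number, String customerName, double total) {
        this.number = number;
        this.customerName = customerName;
        this.total = total;
    }
}

// Stand-in for the mapped entity the query would select from.
class InvoiceEntity {
    String number; String customerName; double total;
    InvoiceEntity(String n, String c, double t) { number = n; customerName = c; total = t; }
}

class ViewDtoQuery {
    // projects entities straight into view rows via the constructor
    static List<InvoiceRow> listRows(List<InvoiceEntity> entities) {
        return entities.stream()
                .map(e -> new InvoiceRow(e.number, e.customerName, e.total))
                .collect(Collectors.toList());
    }
}
```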

About validations:
Yes, lazy loading works fine, but it is a pitfall in a big team: I have seen pages that generate 70-80 selects :(... try figuring that out... If the higher level brings in more data in order to ease the validation process and gain performance, then this makes the two layers (the command server and the DM) just too tightly coupled... I see two possibilities here:
- DM objects describe the needed graph in a generic way (but here I see a maintenance nightmare)
- DM objects are dumb and contain just general validation rules (context validation is done in upper layers, as well as task coordination and processing)... as Shark said, although his solution is more than this

My approach is something like the second option: processes work with an aggregate of entities and know about validation with regard to state (e.g. an invoice is valid to save but not valid to be shipped, so we have two contexts), processing strategies, and more. But, as I told Shark, I'm trying to tackle this and improve it...


PostPosted: Wed Feb 22, 2006 2:38 pm 
Senior

Joined: Thu Aug 25, 2005 3:35 am
Posts: 160
dado2003 wrote:
First of all, this shows two things:
- first, I jumped to conclusions too soon
- second, you weren't very clear with the explanation the first time
(let's split the guilt :) )


;-)



dado2003 wrote:
OK, so I get it that the union process+entity is in fact an "object" in the domain model, and not the entity by itself...
Part of my first answer was because I use the term process in a slightly different context (internal framework habits).
But doesn't this introduce just too many classes? My approach is similar to yours on this, but I struggle to make the entities smarter, so as to "code less but develop more".


Yes, it does introduce a lot of classes. In our previous project, we used a horizontal layering (with datasets). We basically had 3 objects per information area (comprising many, many datasets): a data, a process and a service layer.
There we saw an incredible number of methods on each object, which really works against you (not to mention being the opposite of OO).
So time will tell whether we end up with too many classes with this new approach and whether that will bite us just as much. At the very least we will have good encapsulation ;-) For now, we don't consider it a problem.
To help in constructing these objects, we use a container (Castle). This helps as well.


dado2003 wrote:
I read your answer very carefully; the non-lazy part was not mentioned, and I also figured that the entities exposed more logic. But IMHO laziness is too good a feature to ignore, and such a model (one that uses lazy loading) cannot be exposed in the presentation. Question: if you don't use lazy loading, does this mean that you map FKs as values or as entities? If they are entities that are not lazy, doesn't this trigger just too many joins/selects?
I'm fully using lazy and fetch styles to create a default behavior for the model, and then overriding them if necessary in queries to shape the resultset for the best DB performance.

Indeed, I was a bit short in my first explanation.
Basically, what we do is this: we map the way we want (using lazy loading and everything). So we can query and use our objects on the server, just like you can. For us, laziness is also a feature too good to ignore, so we make great use of it. (I should mention the focus of the application is on long business processes on the server.)
When it is time to get data to the client, we use the (de-)serialization capabilities of Indigo (as opposed to remoting, which we used in previous projects). We simply check whether a collection is lazy-loaded, and if so, we do not serialize it, thereby not triggering lazy loading when we are at the client.
On the client we automatically build up the domain objects again with empty collections where the non-initialized proxies used to be.

The only problem we have here is with building up collections on the server again when we return there. So we keep track of deleted entities and manually do a session.Delete on them.
What I am really missing in NHibernate at this moment is a good way to manipulate the NHibernate bag etc.
We are still working on a better solution for this.
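The don't-serialize-uninitialized-collections trick can be sketched like this (Java; the LazyList class and its initialized flag are illustrative stand-ins for the ORM's persistent-collection wrapper and its is-initialized check):

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for a lazily loaded persistent collection: before sending the
// owning object to the client, the serializer asks isInitialized() and
// replaces an unloaded collection with an empty one, so serialization can
// never trigger a load.
class LazyList<T> {
    private List<T> contents;            // null until 'loaded'

    boolean isInitialized() { return contents != null; }

    List<T> get() {                      // simulates lazy initialization
        if (contents == null) contents = new ArrayList<>();
        return contents;
    }

    // what the serializer sends: the data if loaded, an empty list if not
    List<T> forWire() { return isInitialized() ? contents : new ArrayList<>(); }
}
```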



dado2003 wrote:
Sorry for that, I didn't mean to...
For me, keeping entities in the domain model and DTOs for sending/receiving data is better because I can shape results (cut/add properties, collections, etc.). Another reason to use custom DTOs is that we map a full graph (almost all collections are mapped, and *all* many-to-one references). This gives me the liberty to do everything on the middle tier(s) and exchange only specific, dumb (sometimes very dumb) data with the presentation.

This is a trade-off.
I'm guessing using custom DTOs (per UI screen, possibly?) could be easier.

I'm thinking of going for a load-strategy enum per root aggregation object that will be called. The load strategy will map 1-1 onto named queries, filling exactly the right amount of the graph.
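A sketch of that load-strategy enum (Java; the HQL strings and the Order aggregate are illustrative - the point is only that callers pick how much of the graph to fetch without knowing the fetch details):

```java
import java.util.EnumMap;
import java.util.Map;

// One enum per root aggregate; each value maps 1-1 onto a named query.
enum OrderLoadStrategy { HEADER_ONLY, WITH_LINES, FULL_GRAPH }

class OrderQueries {
    private static final Map<OrderLoadStrategy, String> NAMED_QUERIES =
            new EnumMap<>(OrderLoadStrategy.class);
    static {
        NAMED_QUERIES.put(OrderLoadStrategy.HEADER_ONLY,
                "from Order o where o.id = :id");
        NAMED_QUERIES.put(OrderLoadStrategy.WITH_LINES,
                "from Order o join fetch o.lines where o.id = :id");
        NAMED_QUERIES.put(OrderLoadStrategy.FULL_GRAPH,
                "from Order o join fetch o.lines join fetch o.customer where o.id = :id");
    }

    // the DAO would run this query; callers only choose the strategy
    static String queryFor(OrderLoadStrategy strategy) {
        return NAMED_QUERIES.get(strategy);
    }
}
```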

dado2003 wrote:
The discussion was not specifically about how the presentation is built, but rather about how much the presentation knows about "internal affairs", and whether a method in the presentation usually does more than validating some input and making 1-2 calls to the middle tier.

Okay, I misunderstood then. There is a possibility I still do, actually ;-)

Our client is mostly about editing data and saving. The validation is pretty lightweight, but that will not always be the case.
The way I connect validation to properties on the data model is via metadata. (I used to use attributes, but that was a bit chaotic; now I use the same pattern that Microsoft does, which is to query the object for its validation rules and let it spit them out.) If I come across specific validation rules that I do not want to perform on the client, I will most likely create two classes of validation rules, client-side and server-side. The server would validate both the client rules and the server rules.
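Querying the object for its rules, with a per-rule client/server split, can be sketched like this (Java, hypothetical Customer and rule names):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;

// A rule carries its check plus a flag saying whether it is safe to run
// on the client; the server runs everything.
class Rule<T> {
    final String description;
    final boolean clientSide;
    final Predicate<T> check;
    Rule(String description, boolean clientSide, Predicate<T> check) {
        this.description = description;
        this.clientSide = clientSide;
        this.check = check;
    }
}

class Customer {
    String name = "";
    int creditLimit;

    // the object 'spits out' its validation rules on request
    List<Rule<Customer>> validationRules() {
        return Arrays.asList(
            new Rule<>("name required", true, c -> !c.name.isEmpty()),
            new Rule<>("credit limit approved", false, c -> c.creditLimit <= 10000));
    }

    // the client runs only client-side rules; the server runs all of them
    boolean validate(boolean onServer) {
        return validationRules().stream()
                .filter(r -> onServer || r.clientSide)
                .allMatch(r -> r.check.test(this));
    }
}
```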



dado2003 wrote:
It was not about the time to fill DTOs with data, but about the time to code DTOs (create classes/structs, properties, maintain them). This is what I want to tackle. Also, for me there are enough cases where editing/viewing is done on a graph involving 3-4 classes, so that exposing such a graph is harder to use/maintain than some simple views with just the needed data.


Okay, I can see the use of that. It was a major consideration for me as well. I have big object graphs too, but per screen I do not per se want to be bothered with large parts of those graphs.
I recall there is a transformation tool for DO <-> DTO. It was being built by the people behind NPersist, but I can't find a link for it right now.
I have the feeling, though, that you are about to present such a tool yourself soon... ? :-)


dado2003 wrote:
- It seems too much to code in order to encapsulate each entity within a process, and I guess that if Order has a collection of Lines then OrderProc has a collection of LineProcs... again, (if it is so) isn't it error-prone if you're not very careful with these?

Yep ;-)
It's not an automatic process, btw. So I will just create a LineProc object when I'm in OrderProc and need to do something on a line.

It's been an interesting discussion. I'm off on vacation tomorrow; I will be back on Tuesday.
I wish there were more best practices and design considerations like these written down...


 Post subject: Business Rules and lazy loading
PostPosted: Wed Jan 24, 2007 11:40 am 
Newbie

Joined: Tue Dec 05, 2006 10:51 am
Posts: 1
Hello,

I just love digging up old threads :)

I'm exposing services to smart clients (.NET) and 'other world' clients (Java/VB) as webservices, so I'm going the DTO / Remote Facade / Domain Model way.

One thing that worries me is that my business rules may end up triggering stale-object exceptions when a cascading rule is fired on a lazy domain model object.

Let's consider the following code:

Code:
namespace DomainModel
{
  public class A
  {
    int _foo;
    int _bar;
    B _relation;

    public int Foo { /* standard get/set */ }

    public int Bar
    {
      get
      {
        return this._bar;
      }
      set
      {
        // 'trigger' business rule:
        if (this._bar != value)
        {
          this._relation.BarHasChanged(_bar, value); // _relation may be lazily loaded
          this._bar = value;
        }
      }
    }
  }
}

a.Foo = 1; // does not trigger any rule
a.Bar = 2; // triggers a cascading rule and a lazy-loader initialization on a 'dirty' object


So here's my question to you guys: how do you implement 'trigger' or 'cascading' business rules in your domain model when using NHibernate?

Another approach, as I plan on using declarative rules, is a two-step rule execution:
1/ ask the rule to preload the data it will need to work on
2/ perform the operation
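That two-step execution can be sketched like so (Java, illustrative names; the preloaded set stands in for a fetch-join query that the runner would issue inside the session before the rule fires, so no lazy load can trigger mid-rule on a dirty object):

```java
import java.util.HashSet;
import java.util.Set;

// Step 1: the rule declares which associations it needs.
// Step 2: only after they are preloaded does the rule run.
interface TwoStepRule<T> {
    Set<String> associationsNeeded();
    void apply(T target);
}

class RuleRunner {
    final Set<String> preloaded = new HashSet<>();

    <T> void run(TwoStepRule<T> rule, T target) {
        // in the real system this would be a fetch-join query via NH,
        // executed inside the same session/transaction as the rule
        preloaded.addAll(rule.associationsNeeded());
        rule.apply(target);
    }
}
```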

Of course, I have considered ditching lazy loading altogether, but still, it's a neat toy.

Cheers,






© Copyright 2014, Red Hat Inc. All rights reserved. JBoss and Hibernate are registered trademarks and servicemarks of Red Hat, Inc.