Hi, I have an object with a fairly complex domain model, i.e. the usual collections, subclasses and associations going at least two or three levels deep. I am using foreign keys by and large to form the appropriate associations.

We have a requirement to be able to recreate the DB at any point in time, so whenever an update is performed on a subset of data, we need to create a snapshot of the object as it was. I can do this if I deep-copy the object, nulling the primary keys and saving the new version (I assume!). New primary key and foreign key relationships will be set up so that I can recreate the new version of the object as well as the old one.

Keeping all the data in the one table has the advantages of:

a) not having to define a similar history schema
b) not having to map two sets of objects
c) not having to work with two sets of POJOs

However, I'd like to avoid duplication of data somehow, as the amount duplicated grows quickly with each extra level in the object graph.
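To make the deep-copy idea concrete, here's a minimal sketch of what I mean by "nulling the primary keys" down the graph. The `Order`/`LineItem` entities are made up for illustration, not our real model; in practice the copy would have to recurse through every collection and association:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative entities only -- not our actual domain model.
class LineItem {
    Long id;          // primary key
    String sku;
    LineItem(Long id, String sku) { this.id = id; this.sku = sku; }
}

class Order {
    Long id;          // primary key
    String customer;
    List<LineItem> items = new ArrayList<>();
    Order(Long id, String customer) { this.id = id; this.customer = customer; }

    // Deep-copy the graph, nulling every primary key so that saving the
    // copy causes an INSERT of fresh rows at each level, leaving the old
    // rows (the snapshot) untouched.
    Order copyForSnapshot() {
        Order copy = new Order(null, this.customer);
        for (LineItem item : this.items) {
            copy.items.add(new LineItem(null, item.sku));
        }
        return copy;
    }
}
```

The obvious downside, as above, is that the entire graph is duplicated on every save even if only one field changed.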
When the user wants to update the current version of the object, I think the sequence of events will be:

a) retrieve the current version (by date query or version number, whichever)
b) allow the user to make updates in the UI
c) instead of updating the object, create a new copy and save it

Without doing a deep compare with the previous version, I don't see how I can avoid data duplication here, and I'd like to avoid a field-by-field comparison (wouldn't Hibernate be better placed to know whether an update occurred?).

If anyone has anything helpful to say, please do; I'm sure it's a common problem. We were also considering Envers, but weren't sure it was mature enough to handle anything we threw at it, so if people have had a good experience with it, I'd be interested.

Many thanks, Baljeet.
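For context, the Envers setup we were considering is roughly the following. This is just a configuration sketch (the `Order` entity is illustrative); the point is that Envers writes changes to separate `_AUD` tables keyed by revision number, so only entities that actually changed get a new audit row, rather than the whole graph being re-saved:

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.envers.Audited;

// Sketch only: annotating the entity with @Audited tells Envers to
// record every change in an Order_AUD table, tagged with a revision.
@Entity
@Audited
public class Order {

    @Id
    private Long id;

    private String customer;

    // collections and associations would also need to be audited
    // (or explicitly excluded) for a full point-in-time reconstruction
}
```

Reading a historical version back would then go through `AuditReaderFactory` / `AuditReader.find(...)` rather than a hand-rolled date query, if I understand the API correctly.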