I have a coarse-grained object, Review, that is configured by the user in a series of steps:
1. Create a Review by defining its scope (product lines, etc.) and identity (author, date, etc.).
2. Upload XML data files.
3. Set some comparison parameters.
4. Apply a series of rules to each item under review (Bulk).
5. Page through the items that failed to meet the rules, and approve or reject each one.
6. Generate reports: summary statistics, effects of the changes, etc.
7. Publish results.
Each step is implemented as a 'wizard' step. The process boils down to defining a fairly large working set, applying some automation to narrow it, and ultimately scrutinizing and resolving the remaining small set of items by hand.
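For concreteness, here is a rough sketch of the model as I picture it; the entity and field names (ReviewItem, failed, approved) are illustrative rather than my actual mapping:

```java
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import javax.persistence.*;

@Entity
public class Review {
    @Id @GeneratedValue
    private Long id;

    // Step 1: identity and scope
    private String author;
    private Date created;
    private String scope;            // e.g. a product-line filter

    // Steps 2 and 4 populate and flag this (potentially very large) set
    @OneToMany(mappedBy = "review", cascade = CascadeType.ALL)
    private List<ReviewItem> items = new ArrayList<ReviewItem>();

    private int lastCompletedStep;   // wizard progress
    // getters/setters omitted
}

@Entity
public class ReviewItem {
    @Id @GeneratedValue
    private Long id;

    @ManyToOne
    private Review review;

    private boolean failed;          // set by the bulk rules (step 4)
    private Boolean approved;        // hand decision from step 5; null = undecided
    // getters/setters omitted
}
```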
Each step has some checks that must pass before the user can continue, and moving backwards invalidates all subsequent steps (for example, re-applying the rules in step 4 erases the hand-tuned decisions from step 5).
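The navigation rule, in sketch form (WizardStep and the method names are hypothetical):

```java
import java.util.List;

// Hypothetical step contract.
public interface WizardStep {
    boolean canLeave(Review review);   // the checks that gate "Next"
    void invalidate(Review review);    // throw away this step's work
}

public class ReviewWizard {
    private final List<WizardStep> steps;

    public ReviewWizard(List<WizardStep> steps) {
        this.steps = steps;
    }

    // Going back to an earlier step wipes everything after it, which is
    // why re-running the rules destroys the approve/reject decisions.
    public void goBackTo(int target, Review review) {
        for (int i = steps.size() - 1; i > target; i--) {
            steps.get(i).invalidate(review);
        }
    }
}
```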
Because the rules are applied to a large set, the items are processed efficiently with either bulk SQL or session eviction. Because the subset of failed items may itself be large, paging is used to navigate the lazy collection.
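Concretely, the two bulk strategies and the paging look roughly like this (the rule predicates, batch size, and the Rules callback are placeholders):

```java
import java.util.List;
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.Transaction;

public class BulkStep {
    // Placeholder for the real rule engine.
    interface Rules { boolean fails(ReviewItem item); }

    // Strategy 1: bulk HQL. Fast, but it bypasses the first-level cache.
    void applyRulesWithBulkSql(Session session, Long reviewId) {
        Transaction tx = session.beginTransaction();
        session.createQuery(
                "update ReviewItem i set i.failed = true "
                + "where i.review.id = :id and i.approved is null")  // placeholder rule
            .setParameter("id", reviewId)
            .executeUpdate();
        tx.commit();
    }

    // Strategy 2: iterate and evict, so rules written in Java can run
    // without the session growing without bound.
    void applyRulesWithEviction(Session session, Long reviewId, Rules rules) {
        Transaction tx = session.beginTransaction();
        ScrollableResults items = session.createQuery(
                "from ReviewItem i where i.review.id = :id")
            .setParameter("id", reviewId)
            .scroll(ScrollMode.FORWARD_ONLY);
        int n = 0;
        while (items.next()) {
            ReviewItem item = (ReviewItem) items.get(0);
            item.setFailed(rules.fails(item));
            if (++n % 100 == 0) {   // flush and clear in batches
                session.flush();
                session.clear();
            }
        }
        tx.commit();
    }

    // Step 5: page through the failures rather than loading them all.
    @SuppressWarnings("unchecked")
    List<ReviewItem> failedPage(Session session, Long reviewId, int page, int size) {
        return session.createQuery(
                "from ReviewItem i where i.review.id = :id and i.failed = true")
            .setParameter("id", reviewId)
            .setFirstResult(page * size)
            .setMaxResults(size)
            .list();
    }
}
```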
The question is how to synchronize the bulk changes with the conventional Hibernate collections, and what a suitable session strategy would be: one session per desktop application instance, or one per step?
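To make the first half of the question concrete: after a bulk update, entities already loaded into the session (including the Review's initialized items collection) are stale. A minimal illustration, reusing the hypothetical model above:

```java
import org.hibernate.Session;
import org.hibernate.Transaction;

public class SyncProblem {
    void demonstrate(Session session, Long reviewId) {
        Review review = (Review) session.get(Review.class, reviewId);
        review.getItems().size();   // initialize the lazy collection

        // Bulk HQL goes straight to the database, bypassing the
        // first-level cache and any loaded collections...
        Transaction tx = session.beginTransaction();
        session.createQuery(
                "update ReviewItem i set i.failed = false where i.review.id = :id")
            .setParameter("id", reviewId)
            .executeUpdate();
        tx.commit();

        // ...so review.getItems() still reflects the pre-update state here.
        // session.refresh(item) per item, or session.clear() plus a re-load,
        // would resynchronize, but each interacts very differently with a
        // session held open for the whole wizard versus one opened per step.
    }
}
```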