 Post subject: Public Draft Feedback
PostPosted: Thu Feb 12, 2009 3:27 pm 
Newbie

Joined: Thu Apr 15, 2004 5:53 pm
Posts: 13
Sorry, it's long, so it's taken me some time to get this in.

General Issues
==============
For the most part, these are cross-cutting concerns rather than issues specific to a particular item in the document.

Bootstrapping
-------------
My biggest issue with the specification is in the bootstrapping. First and foremost, it doesn't match how I'd likely want to use the API, and it's a lot of extra weight to the spec. The bootstrapping mechanism could vary significantly from provider to provider, and it seems like having standard bootstrapping is less important.

Further, this standardized bootstrapping approach has been used in other APIs in the past, notably XML parsers, and I think it's caused as many problems as it's solved. It's also confusing. Will new users really understand the differences between ValidatorFactory, ValidatorBuilder, ValidatorFactoryBuilder, ValidationProvider, ValidationProviderResolver, ValidationFactoryConfiguration, etc.?

If I were writing a Validator implementation, I'd probably be assuming that my validators, or at least my validator factories, could be injected (by Spring, by the container, by the new Java Contexts and Dependency Injection). The method of assembling the validator factory and/or the validator then becomes largely irrelevant, and being forced to do it a particular way seems a little silly.

If I were to fold validation into Spring, for instance, I'd want a simple set of beans that could be assembled to spit out a validator, a Validator Factory Bean that could be optionally configured with a message resolver and traversal strategy, and/or a particular context of beans to validate, and that'd be enough.

I could live with an optional bootstrapping that could be used, but I don't think it's necessary as part of the core specification, and as a potential consumer of the API, this isn't how I'd choose to bootstrap validation if I had a choice.

It would also reduce the weight of the specification somewhat, as it might not be necessary to specify interfaces for a number of things that are currently defined.

This specification feels as if it's destined to appear in literature for things like Spring and Grails about how using technology <X> makes it easier to use Java Bean Validation without all the providers and resolvers. :/

The crowning moment is on Page 64, "GenericBuilderFactory builds a generic ValidatorFactoryBuilder using the first ValidationProvider returned by ValidationProviderResolution and calling ValidatorFactoryBuilder<?> createGenericValidatorFactoryBuilder(BootstrapState state). BootstrapState holds the ValidationProviderResolution instance passed to GenericBuilderFactory and will be used by the ValidatorFactoryBuilder instance when resolving the provider to use." This reminded me of: Why I Hate Frameworks (http://discuss.joelonsoftware.com/defau ... .219431.12)


Multiple Validation Implementations
-----------------------------------
This spec doesn't seem to address the idea that more than one validation implementation could be used in a particular project. In particular, if you consume a library which uses one validation implementation and your own code uses a different one, will the bootstrapping support this kind of use?

Type Safety
-----------
Constraints are likely to be applicable to some subset of the overall type system in Java. For instance, a generalized @Max constraint is unlikely to know how to apply itself to a project-specific Customer class.

It doesn't seem as if the spec addresses this in any great detail. Am I forgetting something? What's the suggested approach, to simply assume that a constraint is valid if you don't know how to apply it? Assume invalidity? Throw exceptions? It would be nice if the constraints themselves could trigger compile-time failures, but I don't see any easy way to make that happen, at least not without getting involved in the compile process, which seems like an implementation-specific decision.

Compatibility
-------------
Some of the parsing that may be necessary when you're dealing with multi-valued constraints, composite constraints, overridden parameters, and overlapping groups could be pretty heavyweight and complicated, and there are a lot of implementation approaches, from build-time code generation to runtime bytecode generation to reflection (with caching). Is there some kind of TCK to ensure that implementors have managed to put all those things together in the right way (and, perhaps more importantly for portability, in the same way)?

Cycles
------
Nothing prevents constraints from being developed in cycles, including overriding parameters in cycles; presumably this is something that implementors should watch for?
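
To make the worry concrete, here's a minimal sketch of a composition cycle that the compiler happily accepts (hypothetical annotations, with the @ConstraintValidator plumbing elided):

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Two hypothetical composed constraints that compose each other. Java is happy
// to compile mutually-annotated annotation types, so the cycle only surfaces
// when a provider tries to expand the composition.
@Retention(RetentionPolicy.RUNTIME)
@Beta
@interface Alpha {
    String message() default "alpha";
}

@Retention(RetentionPolicy.RUNTIME)
@Alpha
@interface Beta {
    String message() default "beta";
}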

Conversion/Canonicalization
---------------------------
Some validation systems involve canonicalization and conversion. For instance, converting an address to a certain form during the validation. From what I read in this spec, this JSR isn't meant to cover that sort of thing, and in fact, is almost forbidden in some areas (e.g. Constraint Validation implementation shouldn't change values). I'm ok with that, but I wouldn't mind seeing that expressly called out somewhere.

Ordering
--------
Although it's difficult when working with large specs like this to only reference things that you've already talked about or to always make forward references when you need to, I'd say there are some elements where I ran into a term or a class name and had to mentally file it away as something I didn't yet understand, and which were explained later. For instance, on Page 13, there's a reference to 'unexpanded' messages, and while my guess about what that meant was correct, it was distracting to see a reference to something that the spec hadn't already talked about -- message expansion/resolution.

That example in particular is awkward because message expansion is also described as message interpolation and message resolution, so even when you catch up to the details, you're not sure if what you're reading now is the same as what was referred to before.

Similarly, the XML configuration file is mentioned in passing within section 4.4.1, although there's been no talk of an XML configuration file up to that point that I can recall.

Innovation by Specification
---------------------------
To be honest, I'd be happy just to see a reference implementation of this release and used in this form for a while before we lock it down as a standard -- it doesn't feel like something that's had enough public use yet to really know that we've found the right flex points. The spec authors may well disagree on this point, of course. ;)



Specific Issues
===============
Relating to a particular item in the spec, for the most part.

Orthogonal @NotNull
-------------------
Example 2.5 on page 8 implies using @NotNull as a composing constraint. Although the specification suggests this isn't a great approach in several other places, I'd personally prefer this example to change for that reason, because it's not generally a good idea to compose your constraint with @NotNull, and this example vaguely implies that. It's not hard to suggest alternate examples.

Frankly, the paragraph on Page 12 that talks about this could be moved forward; it feels out of place on Page 12 and missing on Page 8.

For that matter, on Page 14, talking about example 2.12, it would probably be nice to reference this point briefly in case anyone misses why the example returns true on a null parameter.
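
For example, something along these lines would do as a replacement (a sketch only: the composition style and attribute names follow the draft's examples, ZipCodeValidator is a made-up implementation class, and the javax.validation imports are elided):

import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;

// A composed constraint that says nothing about null-ness; callers add
// @NotNull separately on the properties that actually need it.
@Size(min = 5, max = 5)
@Pattern(regex = "[0-9]*")
@ConstraintValidator(ZipCodeValidator.class)
@Target({ METHOD, FIELD })
@Retention(RUNTIME)
public @interface FrenchZipCode {
    String message() default "wrong zip code";
    Class<?>[] groups() default {};
}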

@ReportAsViolationFromCompositeConstraint
-----------------------------------------
Page 8, Example 2.6 and the preceding paragraph.

Awkward name for the annotation; I feel like there's gotta be a better name, although I haven't yet come up with something I'm happy with. It doesn't help that I don't find the Composing/Composite a very natural distinction, so something like @ReportComposingConstraintViolationsAsCompositeConstraintViolation seems pretty awkward still, as well as being significantly longer.

Does this apply to nested composite constraints? That is, if @FrenchZipCode is itself annotated with both @ReportAsViolationFromCompositeConstraint and @NotEmpty, and @NotEmpty is annotated with @NotNull, does that mean that a violation of @NotNull would be reported as a violation of @NotEmpty? And what if @NotEmpty has @ReportAsViolationFromCompositeConstraint but @FrenchZipCode does not? Does the constraint violation get reported as a @NotEmpty?

What if you want to report some of the composing constraints as violation of the composite, but not all of them? In the example listed, what if you wanted @Size to be reported as a violation of @FrenchZipCode, but @NotNull to continue to be reported as @NotNull?

What happens if you're using this annotation and multiple composing constraints fail? Will each be reported as an individual violation of the composite?
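
For concreteness, the nesting I'm asking about would look roughly like this (a sketch; @NotEmpty here is written out as a composed constraint, and the target/retention declarations plus javax.validation imports are elided):

// @NotEmpty composed of @NotNull and @Size(min = 1); note that it does NOT
// carry @ReportAsViolationFromCompositeConstraint itself.
@NotNull
@Size(min = 1)
@interface NotEmpty {
    String message() default "may not be empty";
    Class<?>[] groups() default {};
}

// @FrenchZipCode composes @NotEmpty and asks for single-violation reporting.
// The question: does a @NotNull failure two levels down surface as a
// @FrenchZipCode violation, a @NotEmpty violation, or a @NotNull violation?
@NotEmpty
@ReportAsViolationFromCompositeConstraint
@interface FrenchZipCode {
    String message() default "wrong zip code";
    Class<?>[] groups() default {};
}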

Multi-Valued Constraints
------------------------
Page 7, example 2.4. Wouldn't this be as effective with a single multi-valued constraint holder? Instead of:
@Patterns( { @Pattern(regex="",message=""), @Pattern(regex="",pattern="") } )

This could be:
@Constraints( { @Pattern(regex="",message=""), @Pattern(regex="",pattern="") } )

The former will give type safety, but I'm not sure that's all that valuable when compared to having to implement each multi-valued constraint container class.
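
For reference, the typed container behind example 2.4 is roughly this much per-constraint boilerplate (a sketch; target/retention and imports elided), which is the cost I'm weighing against the type safety:

// One container annotation per multi-valued constraint: the constraint author
// has to hand-write one of these for every constraint that should be allowed
// more than once on the same element.
@interface Patterns {
    Pattern[] value();
}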

@OverrideParameter
------------------
Page 10, Example 2.9.

I'm assuming it's not possible to override the parameter of a parameter? If composite constraint A has composing constraints B and C and constraint C is a composite with D and E, it's not possible for A to override a parameter of E directly?

I assume it is, however, possible to override a parameter that itself overrides a parameter; so if C overrides a parameter of D with a parameter of its own and A overrides that parameter of C, the end result will be that D uses the value specified on A?

Constraint Interface
--------------------
Page 11, 2.4.

There will be a cyclical dependency between the annotation and the constraint implementation in basically all cases, as I understand it? I don't see that as a problem, but it's vaguely awkward. It's possible to say:

@ConstraintValidator(FrenchZipCodeValidator.class)
public @interface FrenchZipCode { ... }

public class FrenchZipCodeValidator implements Constraint<AmericanZipCode> {
...
}

The lifecycle of the constraint class is undefined; does that mean that the need to be thread-safe is also undefined? It seems like it can't be up to the implementor of the constraint to make that decision. Presumably, the constraint should assume that initialize() won't be called with new values before isValid() is done? Would it be easier to pass the constraint annotation into the isValid() method, or does that constrain caching opportunities too much?
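
To illustrate the worry, a sketch (the Constraint stand-in below is pared down to initialize()/isValid(), with any extra isValid() parameters omitted):

import java.lang.annotation.Annotation;
import java.util.regex.Pattern;

// Pared-down stand-in for the draft's Constraint<A> contract; only the
// initialize()/isValid() lifecycle matters here.
interface Constraint<A extends Annotation> {
    void initialize(A constraintAnnotation);
    boolean isValid(Object object);
}

@interface FrenchZipCode {}   // minimal placeholder for the constraint annotation

// initialize() writes instance state that isValid() later reads. If a single
// instance is shared across threads, or re-initialized while a validation is
// in flight, that field becomes a data race unless the lifecycle is pinned down.
class FrenchZipCodeValidator implements Constraint<FrenchZipCode> {
    private Pattern zipPattern;   // written by initialize(), read by isValid()

    public void initialize(FrenchZipCode constraintAnnotation) {
        zipPattern = Pattern.compile("[0-9]{5}");
    }

    public boolean isValid(Object object) {
        if (object == null) {
            return true;   // leave null-ness to @NotNull
        }
        return zipPattern.matcher(object.toString()).matches();
    }
}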

I don't believe the spec's talked about ConstraintFactory yet at this point, so it might be good to put in some kind of forward reference for someone who hasn't yet absorbed it.

The second paragraph on Page 12 talks about the 'value' parameter -- I assume that's referring to the first parameter, named in the method signature as "Object object"?

That same paragraph also mentions throwing an IllegalArgumentException -- what happens with that exception? Is it thrown all the way up to the calling code, or transformed in some way by the Validation API?

'Overridden' Properties
-----------------------
There's a section on Page 13 about overridden properties that doesn't make a lot of sense to me. I'm not sure if it refers to the @OverrideParameter annotation, or something else (since overriding methods, for instance, has meaning in Java). I can't even parse all of the bullet points, let alone understand the central meaning of that section. Perhaps I'm just dense, but other readers of the spec will be too, so further clarification is probably required.

ConstraintFactory
-----------------
Page 15, Section 2.5. This feels too low-level to be in this section, to me; this feels like it belongs later in the document, although I haven't really formed an opinion about where it belongs.

Graph Validation
----------------
Why not work with Iterable as well? Most of the common iterables are covered by this list, but if I've written my own Iterable, it seems like I'd want to be able to apply validation. Practically speaking, I doubt this is much of a stumbling block, but it's also probably easy to implement.

What's the warning here about deterministic behavior? I'm not sure that I'm following. What's the alternate rule you're suggesting? What are the problems with determinism that you foresee?

Field and Property Validation
-----------------------------
Pages 16, 17; Section 3.1.2. It seems like describing 'field access strategy' right after you first reference it, and doing the same for 'property access strategy', would be simpler than referencing both and then describing both later.

Page 18, Section 3.2 overlaps with Section 3.1.2 somewhat; feels duplicated.

Validation in a Hierarchy
-------------------------
Page 17, Section 3.1.3: "ensures property polymorphism behavior" is awkward. "polymorphic behavior" is better; I might also consider something like, "ensures that validation works well with type hierarchies".

Page 18, Section 3.3 talks about the effect being cumulative. Might be good to explicitly call out the fact that there's no way to weaken the validation of a superclass (assuming that conclusion is correct, anyway).

Page 18 & 19, Example 3.2 shows a class with no superclass or interface, then adds a note to suggest that constraints held on superclasses and interfaces are considered. It's true, but it seems like the wrong time to point it out; maybe use an example with a superclass and/or interface?

Default Group
-------------
Should Default be DefaultGroup (note Javadoc for the Default interface)? Maybe. I come and go on that sort of thing -- packages provide a kind of scoping, but, then, many people don't see the packages that often in modern IDEs, so maybe it's helpful.

Group Sequences
---------------
Can sequences themselves be sequenced? That is, can you declare something like:

@GroupSequence( sequence = { Default.class, Persistence.class } )
public interface OnSave {}

@GroupSequence( sequence = { Persistence.class, DerivedFields.class, Archival.class } )
public interface OnArchive {}

Seems potentially useful, but it creates a whole host of other problems, notably cycles, and ordering concerns when items are duplicated, that would require strict rules.

And what happens if an interface with a @GroupSequence annotation has superinterfaces that are themselves listed as groups for a class?

Formal Definitions
------------------
Page 24, "every constraint declared by the class x which does not explicitly declare a group or declare the group Default explicitly" sounds like it's excluding constraints which declare the group Default. It that what is intended?

I feel like the formal definitions don't do a strong job of separating the Default.class group from the possibility that a class redefines the default sequence. It says that the group Default contains every constraint belonging to every group declared by @GroupSequence. Since @GroupSequence must declare Default, that implies a kind of recursive definition.

What happens if a class implements an interface that is annotated with @GroupSequence; no constraints on that interface will be picked up? These kinds of concerns are where using interfaces as constraint groups gets a little messy, and starts creating as many problems as it solves.

Validation Routine
------------------
Traversable starts to show up here without really having been mentioned before. Are these bullet points run in any order, or is the order irrelevant? The last sentence on this page talks about "previously defined order", and I'm not sure what it's referring to, unless it's the bulleted list. What do you mean by "routine loop call?"

Object Graph Validation
-----------------------
Page 26: Although it's not explicitly called out, I'm assuming that object graph validation should be breadth-first to avoid ending up with property paths longer than they should be?

For instance, if I were validating an order, which has a user and a shipping address, and the shipping address has a user, and I wanted to validate all those objects together, I'd want any errors on the user to show up as "user" rather than "shippingAddress.user". With a larger graph, you can imagine this could easily reach ludicrous examples.
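
A sketch of the shape I mean (hypothetical classes, constraints elided; I'm assuming the draft's @Valid for cascading and eliding its import):

// User is reachable both directly from Order and via Address. A depth-first
// walk that happens to visit shippingAddress first would report User's
// violations under "shippingAddress.user"; breadth-first reports them as "user".
class User {
    // constrained fields elided
}

class Address {
    @Valid User user;
}

class Order {
    @Valid User user;
    @Valid Address shippingAddress;
}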

What's the intersection between @Valid graph validation and @GroupSequence? Can @Valid have groups? If so, with these classes:

@GroupSequence( sequence = { Default.class, Extra.class } )
public class Order {

@NotNull
String name;

@Valid(groups=Graph.class)
OrderLine[] lines;

}

@GroupSequence( sequence = { Default.class, Extra.class } )
public class OrderLine {

@NotNull(groups=Extra.class)
String name;

@Min(value=0,exclusive=true,groups=Default.class)
float price;

}

What will be validated, and in what order, if validating Order with the default sequence?

JPA Traversal
-------------
I don't know that traversing lazily-loaded relationships is necessarily always undesired behavior as implied on Page 26. I can certainly imagine that it might be much of the time, but I can also imagine that there might be scenarios when you'd prefer strong validation even if it means loading some more data from the persistent datastore.

@Length and @Size
-----------------
Although the standard annotations seem to use @Size for string length, there are several examples that use @Length. While I realize that it could be a custom annotation, it seems like using a similar name for a similar concept that is not a standard annotation is probably unnecessarily confusing.

Validator API
-------------
When you invoke validate with multiple groups, is there an ordering implied, or is it simply that all of the groups must be validated, in no particular order, possibly in parallel?

validateValue needs a @param beanType in the Javadoc.

Right after the code sample at the top of Page 37 in 4.1, there's a line about "getConstraintsForClass" being described elsewhere. This feels out of place, as all of those methods are described elsewhere. Perhaps move this to the end of 4.1.1?

In section 4.1.1, it talks about validating properties; I'm assuming this would be true for both field-level and getter-level constraints and values?

If you made a call to validate() with two sequences that had some overlapping groups, how would these be sequenced, or is that entirely up to the implementation? For instance:

public class Order {

@Min(value=0,exclusive=true,groups=Base.class)
BigDecimal orderTotal;

@MaxFloat(value=0.5,groups=Business.class)

@Min(value=1,groups=Associations.class)
OrderLine[] lines;

}

@GroupSequence( sequence = { Base.class, Business.class } )
public interface BaseAndBusiness {}

@GroupSequence( sequence = { Business.class, Associations.class} )
public interface BusinessAndAssociations {}

This is a somewhat contrived example, but if I then call:
validator.validate(order, BusinessAndAssociations.class, BaseAndBusiness.class)

Should I expect that Base will definitely be validated before Business, in that case, because that's my desired sequence, or will the fact that Business also shows up in an unrelated sequence potentially throw it off?

Trying to retain sequences seems appropriate, but it also potentially creates some really ugly ordering problems with complex sequences, including "deadlocks" where two sequences order the same groups in opposite order, and a user tries to validate both sequences.

Constraint Violation
--------------------
4.2, Page 41, source code: Seems like it might be useful to be able to get the property name without deriving it from the property path, although I can't think of an immediate use case where it would be critical, and it can certainly be derived. Similarly, getGroups(), which is marked as potentially removable: I can't think of an immediate use case for it, although I can imagine it might be useful.

When talking about getPropertyPath() and association traversal on Page 42, there's no mention of how Collection and Set traversal paths would be calculated. Are the rules complete, such that Order with lines property of Set<OrderLine> might end up with a constraint violation path of, say, "lines.name" that doesn't attempt to identify which line is at fault, or is this section not complete?

On Page 42, where it says:
If the propertyPath is empty, "" is returned.

I'm assuming this would happen on a class-level constraint on the root object being validated? Might be worth explaining the likely scenario rather than simply talking about the empty property path?

Page 43, by the time you get around to talking about "groups" and "getConstraintDescriptor", I've forgotten that we were walking through the list of methods on the interface, because there's been almost two pages about property paths. Probably simpler to either move the property path portion to the end of the section or find some way of breaking out the details about property path calculation into its own subsection.

Message Resolution
------------------
Section 4.3.1.1, Page 46: while this offers a fair amount of flexibility, there are still a few things I'd personally want to see. There's currently no way to pull in the property name. Message expansion doesn't allow the message to vary at all based on pluralization. There doesn't seem to be a way to pull in Locale formatting or other formatters, or to dig into property values, or derived values that might need to come from a method call. There are definitely diminishing returns here, and some of these could easily be provided using a custom message resolution class, but the more of this you push to custom classes, the harder you make it to develop a set of custom constraints with good portable default messages.

It's also not clear if there's a standard format for the message -- is this usually a sentence fragment that might follow a property name, e.g. "cannot be null", or an entire sentence, e.g. "The supplied credit card number is not valid; please re-enter or verify the number."

It seems like the placement of the statement that message resolvers need to be thread-safe is fairly random within section 4.3.2; I'd probably push it up higher, or lower.

Although I now understand (after reading later sections) the way that message resolvers are overridden using defineValidatorState(), claiming that it's done with a call that ends with getValidator() was confusing at the time. More clarity would be possible with a little more explanation.

Bootstrapping
-------------
I wouldn't mind understanding better how bootstrapping is intended to achieve 'type safety' (4.4, page 49).

Would it make more sense to pass things like message resolvers into a particular validation invocation rather than overriding it for a particular validator? Or are you expecting that one might often hang on to a validator with a non-default message resolver and use it over and over?

Validator Factory / Builder
---------------------------
I'm curious how one imagines the getMessageResolver() method on ValidatorFactory being used, given that there's already a getDefaultMessageResolver() method on ValidatorFactoryBuilder; it seems unnecessary.

What happens if ValidatorFactoryBuilder.build() is called without calling configure() or passing in TraversableResolver and MessageResolver?

Do we really need to describe typical implementations of ValidatorFactoryBuilder in the spec (p.56)? This seems like an implementation suggestion to potential implementors rather than part of the specification.

ValidationProvider
------------------

Am I reading correctly that an implementor writing a ValidationProvider implementation would need to support returning a different provider's ValidatorFactoryBuilder via createGenericValidatorFactoryBuilder()? It seems like it based on page 65, which says "The bootstrap implementation must ensure it can bootstrap third party providers." Is there going to be some sort of TCK to verify this?

BootstrapState currently only contains a provider resolver; wouldn't it be simpler just to pass in the provider resolver? Is this assuming that there will be other elements to BootstrapState in the future?

Descriptors
-----------
It's not clear from the Javadoc that BeanDescriptor.getConstraintDescriptors() would not return the property descriptors. That becomes evident somewhere else.

Why no hasConstraints() on BeanDescriptor?

Why no way to traverse from the PropertyDescriptor to the cascaded type? Seems like it might be convenient to be able to call something like getCascadedBeanDescriptor() if isCascaded() returns true.

Might be nice if ConstraintDescriptor supported isComposingConstraint().

Example on Page 71, Section 5.6 - The Author superclass is irrelevant and doesn't really need to appear in the example. Although if you took advantage of the Author class to show the traversal, it would show why it'd be nice to have something like getCascadedBeanDescriptor().

Built-In Constraints
--------------------
Seems odd to use the 'assert' style for true/false but not for anything else. Would it be better just to have @BooleanTrue and @BooleanFalse with messages that aren't so assertion-oriented? Seems more viable for validation to me.

@Min and @Max must be long integers? What about floats? BigDecimals? Will we need separate validation classes for these, and if so, why don't they appear here? How would String work with Min/Max? What methods would you call on BigDecimal, BigInteger, Number and String for the comparison? Do we need inclusive/exclusive handling?

I assume @Digits means maximum number of digits? Why not @MaxDigits?

Do @Past and @Future need special handling for java.sql.Date? Should they have inclusive/exclusive handling? Wouldn't these be better as specializations of @After and @Before?

Suggested options: @NoWhitespace, @Percentage (0-100 or 0-1), @NonZero, @Positive, @Negative. A @RegExp option would be pretty generic, but useful to cover gaps.
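
As a sense of how cheaply some of these could be provided, the 0-100 flavour of @Percentage could be a plain composition of the existing built-ins (a sketch; the annotation is made up, and target/retention plus imports are elided):

// A 0-100 percentage expressed purely by composing the built-in bounds,
// rather than writing a new validator class.
@Min(0)
@Max(100)
@interface Percentage {
    String message() default "must be between 0 and 100";
    Class<?>[] groups() default {};
}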

XML Descriptor
--------------
I'd be interested to see what this would look like and how it interacts with the annotations.

Method-Level Validation
-----------------------
I can see some merit in this, but I don't think it should be tied to this spec.

JPA Integration
---------------
If a given constraint (e.g. @Unique) isn't useful for BV, don't add it; JPA can handle it.

It'd be interesting to understand the suggested application of @Min/@Max, @Future/@Past, @Size in terms of db schema generation.

If there is no @NotNull bean validation constraint, should a not-null constraint in the database be reported through Bean Validation? Seems like dangerous ground to be on. It might not be a validation failure, it might be a problem with the database, and potentially reporting it to the user through generic validation-handling code seems awkward.

"Lazy properties or associations must not be traversed" -- even if already loaded?

"Single or multi-valued associations pointing to entiries must not be traversed"? Why's that?

Syntax - Wording, Typos, Minor Points
=====================================
Little things in the document itself rather than with the content. Free proofreading.

Page iv, Disclaimer and p.3, 1.4: "feedbacks" should be "feedback"

Page 16, second paragraph: "Constraints are declared for classes" feels awkward to me. They're declared on classes, variables and methods, they're part of the class definition ... I'm not sure that I've got a better revised phrasing in mind, but it does feel a little awkward.

Page 16, Section 3.1: It wouldn't hurt for the mention of the JavaBeans specification to supply an actual reference, for those who aren't familiar with it in detail, or to summarize some of the key points. That said, I hope most of the people who are likely to read this spec will know what you're talking about.

Page 20, section 3.4.1: "validating the group BuyInOneClick will lead to the following constraints checking because Default and Billable are subinterfaces of BuyInOneClick". Don't you mean superinterfaces here? If they're subinterfaces, then I would expect the constraints you describe not to be checked.

Page 23: I like the idea of implicit groups based on super-interface/class; it seems both useful and relatively elegant.

Page 26, TraversableResolver interface-level Javadoc: "must me thread-safe" is presumably "must be thread-safe".

Page 30, "getCity is passed to the Length" should presumably be "getCity is passed to the @Length"

Page 36, validateValue says "validate all constraints on propertyName property if the property value is value". I'm assuming you meant something more like "as if the property value were value", or "Validate all constraints on propertyName property against a value of value". As written it almost implies a conditional validation.

The examples at the end of 4.1.1 on Page 38 won't return ConstraintViolation objects, because they are assertions. That's mostly pedantic; most people will understand exactly what you mean, so it's not a big deal.

Page 39, Section 4.1.2 says "Groups allow you to restrict the set of constraints", but, compared to the default constraints, groups also allow you to expand or substitute the constraints, not simply restrict them. The default set isn't necessarily the super-set.

Example 4.1.2.1 on Page 39 has constraints on the city property which don't show up in the subsequent explanation.

4.3.3 says the validation properties are shown in Table 4.2, but an anonymous code block with the properties follows, and Table 4.2 shows the message interpolation.

4.4.3 talks about message resolver and traversable resolver 'strategy'. Mostly these classes are mentioned without referring to them as strategies, and although the usage seems fine, it's confusing to use one term here but not use that term elsewhere.

The ValidatorFactoryBuilder code on page 55 says, on the Javadoc for #configure(InputStream): "If not specified, META-INF/validation.xml is used". What does not-specified mean here? Would you pass in a null if you wanted to use validation.xml?

ValidationFactoryBuilder.getDefaultMessageResolver, Page 55: "MessageREsolver" should presumably be "MessageResolver".

"XMl" on page 57 should probably be "XML". The URL on page 58 has a line break and hyphen in it; should probably be made non-breaking for clarity.

Validation javadoc, page 63: "exsist" should be "exist" (twice). "&lt?&gt;" should be "&lt;?&gt;"

Page 68, 5.3 - "evaludated" should be "evaluated". Again on Page 77.

_________________
--
Geoffrey Wiseman


 Post subject: Re: Public Draft Feedback
PostPosted: Fri Feb 13, 2009 8:57 am 
Hibernate Team

Joined: Sun Sep 14, 2003 3:54 am
Posts: 7256
Location: Paris, France
Hi, thanks for your feedback.

diathesis wrote:
Bootstrapping
-------------
My biggest issue with the specification is in the bootstrapping. First and foremost, it doesn't match how I'd likely want to use the API, and it's a lot of extra weight to the spec. The bootstrapping mechanism could vary significantly from provider to provider, and it seems like having standard bootstrapping is less important.

What bootstrapping difference do you foresee? Would JPA have been that useful if the bootstrap API was not standardized?

Quote:
Further, this standardized bootstrapping approach has been used in other APIs in the past, notably XML parsers, and I think it's caused as many problems as it's solved. It's also confusing. Will new users really understand the differences between ValidatorFactory, ValidatorBuilder, ValidatorFactoryBuilder, ValidationProvider, ValidationProviderResolver, ValidationFactoryConfiguration, etc.?

the bootstrapping provides several functions:
- support for multiple vendors
- a programmatic way to configure validators (on top of the XML form)

Also, I did a renaming in the latest version which I believe clarifies the meaning of the bootstrapping API a lot. Check out the spec here http://in.relation.to/10679.lace

ValidatorFactory => ValidatorFactory
ValidatorBuilder => ValidatorContext
ValidatorFactoryBuilder => Configuration

ValidationProvider, ValidationProviderResolver, ValidationFactoryConfiguration (ConfigurationState now) are usually not visible to the user.

Quote:
If I were writing a Validator implementation, I'd probably be assuming that my validators, or at least my validator factories, could be injected (by Spring, by the container, by the new Java Contexts and Dependency Injection). The method of assembling the validator factory and/or the validator then becomes largely irrelevant, and being forced to do it a particular way seems a little silly.


Right, but Spring, EE, Java Contexts and Dependency Injection, Tapestry, JPA, JSF2, Wicket etc etc do need to bootstrap Validator :)
BTW EE will consider VF and V as resources and treat them as injectable

Quote:
If I were to fold validation into Spring, for instance, I'd want a simple set of beans that could be assembled to spit out a validator, a Validator Factory Bean that could be optionally configured with a message resolver and traversal strategy, and/or a particular context of beans to validate, and that'd be enough.


So the API is needed by Spring in this case.

Quote:
I could live with an optional bootstrapping that could be used, but I don't think it's necessary as part of the core specification, and as a potential consumer of the API, this isn't how I'd choose to bootstrap validation if I had a choice.

It would also reduce the weight of the specification somewhat, as it might not be necessary to specify interfaces for a number of things that are currently defined.

This specification feels as if it's destined to appear in literature for things like Spring and Grails about how using technology <X> makes it easier to use Java Bean Validation without all the providers and resolvers. :/


Once again, either someone boots it for you, or, if you are the one booting validation, you need an API.

Quote:
The crowning moment is on Page 64, "GenericBuilderFactory builds a generic ValidatorFactoryBuilder using the first ValidationProvider returned by ValidationProviderResolution and calling ValidatorFactoryBuilder<?> createGenericValidatorFactoryBuilder(BootstrapState state). BootstrapState holds the ValidationProviderResolution instance passed to GenericBuilderFactory and will be used by the ValidatorFactoryBuilder instance when resolving the provider to use." This reminded me of: Why I Hate Frameworks (http://discuss.joelonsoftware.com/defau ... .219431.12)

Yes the bootstrap internals are hard to understand for implementors. But I don't care if a BV provider has to work hard.
On the other hand, the boot API is trivial to use and typesafe.
Code:
//a rather complex example
ValidatorFactory factory = Validation
       .byDefaultProvider().configure()
          .messageInterpolator( new ContainerMessageInterpolator() )
          .constraintValidatorFactory( new ContainerComponentConstraintValidatorFactory() )
          .traversableResolver( new JPAAwareTraversableResolver() )
          .buildValidatorFactory();



Quote:
This spec doesn't seem to address the idea that more than one validation implementation could be used in a particular project. In particular, if you consume a library which uses one validation implementation and your own code uses a different one, will the bootstrapping support this kind of use?


Yes, you will create two different ValidatorFactory (more precisely, the library builds its own VF and your app builds its own).

Quote:
Type Safety
-----------
Constraints are likely to be applicable to some subset of the overall type system in Java. For instance, a generalized @Max constraint is unlikely to know how to apply itself to a project-specific Customer class.

It doesn't seem as if the spec addresses this in any great detail. Am I forgetting something? What's the suggested approach, to simply assume that a constraint is valid if you don't know how to apply it? Assume invalidity? Throw exceptions? It would be nice if the constraints themselves could trigger compile-time failures, but I don't see any easy way to make that happen, at least not without getting involved in the compile process, which seems like an implementation-specific decision.


Check the latest version of the spec http://in.relation.to/10679.lace specifically section 2.4. We have introduced the notion of type-safe constraint validator and multiple validators per constraints.

Quote:
Compatibility
-------------
Some of the parsing that may be necessary when you're dealing with multi-valued constraints, composite constraints, overridden parameters, and overlapping groups could be pretty heavyweight and complicated, and there are a lot of implementation approaches, from build-time code generation to runtime bytecode generation to reflection (with caching). Is there some kind of TCK to ensure that implementors have managed to put all those things together in the right way (and, perhaps more importantly for portability, in the same way)?


Yes, as with every other JSR, this JSR comes with a Reference Implementation and a TCK. The TCK is at an early stage at the moment though.

Quote:
Cycles
------
Nothing prevents constraints from being developed in cycles, including overriding parameters in cycles; presumably this is something that implementors should watch for?

Good point. I don't think params are the problem, but the same constraint appearing multiple times in the composition hierarchy is a problem.

Quote:
Conversion/Canonicalization
---------------------------
Some validation systems involve canonicalization and conversion. For instance, converting an address to a certain form during the validation. From what I read in this spec, this JSR isn't meant to cover that sort of thing, and in fact, is almost forbidden in some areas (e.g. Constraint Validation implementation shouldn't change values). I'm ok with that, but I wouldn't mind seeing that expressly called out somewhere.

yes the expert group was against mixing conversion and validation semantics.

Quote:
Ordering
--------
Although it's difficult when working with large specs like this to only reference things that you've already talked about or to always make forward references when you need to, I'd say there are some elements where I ran into a term or a class name and had to mentally file it away as something I didn't yet understand, and which were explained later. For instance, on Page 13, there's a reference to 'unexpanded' messages, and while my guess about what that meant was correct, it was distracting to see a reference to something that the spec hadn't already talked about -- message expansion/resolution.

I know, but it's really hard to do without it for a spec (this is much easier in a book).

Quote:
That example in particular is awkward because message expansion is also described as message interpolation and message resolution, so even when you catch up to the details, you're not sure if what you're reading now is the same as what was referred to before.

This was a leftover; fixed now.

Quote:
Innovation by Specification
---------------------------
To be honest, I'd be happy just to see a reference implementation of this release and used in this form for a while before we lock it down as a standard -- it doesn't feel like something that's had enough public use yet to really know that we've found the right flex points. The spec authors may well disagree on this point, of course. ;)


Yep, we disagree. Hibernate Validator, OVal and so on have had a lot of exposure already.

Ok, I will take a break here for now :)

_________________
Emmanuel


 Post subject: Re: Public Draft Feedback
PostPosted: Fri Feb 13, 2009 12:27 pm 
Newbie

Joined: Thu Apr 15, 2004 5:53 pm
Posts: 13
emmanuel wrote:
Hi, thanks for your feedback.


Thanks for responding quickly. I'm wondering if I should start breaking these out into their own threads if they're likely to form discussions.

emmanuel wrote:
What bootstrapping difference do you foresee? Would JPA have been that useful if the bootstrap API was not standardized?


I don't disagree that each implementation would need to bootstrap in some way; it just feels like the API styles for bootstrapping could vary so widely that having a standard one is going to be a little difficult.

And in particular, the style of bootstrapping used in the spec is not the style that I'd want to use personally, and one that's been problematic in the past. If I were writing a validation implementation, then I'd be a little irritated by having to implement a bootstrap approach that I'd rather not use.

Quote:
Also, I did a renaming in the latest version which I believe clarifies the meaning of the bootstrapping API a lot. Check out the spec here http://in.relation.to/10679.lace


Yes, I haven't had a chance to see the updates yet; I'll try and take a look soon.

Quote:
Right, but Spring, EE, Java Contexts and Dependency Injection, Tapestry, JPA, JSF2, Wicket etc etc do need to bootstrap Validator :)
BTW EE will consider VF and V as resources and treat them as injectable


It's true -- there needs to be some way to get the configuration in place. That said, if the standard approach adds specific implementation constraints and is not something that's ever seen by the end-user, then I'm not sure it's all that useful. Looking at XML parsers in Java, for instance, I think I'd rather have a provider-specific bootstrap than deal with all the complications that come from having different applications make assumptions about parsers that don't hold true in specific environments, or needing to have a specific parser but not having an easy way of getting it.

Maybe an example will help. I'll do a Spring example simply because that's what I'd be most likely to use. I'd probably end up with something like this:

Code:
<bean id="myService" class="com.mycompany.myproject.services.MyService">
  <constructor-arg index="0" ref="validator" />
</bean>

<!-- Alternately, perhaps this would be a context factory that creates a validator -->
<bean id="validator" class="com.mycompany.validator.Validator">
  <property name="messageResolver">
    <bean class="com.mycompany.validator.messages.DatabaseMessageResolver">
      <property name="url" "${validation.database.url}" />
      <property name="driver" "${validation.database.driver}" />
      <property name="username" "${validation.database.username}" />
      <property name="password" "${validation.database.password}" />
    </bean> 
  </property>
  <!-- default traversal is fine -->
</bean>



Given that kind of an example, the implementation approach to accomplish that bootstrapping could be much easier than, or at least very different from, what is described in the spec. As a result, implementing the spec's approach, if I believe that the userbase will mostly be using my Spring implementation, is somewhat off-putting. I don't, in this scenario, really want to bootstrap any provider; I just want a reasonable way of getting a validator injected into my application at the right spot. I'm not sure the Bootstrap API really helps me make that happen.

Does that help to clarify what I'm saying?


Quote:
If I were to fold validation into Spring, for instance, I'd want a simple set of beans that could be assembled to spit out a validator, a Validator Factory Bean that could be optionally configured with a message resolver and traversal strategy, and/or a particular context of beans to validate, and that'd be enough.


So the API is needed by Spring in this case.

Quote:
I could live with an optional bootstrapping that could be used, but I don't think it's necessary as part of the core specification, and as a potential consumer of the API, this isn't how I'd choose to bootstrap validation if I had a choice.

It would also reduce the weight of the specification somewhat, as it might not be necessary to specify interfaces for a number of things that are currently defined.

This specification feels as if it's destined to appear in literature for things like Spring and Grails about how using technology <X> makes it easier to use Java Bean Validation without all the providers and resolvers. :/


Once again, either someone boots it for you, or, if you are the one booting validation, you need an API.

Quote:
Yes the bootstrap internals are hard to understand for implementors. But I don't care if a BV provider has to work hard.


I agree that much of that complexity will be hidden from the end user. Then again, if the end user isn't likely to use the bootstrap API directly, then I'm not sure that it needs to be specified and made common. If the user is likely to use the bootstrap API but not likely to need to understand more than how to use it, I buy your point ... but I question the assumption that the user will be using this bootstrap API directly. I probably wouldn't.

Quote:
On the other hand, the boot API is trivial to use and typesafe.
Code:
//a rather complex example
ValidatorFactory factory = Validation
       .byDefaultProvider().configure()
          .messageInterpolator( new ContainerMessageInterpolator() )
          .constraintValidatorFactory( new ContainerComponentConstraintValidatorFactory() )
          .traversableResolver( new JPAAwareTraversableResolver() )
          .buildValidatorFactory();



It's not horrible, particularly since the assumption is you're doing that somewhere and caching it. However, the means by which people construct long-lived objects and cache them varies so much. Anyway -- I'm not sure we'll agree on this point, but I hope I've at least been able to make my point clear enough that you understand what I'm saying.

Quote:
Yes, you will create two different ValidatorFactory (more precisely, the library builds its own VF and your app builds its own).


Unless each of you specifies the provider, isn't there a risk that each will have depended on their own provider, and one will get a provider it didn't expect? Or are you inclined to think the providers will be so interoperable that it won't matter? (I'm not convinced of the latter, but I'd be willing to be proven wrong.)

Quote:
Check the latest version of the spec http://in.relation.to/10679.lace specifically section 2.4. We have introduced the notion of type-safe constraint validator and multiple validators per constraints.

I'll take a look.

Quote:
yes the expert group was against mixing conversion and validation semantics.

I've found them convenient when they're grouped, but I suspect that even if grouping were the right long-term approach, it's better to start simpler.

Quote:
I know, but it's really hard to do without it for a spec (this is much easier in a book).


It's true. Forward-references are probably the simplest approach, although even that's somewhat painful.

Quote:
Yep, we disagree. Hibernate Validator, OVal and so on have had a lot of exposure already.


Although I've used Hibernate Validator and apparently at one point was aware of OVal, I've seen a wide array of projects that don't integrate any of these. I'd prefer to see adoption before specification, but it's possible that standard will help adoption. I guess I'm wary of specs that don't seem to be demanded by the user-base, and I don't get the sense of a pent-up demand for a spec.

Maybe if we took a step back -- what's the desired goal of the spec, to increase interoperability? Increase adoption?

Quote:
Ok, I will take a break here for now :)


Understandable.

_________________
--
Geoffrey Wiseman


 Post subject: Re: Public Draft Feedback
PostPosted: Mon Feb 16, 2009 11:27 am 
Hibernate Team

Joined: Sun Sep 14, 2003 3:54 am
Posts: 7256
Location: Paris, France
Quote:
Orthogonal @NotNull
-------------------
Example 2.5 on page 8 implies using @NotNull as a composing constraint. Although the specification suggests this isn't a great approach in several other places, I'd personally prefer this example to change for that reason, because it's not generally a good idea to compose your constraint with @NotNull, and this example vaguely implies that. It's not hard to suggest alternate examples.

Frankly, the paragraph on Page 12 that talks about this could be moved forward; it feels out of place on Page 12 and missing on Page 8.

For that matter, on Page 14, talking about example 2.12, it would probably be nice to reference this point briefly in case anyone misses why the example returns true on a null parameter.


Added to my todo list

Quote:
@ReportAsViolationFromCompositeConstraint
-----------------------------------------
Page 8, Example 2.6 and the preceding paragraph.

Awkward name for the annotation; I feel like there's gotta be a better name, although I haven't yet come up with something I'm happy with. It doesn't help that I don't find the Composing/Composite a very natural distinction, so something like @ReportComposingConstraintViolationsAsCompositeConstraintViolation seems pretty awkward still, as well as being significantly longer.


We've renamed it @ReportAsSingleViolation

Quote:
Does this apply to nested composite constraints? That is, if @FrenchZipCode is itself annotated with both @ReportAsViolationFromCompositeConstraint and @NotEmpty, and @NotEmpty is annotated with @NotNull, does that mean that a violation of @NotNull would be reported as a violation of @NotEmpty? And what if @NotEmpty has @ReportAsViolationFromCompositeConstraint but @FrenchZipCode does not? Does the constraint violation get reported as a @NotEmpty?

1. in this case, a @NotNull violation is reported as a @FrenchZipCode violation

2. in this case the error is reported as @NotEmpty

Quote:
What if you want to report some of the composing constraints as violation of the composite, but not all of them? In the example listed, what if you wanted @Size to be reported as a violation of @FrenchZipCode, but @NotNull to continue to be reported as @NotNull?


Hum, do you have an elegant syntax in your hat? :)

Quote:
What happens if you're using this annotation and multiple composing constraints fail? Will each be reported as an individual violation of the composite?

only one violation is reported for all.


Quote:
This could be:
@Constraints( { @Pattern(regex="",message=""), @Pattern(regex="",pattern="") } )


Try and compile that, it does not work in Java :(


Quote:
I'm assuming it's not possible to override the parameter of a parameter? If composite constraint A has composing constraints B and C and constraint C is a composite with D and E, it's not possible for A to override a parameter of E directly?

Only if C exposes the E parameter. Just like regular delegation in Java.

Quote:
I assume it is, however, possible to override a parameter that itself overrides a parameter; so if C overrides a parameter of D with a parameter of its own and A overrides that parameter of C, the end result will be that D uses the value specified on A?


yes

Quote:
There will be a cyclical dependency between the annotation and the constraint implementation in basically all cases, as I understand it? I don't see that as a problem, but it's vaguely awkward. It's possible to say:

@ConstraintValidator(FrenchZipCodeValidator.class)
public @interface FrenchZipCode { ... }

public class FrenchZipCodeValidator implements Constraint<AmericanZipCode> {
...
}

Yeah, I am not too fussed about cyclic deps. For the problem #2, this can be trapped by an annotation processor or an IDE.

Quote:
The lifecycle of the constraint class is undefined; does that mean that the need to be thread-safe is also undefined? It seems like it can't be up to the implementor of the constraint to make that decision. Presumably, the constraint should assume that initialize() won't be called with new values before isValid() is done? Would it be easier to pass the constraint annotation into the isValid() method, or does that constrain caching opportunities too much?


Interesting point, I will try to clarify this part a bit.

Quote:
The second paragraph on Page 12 talks about the 'value' parameter -- I assume that's referring to the first parameter, named in the method signature as "Object object"?


fixed

Quote:
That same paragraph also mentions throwing an IllegalArgumentException -- what happens with that exception? Is it thrown all the way up to the calling code, or transformed in some way by the Validation API?


I am working on that.

Quote:
There's a section on Page 13 about overridden properties that doesn't make a lot of sense to me. I'm not sure if it refers to the @OverrideParameter annotation, or something else (since overriding methods, for instance, has meaning in Java). I can't even parse all of the bullet points, let alone understand the central meaning of that section. Perhaps I'm just dense, but other readers of the spec will be too, so further clarification is probably required.

Will check that out

Quote:
Graph Validation
----------------
Why not work with Iterable as well? Most of the common iterables are covered by this list, but if I've written my own Iterable, it seems like I'd want to be able to apply validation. Practically speaking, I doubt this is much of a stumbling block, but it's also probably easy to implement.

I noted the point

Quote:
What's the warning here about deterministic behavior? I'm not sure that I'm following. What's the alternate rule you're suggesting? What are the problems with determinism that you foresee?

In the current algo:
A --> B --> D
 \--> C --> D
we will validate A, B, C and D, with D validated only once, reached either from B or from C. The propertyPath on the constraint violation can therefore refer to either b or c.
The alternative proposal would validate D twice because it is found in different branches (and hence will be deterministic).

Quote:
Page 17, Section 3.1.3: "ensures property polymorphism behavior" is awkward. "polymorphic behavior" is better; I might also consider something like, "ensures that validation works well with type hierarchies".

Yes I changed that already

Quote:
Page 18, Section 3.3 talks about the effect being cumulative. Might be good to explicitly call out the fact that there's no way to weaken the validation of a superclass (assuming that conclusion is correct, anyway).

Well if I have time there will be a way :)

Quote:
Default Group
-------------
Should Default be DefaultGroup (note Javadoc for the Default interface)? Maybe. I come and go on that sort of thing -- packages provide a kind of scoping, but, then, many people don't see the packages that often in modern IDEs, so maybe it's helpful.

Dunno, I don't like DefaultGroup very much. Anyway it's not used very often explicitly.


Quote:
Can sequences themselves be sequenced? That is, can you declare something like:

@GroupSequence( sequence = { Default.class, Persistence.class } )
public interface OnSave {}

@GroupSequence( sequence = { Persistence.class, DerivedFields.class, Archival.class } )
public interface OnArchive {}

Seems potentially useful, but it creates a whole host of other problems, notably cycles, and ordering concerns when items are duplicated, that would require strict rules.

This is being discussed indeed, but I don't understand your example; neither OnSave nor OnArchive is referenced by the other sequence.

Quote:
And what happens if an interface with a @GroupSequence annotation has superinterfaces that are themselves listed as groups for a class?

Being discussed but I feel it should be allowed.

Quote:
Page 24, "every constraint declared by the class x which does not explicitly declare a group or declare the group Default explicitly" sounds like it's excluding constraints which declare the group Default. It that what is intended?

is now "every constraint declared by the class X which does not declare a group or does declare the group Default explicitly."

Quote:
I feel like the formal definitions don't do a strong job of separating the Default.class group from the possibility that a class redefines the default sequence. It says that the group Default contains every constraint belonging to every group declared by @GroupSequence. Since @GroupSequence must declare Default, that implies a kind of recursive definition.

A group sequence must declare the group X, not the group Default.


Quote:
What happens if a class implements an interface that is annotated with @GroupSequence; will no constraints on that interface be picked up? These kinds of concerns are where using interfaces as constraint groups gets a little messy and starts creating as many problems as it solves.

Being discussed http://lists.jboss.org/pipermail/hibern ... 03632.html


Quote:
Traversable starts to show up here without really having been mentioned before. Are these bullet points run in any order, or is the order irrelevant? The last sentence on this page talks about "previously defined order", and I'm not sure what it's referring to, unless it's the bulleted list. What do you mean by "routine loop call?"


I clarified that. In essence there is no ordering.


Quote:
Page 26: Although it's not explicitly called out, I'm assuming that object graph validation should be breadth-first to avoid ending up with property paths longer than they should be?

For instance, if I were validating an order, which has a user and a shipping address, and the shipping address has a user, and I wanted to validate all those objects together, I'd want any errors on the user to show up as "user" rather than "shippingAddress.user". With a larger graph, you can imagine the paths could easily get ludicrous.

Interesting, this was not implied. I need to add that.
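
To make the path concern concrete, a rough sketch of the graph described above (all names invented for illustration):

class Order {
    @Valid User user;
    @Valid Address shippingAddress;
}
class Address {
    @Valid User user;
}
class User {
    @NotNull String name;
}

If order.user and shippingAddress.user are the same instance, a depth-first walk that happens to reach it through the address first would report "shippingAddress.user.name", while a breadth-first walk would report the shorter "user.name".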

Quote:
What's the intersection between @Valid graph validation and @GroupSequence? Can @Valid have groups? If so, with these classes:

@GroupSequence( sequence = { Default.class, Extra.class } )
public class Order {

    @NotNull
    String name;

    @Valid(groups=Graph.class)
    OrderLine[] lines;
}

@GroupSequence( sequence = { Default.class, Extra.class } )
public class OrderLine {

    @NotNull(groups=Extra.class)
    String name;

    @Min(value=0,exclusive=true,groups=Default.class)
    float price;
}

What will be validated, and in what order, if validating Order with the default sequence?


No, @Valid is orthogonal to groups.
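
In other words (my reading of the cascading rules, which may still be refined): @Valid carries no group of its own; the groups requested in the validate() call are also the ones applied to the cascaded objects. Assuming the final API's validate(T, Class<?>...) signature and ConstraintViolation type:

// Extra.class is requested at the top level and is also the group used when
// validating the OrderLine objects reached through @Valid.
Set<ConstraintViolation<Order>> violations = validator.validate(order, Extra.class);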

Quote:
I don't know that traversing lazily-loaded relationships is necessarily always undesired behavior as implied on Page 26. I can certainly imagine that it might be much of the time, but I can also imagine that there might be scenarios when you'd prefer strong validation even if it means loading some more data from the persistent datastore.


Then provide your own TraversableResolver implementation :)
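
For instance, a minimal custom resolver that always allows traversal, so even lazy associations get loaded and validated (method signatures here follow the final 1.0 API and may differ in this draft):

import java.lang.annotation.ElementType;
import javax.validation.Path;
import javax.validation.TraversableResolver;

// Always traverse: lazy associations will be loaded by the provider when reached.
public class EagerTraversableResolver implements TraversableResolver {

    public boolean isReachable(Object traversableObject, Path.Node traversableProperty,
                               Class<?> rootBeanType, Path pathToTraversableObject,
                               ElementType elementType) {
        return true;
    }

    public boolean isCascadable(Object traversableObject, Path.Node traversableProperty,
                                Class<?> rootBeanType, Path pathToTraversableObject,
                                ElementType elementType) {
        return true;
    }
}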

Quote:
Although the standard annotations seem to use @Size for string length, there are several examples that use @Length. While I realize that it could be a custom annotation, it seems like using a similar name for a similar concept that is not a standard annotation is probably unnecessarily confusing.

In my todo list

Validator API
-------------
Quote:
When you invoke validate with multiple groups, is there an ordering implied, or is it simply that all of the groups must be validated, in no particular order, possibly in parallel?

No order; they can be done in parallel. Generally speaking, all groups can be validated in parallel unless a sequence is defined.

Quote:
In section 4.1.1, it talks about validating properties; I'm assuming this would be true for both field-level and getter-level constraints and values?


the spec defines properties as both fields and getters.
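
Concretely, I'd read that as both of these counting as constrained properties of the same bean (hypothetical class; annotation packages follow the final API and may differ in this draft):

import java.math.BigDecimal;
import javax.validation.constraints.Min;
import javax.validation.constraints.NotNull;

class Account {

    @NotNull                 // field-level constraint on the "owner" property
    String owner;

    BigDecimal balance;

    @Min(0)                  // getter-level constraint on the "balance" property
    public BigDecimal getBalance() {
        return balance;
    }
}

A call like validator.validateProperty(account, "balance") would then evaluate the @Min constraint (again assuming the final API's method name).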

Quote:
If you made a call to validate() with two sequences that had some overlapping groups, how would these be sequenced, or is that entirely up to the implementation? For instance:

public class Order {

    @Min(value=0,exclusive=true,groups=Base.class)
    BigDecimal orderTotal;

    @MaxFloat(value=0.5,groups=Business.class)
    @Min(value=1,groups=Associations.class)
    OrderLine[] lines;
}

@GroupSequence( sequence = { Base.class, Business.class } )
public interface BaseAndBusiness {}

@GroupSequence( sequence = { Business.class, Associations.class } )
public interface BusinessAndAssociations {}

This is a somewhat contrived example, but if I then call:
validator.validate(order, BusinessAndAssociations.class, BaseAndBusiness.class)


This is somewhat under-defined; we are working on polishing these rules (join hibernate-dev if you have opinions).


Quote:
4.2, Page 41, source code: Seems like it might be useful to be able to get the property name without deriving it from the property path, although I can't think of an immediate use case where it would be critical, and it can certainly be derived. Similarly, I can't think of an immediate use case for getGroups(), which is marked as potentially removable, although I can imagine it might be useful.

I noted the points

Quote:
When talking about getPropertyPath() and association traversal on Page 42, there's no mention of how Collection and Set traversal paths would be calculated. Are the rules complete, such that Order with lines property of Set<OrderLine> might end up with a constraint violation path of, say, "lines.name" that doesn't attempt to identify which line is at fault, or is this section not complete?


Good point, we should probably add [] for Set and Collection.

Quote:
On Page 42, where it says:
If the propertyPath is empty, "" is returned.

I'm assuming this would happen on a class-level constraint on the root object being validated? Might be worth explaining the likely scenario rather than simply talking about the empty property path?

Done

Quote:
Page 43, by the time you get around to talking about "groups" and "getConstraintDescriptor", I've forgotten that we were walking through the list of methods on the interface, because there's been almost two pages about property paths. Probably simpler to either move the property path portion to the end of the section or find some way of breaking out the details about property path calculation into its own subsection.

Done

Quote:
Section 4.3.1.1, Page 46: while this offers a fair amount of flexibility, there are still a few things I'd personally want to see. There's currently no way to pull in the property name. Message expansion doesn't allow the message to vary at all based on pluralization. There doesn't seem to be a way to pull in Locale formatting or other formatters, or to dig into property values, or derived values that might need to come from a method call. There are definitely diminishing returns here, and some of these could easily be provided using a custom message resolution class, but the more of this you push to custom classes, the harder you make it to develop a set of custom constraints with good portable default messages.

What you are describing could mean a lot of work and considering the small amount of time left to finish the spec...

Quote:
Although I now understand (after reading later sections) the way that message resolvers are overridden using defineValidatorState(), claiming that it's done with a call that ends with getValidator() was confusing at the time. More clarity would be possible with a little more explanation.

defineValidatorState() has been renamed usingContext(); that makes it clearer, I think.
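
e.g., something along these lines (method names per the later API, where MessageResolver becomes MessageInterpolator; MyMessageInterpolator is a hypothetical custom implementation):

// Obtain a Validator whose message interpolation is overridden for this context only.
Validator validator = validatorFactory.usingContext()
        .messageInterpolator(new MyMessageInterpolator())
        .getValidator();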

Quote:
Would it make more sense to pass things like message resolvers into a particular validation invocation rather than overriding it for a particular validator? Or are you expecting that one might often hang on to a validator with a non-default message resolver and use it over and over?

Well, I am concerned about method overloading explosion. And it makes sense for a validator to be used for a given context.


Quote:
I'm curious how one imagines the getMessageResolver() method on ValidatorFactory being used, given that there's already a getDefaultMessageResolver() method on ValidatorFactoryBuilder.


True, the method on ValidatorFactory to get the message resolver may seem unnecessary, since you can already get the default message resolver from the ValidatorFactoryBuilder. But I could see the need to get it from the factory in order to override and alter a custom message interpolator (say, JSF defining the locale for your custom message interpolator).

Quote:
What happens if ValidatorFactoryBuilder.build() is called without calling configure() or passing in TraversableResolver and MessageResolver?

Check the latest spec, all this has been clarified.


Quote:
Am I reading correctly that an implementor writing a ValidationProvider implementation would need to support returning a different provider's ValidatorFactoryBuilder via createGenericValidatorFactoryBuilder()? It seems like it based on page 65, which says "The bootstrap implementation must ensure it can bootstrap third party providers." Is there going to be some sort of TCK to verify this?

Yes and yes.

Quote:
BootstrapState currently only contains a provider resolver; wouldn't it be simpler just to pass in the provider resolver? Is this assuming that there will be other elements to BootstrapState in the future?

Openness for the future

Quote:
It's not clear from the Javadoc that BeanDescriptor.getConstraintDescriptors() would not return the property descriptors. That becomes evident somewhere else.


Not sure I follow.


Quote:
Why no hasConstraints() on BeanDescriptor?


it now has it :)

Quote:
Why no way to traverse from the PropertyDescriptor to the cascaded type? Seems like it might be convenient to be able to call something like getCascadedBeanDescriptor() if isCascaded() returns true.

I added it to the todo list, but unless someone contributes it, it's unlikely to happen.

Quote:
Might be nice if ConstraintDescriptor supported isComposingConstraint().

cd.getComposingConstraints().size() != 0

Quote:
Seems odd to use the 'assert' style for true/false but not for anything else. Would it be better just to have @BooleanTrue and @BooleanFalse with messages that aren't so assertion-oriented? Seems more viable for validation to me.

It's usually some kind of getter method computing something that is annotated with @AssertTrue / @AssertFalse. That makes more sense in this context.
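
e.g., the typical shape, a computed getter (hypothetical class; annotation package as in the final API):

import java.util.Date;
import javax.validation.constraints.AssertTrue;

class Reservation {

    Date checkIn;
    Date checkOut;

    // A computed, cross-field check; the assertion style reads naturally here.
    @AssertTrue(message = "check-out must be after check-in")
    public boolean isDateRangeValid() {
        return checkIn != null && checkOut != null && checkOut.after(checkIn);
    }
}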

Quote:
@Min and @Max must be long integers? What about floats? BigDecimals? Will we need separate validation classes for these, and if so, why don't they appear here?
How would String work with Min/Max? What methods would you call on BigDecimal, BigInteger, Number and String for the comparison? Do we need inclusive/exclusive handling?

Most of these questions are answered in the latest spec.
For double vs int, I don't have a strong opinion.
Same for inclusive / exclusive.
Opinions?

Quote:
I assume @Digits means maximum number of digits? Why not @MaxDigits?

Not sure it adds a lot.

Quote:
Do @Past and @Future need special handling for java.sql.Date? Should they have inclusive/exclusive handling? Wouldn't these be better as specializations of @After and @Before?


What do you mean?

Quote:
Suggested options: @NoWhitespace, @Percentage (0-100 or 0-1), @NonZero, @Positive, @Negative. A @RegExp option would be pretty generic, but useful to cover gaps.

@Pattern will likely make it. For the others, should they really be standard or just custom annotations? What value do they add in the standard?
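
For what it's worth, several of those could indeed be built as custom constraints composed from the existing ones (@Pattern, @Min, @Max). A sketch of @NoWhitespace composed from @Pattern (the meta-annotation names and required attributes follow the final 1.0 API and may differ in this draft):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import javax.validation.Constraint;
import javax.validation.Payload;
import javax.validation.constraints.Pattern;

// Composed constraint: validation is delegated entirely to @Pattern.
@Pattern(regexp = "\\S*")
@Constraint(validatedBy = {})
@Target({ ElementType.METHOD, ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
public @interface NoWhitespace {
    String message() default "must not contain whitespace";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}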

Quote:
XML Descriptor
--------------
I'd be interested to see what this would look like and how it interacts with the annotations.


Check latest spec

Quote:
Method-Level Validation
-----------------------
I can see some merit in this, but I don't think it should be tied to this spec.


Looks like time will be on your side here.


Quote:
It'd be interesting to understand the suggested application of @Min/@Max, @Future/@Past, @Size in terms of db schema generation.

SQL supports the notion of column CHECK constraints.

Quote:
"Lazy properties or associations must not be traversed" -- even if already loaded?

Yes, they will eventually be traversed if needed via flush.

Quote:
"Single or multi-valued associations pointing to entiries must not be traversed"? Why's that?

The flush will already do that.


I think I am done, need a coffee now :)

_________________
Emmanuel


 Post subject:
PostPosted: Mon Feb 16, 2009 11:35 am 
Hibernate Team

Joined: Sun Sep 14, 2003 3:54 am
Posts: 7256
Location: Paris, France
Oh, and yes, separated posts would be much easier to manage :)

_________________
Emmanuel

