
All times are UTC - 5 hours [ DST ]



Forum locked This topic is locked, you cannot edit posts or make further replies.  [ 1 post ] 
Author Message
 Post subject: Using validators on Pojos.
PostPosted: Fri Apr 25, 2008 6:40 pm 
Newbie

Joined: Fri Apr 25, 2008 6:17 pm
Posts: 1
Location: London
Here is an email reply to feedback Emmanuel Bernard gave me on the use of Validators on POJOs.

The short version is that I thought this should be possible and Emmanuel thought this had some merit.

The key thing I think would be needed is the ability to attach Constraint(s) to PARAMETERS.

Peter Lawrey writes in reply to Emmanuel Bernard

-------------------

I have added JSR-303 style runtime checks in our framework and have found them useful.

I prefer fail-fast checking, so the annotations are supported on constructor parameters and setters by the framework's bean factory class, i.e. the caller is responsible for checking the constraints.

This caller is a library method buildAs(Class type, Map<String, Object> values) which will build any POJO or JavaBean from the values given. It supports @ConstructorProperties, @PostConstruct and setters. It finds the constructor which will consume the most values and then calls the setters for the remaining values. (It's a bit more complicated than that, but that's the basic idea.)

BTW: There is an additional method buildAs(Class type, Map<String, Object> required, Map<String, Object> defaults).
This method ensures all required values are consumed, and all defaults are set if they can be used by a constructor or setter.
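A minimal sketch of what such a buildAs factory might look like (hypothetical names and a simplified selection rule; the real framework method is more involved, as noted above). It picks the @ConstructorProperties constructor that consumes the most of the supplied values, then calls setters for whatever is left:

```java
import java.beans.ConstructorProperties;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

public class PojoFactory {

    // Example bean: one @ConstructorProperties constructor plus a setter.
    public static class Point {
        public final int x;
        public String label;

        @ConstructorProperties({"x"})
        public Point(int x) { this.x = x; }

        public void setLabel(String label) { this.label = label; }
    }

    // Choose the constructor whose @ConstructorProperties names consume the
    // most of the supplied values, then apply setters for the remainder.
    public static <T> T buildAs(Class<T> type, Map<String, Object> values) {
        try {
            Map<String, Object> remaining = new HashMap<>(values);
            Constructor<?> best = null;
            String[] bestNames = {};
            for (Constructor<?> c : type.getConstructors()) {
                ConstructorProperties cp = c.getAnnotation(ConstructorProperties.class);
                String[] names = cp == null ? new String[0] : cp.value();
                boolean usable = true;
                for (String n : names) usable &= values.containsKey(n);
                if (usable && (best == null || names.length > bestNames.length)) {
                    best = c;
                    bestNames = names;
                }
            }
            Object[] args = new Object[bestNames.length];
            for (int i = 0; i < bestNames.length; i++) {
                args[i] = remaining.remove(bestNames[i]);
            }
            T obj = type.cast(best.newInstance(args));
            // call setters for the leftover values
            for (Map.Entry<String, Object> e : remaining.entrySet()) {
                String setter = "set" + Character.toUpperCase(e.getKey().charAt(0))
                        + e.getKey().substring(1);
                for (Method m : type.getMethods()) {
                    if (m.getName().equals(setter) && m.getParameterCount() == 1) {
                        m.invoke(obj, e.getValue());
                        break;
                    }
                }
            }
            return obj;
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException("cannot build " + type, e);
        }
    }
}
```

This is where parameter-level constraints would naturally hook in: the factory sees every value just before it is passed to a constructor or setter, so it could check @NotNull-style annotations there and fail fast.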

The factory supports these constraints for constructor arguments and setters.

We also support constraints for RMI style calls. The arguments are checked against the constraints before the method is called.
The return value is also checked against the constraints.

>> Hi,
>>
>> As you will see, I am not of the view that JavaBeans are the only way to program. Actually I think they are overused when there are simpler alternatives such as POJOs.
>> However I think the validation is useful and shouldn't be limited to a JavaBean view of the world but should be useful for POJOs as well.
>>
>> Kind Regards,
>> Peter Lawrey.
>> http://www.freshvanilla.org:8080/
>>
>> NOTES
>>
>> In Example 2.1, for completeness the @Target could include PARAMETER for the @NotNull annotation. IMHO this is the most useful use case which IntelliJ supports.
>>
>> The same would apply for Length, Min, Pattern. This raises the question: how did you imagine this would be associated with setters? (To me this is the most obvious use case for fail-fast validation.)
>>
>> Is the intention to write
>> @NotNull String getValue();
>> @NotNull void setValue(String value);
>> Or
>> void setValue(@NotNull String value);
>>
>> Is it the intention that these validators can only be applied to JavaBean methods, but not to validate other method parameters?
>>
>> For example, would these not be allowed?
>> @NotNull String setValue(String value);
>> @NotNull void setDimensions(Number width, Number height);
>>
>> Or could these be allowed?
>> @Nullable String setValue(@NotNull String value);
>> void setDimensions(@NotNull Number width, @NotNull Number height);

>
> What you are describing is more in line with http://jcp.org/en/jsr/detail?id=305. JSR 303 is related to runtime constraint checking.

Typically @NotNull is implemented as a runtime or code-injected check. Static analysis can also be applied. I would imagine this could be supported generically: assuming the checker or IDE were to "call" the validation code, it could check constants, for example, or give hints for code completion using this information.
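A sketch of such a fail-fast, reflection-based parameter check. Note this uses a local stand-in @NotNull annotation, precisely because the spec's annotation does not currently target parameters:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;

// Stand-in annotation: JSR 303's @NotNull does not target PARAMETER,
// which is exactly the gap discussed above.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
@interface NotNull {}

public class ParamChecker {

    // Fail fast: reject null arguments for any parameter marked @NotNull.
    public static void check(Method m, Object... args) {
        Parameter[] params = m.getParameters();
        for (int i = 0; i < params.length; i++) {
            if (params[i].isAnnotationPresent(NotNull.class) && args[i] == null) {
                throw new IllegalArgumentException(
                        "parameter " + i + " of " + m.getName() + " must not be null");
            }
        }
    }

    // Example target method with a constrained parameter.
    public static void setValue(@NotNull String value) { /* ... */ }
}
```

An interceptor, dynamic proxy, or RMI-style dispatcher would call check() on each invocation before forwarding to the real method, which is the kind of hook the spec reply below alludes to.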

> That being said, we will try to offer the necessary API to support method and parameters validation. An interceptor or AOP framework will then be able to use Bean Validations to validate the parameters.
>
>

>> Example 2.5: Length constraint validator.
>>
>> This example uses an unneeded check for type correctness which is actually less helpful than the default behaviour.
>>
>> Option a)
>> if ( !( value instanceof String ) ) {
>>     throw new IllegalArgumentException("Expected String type");
>> }
>> String string = (String) value;
>>
>> Prints
>>
>> java.lang.IllegalArgumentException: Expected String type
>>
>> Option b)
>>
>> String string = (String) value;
>>
>> Prints
>>
>> java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
>>
>> The ClassCastException message in Java used to read like option a), but that was deemed less informative than stating what the class actually was, so the actual class was added to the message. Your example is actually a backward step, IMHO.

>
> Yes some people mentioned that point. We probably will define some recommendations along those lines.

>>
>>
>> Could we have option b? It's shorter and produces a clearer error message.
>>
>> In fact, why not have
>>
>> public interface Constraint<A extends Annotation, T> {
>>     boolean isValid(T value);
>> }

> This is problematic in a couple of ways. A constraint could be bound to a given type but does not have to; it might check several different types (String, Numbers, etc.).

In that case T would be the supertype of these, possibly even Object. However, for documentation and to simplify isValid, it would be better if the cast were not required, unless generics does this for you anyway.

> In this situation the constraint must accept Objects and then check the types to make sure they are in the accepted subset.
It would accept Object via type erasure (T implicitly extends Object), but it would throw ClassCastException.
> You end up in a situation where the type checking is sometimes done by the framework in one way, and sometimes done by the constraint implementation in a different way.
> So the choice has been to leave the isValid "ungenerified".

I'm not sure what that gives you. I imagine the call would look like the following. Can you tell me the difference between the Constraint being generic or not?

Object myObject = ...;
Constraint constraint = ...;
boolean valid = false;
try {
    valid = constraint.isValid(myObject);
} catch (IllegalArgumentException e) {
    // ignored
} catch (ClassCastException e) {
    // ignored
}
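For comparison, here is a compilable sketch of a length constraint written against the generic interface proposed above. This is hypothetical: the actual spec leaves isValid ungenerified, and the annotation binding is omitted for brevity:

```java
import java.lang.annotation.Annotation;

// Hypothetical generic form of the proposed interface; NOT the JSR 303 API,
// which deliberately leaves isValid ungenerified.
interface Constraint<A extends Annotation, T> {
    boolean isValid(T value);
}

public class LengthConstraint implements Constraint<Annotation, String> {
    private final int min;
    private final int max;

    public LengthConstraint(int min, int max) {
        this.min = min;
        this.max = max;
    }

    // No instanceof check and no cast: the compiler guarantees a String here.
    @Override
    public boolean isValid(String value) {
        if (value == null) return false; // reject null rather than accept it
        int length = value.length();
        return length >= min && length <= max;
    }
}
```

With this signature a wrong-type argument fails at compile time rather than via an instanceof check or a ClassCastException at run time, which is the trade-off under discussion.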

>> The example could then be
>>
>> public boolean isValid(@Nullable String string) {
>>     int length = string.length();
>>     return length >= min && length <= max;
>> }
>>
>> Let's say we have a username or password checker. The minimum length is 6 and the maximum is 16.
>> I wouldn't agree that accepting null as valid "… demonstrates some best practices."
>> I cannot think of an example where you wanted a minimum length but null would be fine.

> I can think of plenty. Age, credit card number etc.
So a person with a null Age is valid? Or an order with a null credit card? :)
>

>> public boolean isValid(@NotNull String string) {
>>     int length = string.length();
>>     return length >= min && length <= max;
>> }
>>
>> Section 2.5, first example, not numbered.
>>
>> /**
>> * Defines the object nullability.
>> * TRUE means the object is nullable,
>> * FALSE means the object is not nullable,
>> * NULL means does not apply.
>> */
>>
>> Can you provide an example where an object, field or primitive is neither nullable nor not nullable?
>> IMHO: All primitives are not nullable, all objects are either nullable or not nullable.

> It's not so much that an object is nullable or not. It's more that the constraint does not care:
> @NotNull
I am sure @NotNull cares.
> @Email
> String email;

Nullable means it could be null or not null. What does it mean to be more ambivalent than that?

>
> @Email has some size checking but does not care about nullability (this is handled by a different annotation since it's an orthogonal, cross-cutting validation).

I would have assumed the constraints combine with AND logic: if any constraint is @NotNull, the value cannot be null.

What would be the difference in this case if
1) @Email were nullable
2) @Email were ambivalent about nullability?
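To illustrate the two interpretations, here is a sketch (illustrative names only, not the JSR 303 API) in which an @Email-style check treats null as "does not apply" and null-ness is enforced by a separate @NotNull-style check, with the constraints combined by AND:

```java
import java.util.regex.Pattern;

// Each check validates only its own concern; null-ness is delegated to the
// @NotNull-style check, and the constraints on a field combine with AND.
public class EmailCheck {
    // deliberately loose pattern, for illustration only
    private static final Pattern EMAIL = Pattern.compile("[^@\\s]+@[^@\\s]+");

    // @Email-style check: null "does not apply", so it reports valid.
    public static boolean emailValid(String s) {
        return s == null || EMAIL.matcher(s).matches();
    }

    // @NotNull-style check.
    public static boolean notNullValid(Object o) {
        return o != null;
    }

    // A field annotated with both constraints is valid only if both pass.
    public static boolean valid(String s) {
        return notNullValid(s) && emailValid(s);
    }
}
```

Under this reading, "ambivalent" means @Email alone lets null through, while @NotNull @Email rejects it; a "nullable" @Email would be indistinguishable from the ambivalent one unless some constraint could force nullability back on, which AND composition cannot express.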

>> Example 2.6
>> We have a method which could read
>> public Integer getLength() {
>>     return max == Integer.MAX_VALUE ? null : max;
>> }

>
> ternary operators are a matter of taste

true.

>>
>> Could you clarify the advantage of this extra complexity? It is not clear what you would lose by having
>>
>> return max;

> If max is Integer.MAX_VALUE, it is the default value of the annotation, which means the user did not set it.
I would have thought null is the value which means this was not set.
> So there is no reason to pass the value along. This metadata API is meant for interoperability beyond Java (DDL, JS etc)

So this is to distinguish the case where the developer specifically set the max size to 2,147,483,647 from the case where it is merely a large default. If interoperability is required, should this length be a long (64-bit value)?

>> Java doesn't support String, arrays etc which are longer than Integer.MAX_VALUE in any case.
>>
>> 3.6 Examples.
>>
>> What would be the impact of dropping the setters and getters (other than dropping most of the code)?
>> Your examples with JavaBeans would be less than half as long.
>>
>> @ZipCodeCityCoherenceChecker
>> public class Address {
>>     @NotNull @Length(min=1, max=30)
>>     public String addressline1;
>>
>>     @NotNull @Length(max=30)
>>     public String addressline2 = "";
>>
>>     @NotNull @Length(max=11)
>>     public String zipCode = "";
>>
>>     @NotNull @Length(min=1, max=30)
>>     public String city;
>> }
>>
>> Address address = new Address() {
>>     {
>>         addressline1 = "address1";
>>         city = "Llanfairpwllgwyngyllgogerychwyrndrobwyll-llantysiliogogogoch";
>>     }
>> };

> I want to show that mixing getter and field constraints is possible

I agree this is valuable and essential. I wanted to show that getters and setters, in most cases, are a matter of taste. :)

>> Are you looking at instrumentation to perform the checks as values are changed, in a fail-fast approach (rather than at some unknown time later)?
>>
>> This would avoid the need for developers to remember to perform the checks and could allow a container/framework to perform them transparently.

> This is an interesting idea but I don't think it applies everywhere. In a lot of situations, the object needs to be filled up entirely before running validations on it.

In my case, where the framework creates the object or sets its values, it has all the values required for validation.

> But as I said before, the spec will likely have the necessary APIs to let an instrumentation model play well.

I would hope this doesn't add too much complexity, but rather that it is considered. Obviously the focus of this JSR is support for JavaBeans. What annoys me is JavaBean-specific libraries which could almost work with POJOs, but have some fundamental limitation.

If I could change just one thing it would be to add @Target(ElementType.PARAMETER); the rest is just "enhancements".

>
> Thanks for your feedback! If you have more or want to reply to my comments, please go to this forum: http://forum.hibernate.org/viewforum.php?f=26

I will add a cut-down version of this to the forum.
Thank you again for the response. I think this is a valuable JSR.





© Copyright 2014, Red Hat Inc. All rights reserved. JBoss and Hibernate are registered trademarks and servicemarks of Red Hat, Inc.