Context: I am not a Hibernate expert, and I am working with PostgreSQL 9.4+ jsonb fields. When doing a parameterized update with plain JDBC, I can simply create an org.postgresql.util.PGobject and pass it as a parameter to PreparedStatement.setObject. It might not be reasonable to expect similar behavior from Hibernate, but the current behavior leaves me puzzled.
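For reference, this is roughly what I mean by the plain JDBC approach (just a sketch; the connection details, table and column names here are made up):
Code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import org.postgresql.util.PGobject;

public class PlainJdbcJsonbUpdate {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/demo", "demo", "demo")) {
            // Wrap the JSON payload in a PGobject of type "jsonb"
            PGobject jsonb = new PGobject();
            jsonb.setType("jsonb");
            jsonb.setValue("{\"key\": \"value\"}");

            try (PreparedStatement ps = conn.prepareStatement(
                    "UPDATE documents SET payload = ? WHERE id = ?")) {
                ps.setObject(1, jsonb); // the driver accepts the PGobject as-is
                ps.setLong(2, 1L);
                ps.executeUpdate();
            }
        }
    }
}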
When an AttributeConverter<Map, Object> is used to build a similar PGobject (a sketch of such a converter is included below), the PostgreSQL JDBC driver raises the following exception:
Code:
org.postgresql.util.PSQLException: Unsupported Types value: 1.029.991.479
The source of this type value can be traced to the following method of JdbcTypeJavaClassMappings:
Code:
public int determineJdbcTypeCodeForJavaClass(Class cls) {
    ....
    int specialCode = cls.hashCode();
    log.debug(
            "JDBC type code mapping not known for class [" + cls.getName() + "]; using custom code [" + specialCode + "]"
    );
    return specialCode;
}
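For completeness, the converter that ends up on this code path looks roughly like the following (a sketch, assuming Jackson for the JSON serialization; the actual code is in the demo project linked at the end):
Code:
import java.util.Map;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.postgresql.util.PGobject;

@Converter
public class MapToJsonbConverter implements AttributeConverter<Map, Object> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public Object convertToDatabaseColumn(Map attribute) {
        try {
            // Build the same kind of PGobject that works with plain JDBC
            PGobject jsonb = new PGobject();
            jsonb.setType("jsonb");
            jsonb.setValue(MAPPER.writeValueAsString(attribute));
            return jsonb;
        }
        catch (Exception e) {
            throw new IllegalStateException("Could not serialize map to jsonb", e);
        }
    }

    @Override
    public Map convertToEntityAttribute(Object dbData) {
        try {
            // The driver hands back a PGobject (or a String) for jsonb columns
            String json = dbData instanceof PGobject
                    ? ((PGobject) dbData).getValue()
                    : String.valueOf(dbData);
            return MAPPER.readValue(json, Map.class);
        }
        catch (Exception e) {
            throw new IllegalStateException("Could not deserialize jsonb value", e);
        }
    }
}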
My question is: why take the hash code of the attribute's class and use it as an "unknown" java.sql.Types value? Is there a good explanation for this seemingly random choice? If so, wouldn't it be nice to document it with a comment, or better yet with a test?
If the "special code" is changed say to
java.sql.Types#OTHER, then no test of the
hibernate-orm project fails and my mapping via
org.postgresql.util.PGobject to
jsonb starts to work. I would have expected this or some other constant to be used instead of random hash code that can (theoretrically) clash with other predefined values.
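Concretely, the local change I experimented with amounts to something like this (not a proposed patch, just to illustrate what I mean):
Code:
public int determineJdbcTypeCodeForJavaClass(Class cls) {
    ....
    // instead of: int specialCode = cls.hashCode();
    int specialCode = java.sql.Types.OTHER;
    log.debug(
            "JDBC type code mapping not known for class [" + cls.getName() + "]; falling back to OTHER [" + specialCode + "]"
    );
    return specialCode;
}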
Reproduced with: Hibernate 5.2.6.Final and 5.2.7.Final, JDBC driver org.postgresql:postgresql:9.4.1212.jre7, PostgreSQL 9.5.5 via Docker. A small demo is available at https://github.com/mgurov/pghibernatejsonb. Please let me know if I can provide more information on the matter.