Hi there,
I'm afraid this is something close to an FAQ; searching turns up lots of material, but no good
complete summary of what the right thing to do is. So here we go:
We'd like to use application-defined UUIDs as primary keys, stored as strings in the database for readability (we're not concerned with compactness or speed for now).
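For the record, "application-defined" here means the entity assigns its own id before it is ever persisted, roughly like the sketch below (the class and field names are just placeholders, not our real model):

Code:
```java
import java.util.UUID;

// Hypothetical entity sketch: the application assigns the id itself,
// as the canonical 36-character string form of a random UUID.
public class Document {
    private final String id = UUID.randomUUID().toString();

    public String getId() {
        return id;
    }

    public static void main(String[] args) {
        Document d = new Document();
        System.out.println(d.getId());           // e.g. 3f2504e0-4f89-41d3-9a0c-0305e82c3301
        System.out.println(d.getId().length());  // 36
    }
}
```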
In Hibernate 3.5, we used a custom UserType, written following the many FAQs on the net. With that user type, the ID column was created as varchar(1), which didn't seem to bother HSQL. Adding "length=37" to the @Column annotation actually changes the length in HSQL (where it didn't matter), but apparently not in Oracle (where it does matter).
We have now upgraded to Hibernate 3.6, where org.hibernate.type.UUIDCharType seemed to promise exactly what we want. However, for HSQL I see this:
Code:
id varchar(255) not null
and, after annotating with "length=37":
Code:
id varchar(37) not null
Shouldn't the length default (for the string representation of a UUID) be applied automatically? And will Hibernate now create the proper column definitions for Oracle? (I can't test against Oracle right now.)
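For what it's worth, the canonical string form of a java.util.UUID is 36 characters (32 hex digits plus 4 hyphens), so I'd expect the sensible default to be varchar(36); the length=37 above just leaves one character of slack. A quick check:

Code:
```java
import java.util.UUID;

public class UuidLength {
    public static void main(String[] args) {
        // 8-4-4-4-12 hex digits plus four hyphens = 36 characters
        String s = UUID.randomUUID().toString();
        System.out.println(s.length()); // prints 36
    }
}
```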
Best regards, and sorry for the newbie-ish questions,
Julian