We are currently evaluating Hibernate Search as a replacement for our proprietary Spring LuceneManager. I really like the idea of handling performance-critical queries with Lucene instead of DB queries; I have already worked on a commercial persistence layer that used exactly the same approach, and it worked great.
But here's the problem: I added a simple test case to our product's test framework to play around with Hibernate Search. The test case inherits from a general test case that already sets up the whole Spring/Hibernate architecture. During the setUp method, already existing Hibernate objects are deleted from the underlying database (we want to test several different DB providers, not just HSQLDB), which means there is a transaction in which an object update happens first and a deletion of the same object happens afterwards. It cost me quite a lot of time to figure out why my first test case failed (it just creates a simple persistent object and looks for it in the index), until I debugged down into DocumentBuilder and found this piece of code:
Code:
for (LuceneWork luceneWork : queue) {
    //any work on the same entity should be ignored
    if ( luceneWork.getEntityClass() == entityClass ) {
        Serializable currentId = luceneWork.getId();
        if ( currentId != null && currentId.equals( id ) ) { //find a way to use Type.equals(x,y)
            return;
        }
        //TODO do something to avoid multiple PURGE ALL and OPTIMIZE
    }
}
In my case this leads to the following behaviour:
*) since the persistent object is updated during the init process, the queue contains [DELETE, ADD]
*) in the same transaction the object gets deleted, but the code above causes DocumentBuilder to ignore that: the queue should end up as [DELETE, ADD, DELETE] (or just [DELETE]) but still contains only [DELETE, ADD]. As a consequence the object is never removed from the underlying Lucene index and the index becomes inconsistent (a minimal sketch of a test reproducing this is below).
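For illustration, here is a minimal sketch of the kind of test that runs into this. The Book entity with its indexed title field, the injected sessionFactory, and the lookup id are hypothetical placeholders; depending on the Hibernate Search version the FullTextSession is obtained via Search.getFullTextSession or Search.createFullTextSession.
Code:
import org.apache.lucene.index.Term;
import org.apache.lucene.search.TermQuery;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.search.FullTextSession;
import org.hibernate.search.Search;

public class UpdateThenDeleteSketch {

    // assumed to be provided by the surrounding Spring/Hibernate test setup
    private SessionFactory sessionFactory;

    public void updateThenDeleteInOneTransaction(Long bookId) {
        Session session = sessionFactory.openSession();

        Transaction tx = session.beginTransaction();
        // updating the indexed entity queues [DELETE, ADD] for its id
        Book book = (Book) session.get(Book.class, bookId);
        book.setTitle("updated title");
        // deleting it in the same transaction should queue another DELETE,
        // but the DocumentBuilder loop above returns early because work for
        // this id is already in the queue
        session.delete(book);
        tx.commit(); // the stale document stays in the Lucene index

        // counting index hits still shows the document of the deleted entity
        Transaction tx2 = session.beginTransaction();
        FullTextSession fts = Search.getFullTextSession(session);
        int hits = fts.createFullTextQuery(
                new TermQuery(new Term("title", "updated")), Book.class)
                .getResultSize();
        // hits is 1 although the row is gone from the database: inconsistent index
        tx2.commit();
        session.close();
    }
}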
Am I getting something wrong? (I didn't find anything about this issue in the bug-tracking system or the forum.)