Hi all,
My sysadmin reported a "too many open files" exception in my Hibernate Search project.
I have been searching, and the only thing I have found so far is this:
Quote:
Why am I getting an IOException that says "Too many open files"?
The number of files that can be opened simultaneously is a system-wide limitation of your operating system. Lucene might cause this problem, as it can open quite a few files depending on how you use it, but the problem might also be somewhere else.
* Always make sure that you explicitly close all file handles you open, especially in case of errors. Use a try/catch/finally block to open the files, i.e. open them in the try block and close them in the finally block. Remember that Java doesn't have destructors, so don't close file handles in a finalize method -- this method is not guaranteed to be executed.
* Use the compound file format (it's activated by default starting with Lucene 1.4) by calling IndexWriter's setUseCompoundFile(true).
* Don't set IndexWriter's mergeFactor to large values. Large values speed up indexing but increase the number of files that need to be opened simultaneously.
* Make sure you only open one IndexSearcher, and share it among all of the threads that are doing searches -- this is safe, and it will minimize the number of files that are open concurrently.
* Try to increase the number of files that can be opened simultaneously. On Linux using bash this can be done by calling ulimit -n <number>.
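
For what it's worth, this is how I read the "close everything in a finally block" point, as plain Lucene 2.x code (just a sketch with a made-up index path and field -- in Hibernate Search the IndexWriter is managed by the framework, so I'm not even sure this is where my problem comes from):

Code:
import java.io.IOException;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class CloseHandlesExample {
    public static void addOne(String indexDir) throws IOException {
        IndexWriter writer = null;
        try {
            // open the writer against an existing index (Lucene 2.x style constructor)
            writer = new IndexWriter(indexDir, new StandardAnalyzer(), false);
            Document doc = new Document();
            doc.add(new Field("title", "hello", Field.Store.YES, Field.Index.TOKENIZED));
            writer.addDocument(doc);
        } finally {
            // close in finally so the file handles are released even if an error occurs
            if (writer != null) {
                writer.close();
            }
        }
    }
}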
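
And this is my understanding of the compound file / mergeFactor points, again as raw Lucene (the class name and path are made up; I assume Hibernate Search exposes equivalent settings through its configuration, but I haven't found them yet):

Code:
import java.io.IOException;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;

public class WriterTuningExample {
    public static IndexWriter openTunedWriter(String indexDir) throws IOException {
        IndexWriter writer = new IndexWriter(indexDir, new StandardAnalyzer(), false);
        // compound file format packs the per-segment files into one .cfs file,
        // which keeps the number of open file handles down (default since Lucene 1.4)
        writer.setUseCompoundFile(true);
        // a small mergeFactor means fewer files open at the same time;
        // 10 is the Lucene default, larger values index faster but open more files
        writer.setMergeFactor(10);
        return writer;
    }
}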
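
For the "one shared IndexSearcher" point, I suppose it means something like this hypothetical holder class (not from Hibernate Search -- as far as I can tell Hibernate Search manages its own readers internally, so maybe this part doesn't even apply to my setup):

Code:
import java.io.IOException;

import org.apache.lucene.search.IndexSearcher;

// hypothetical holder: one IndexSearcher shared by every searching thread
public class SharedSearcher {
    private static IndexSearcher searcher;

    public static synchronized IndexSearcher get() throws IOException {
        if (searcher == null) {
            // searching through a single IndexSearcher is thread-safe,
            // and one instance keeps only one set of index files open
            searcher = new IndexSearcher("/path/to/index");
        }
        return searcher;
    }
}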
Has anybody had this problem? How can I solve it?