My application uses Hibernate 3.2.5, RichFaces 3.3.3, Tomcat 6.0.24 and SQL Server 2008.
The application accepts client files and stores them in the database, in a SQL Server varbinary(max) column.
File upload, serialization, and storage work just fine; I have tested with files up to 848MB with no problems.
File retrieval, however, is problematic: I'm struggling to find a retrieval method that does not first read the entire file into memory on the server. The server has 8GB of installed memory and Tomcat is configured with a 4GB maximum heap. With this configuration I can typically retrieve 400MB files but not 800MB files.
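For reference, the upload side stores the file by wrapping the uploaded temp file in a Hibernate Blob, roughly like this (simplified from memory; the DAO save call is illustrative):
Code:
// upload side (this part works): wrap the uploaded temp file as a Blob and persist it
// "item" is the org.richfaces.model.UploadItem from the fileUpload listener
File f = item.getFile();
InputStream in = new FileInputStream(f);
doc.setFdata(Hibernate.createBlob(in, (int) f.length()));
dao.save(doc); // hypothetical DAO save method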
Here's my code:
Code:
String docid = request.getParameter("docid");
DocumentDao dao = new DocumentDao();
Document doc = dao.retrieveDocumentById(docid);
FacesContext fc = FacesContext.getCurrentInstance();
HttpServletResponse response = AppUtils.getCurrentResponse();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
try {
    SerializableBlob sblob = (SerializableBlob) doc.getFdata();
    Blob blob = sblob.getWrappedBlob();

    MimetypeDao mDao = new MimetypeDao();
    Mimetype mt = mDao.retrieveMimetypeById(doc.getMimetype_id());
    response.setContentType(mt.getMimetype());
    response.setHeader("Content-Disposition", "attachment; filename=" + doc.getFname());

    // read the entire Blob into memory; this is where the heap gets exhausted
    InputStream inStream = blob.getBinaryStream();
    byte[] buffer = new byte[4096];
    int length;
    while ((length = inStream.read(buffer)) != -1) {
        baos.write(buffer, 0, length);
    }
    inStream.close();

    // content length is only known once the whole file has been buffered
    response.setContentLength(baos.size());

    // write the buffered copy out to the response
    ServletOutputStream outStream = response.getOutputStream();
    baos.writeTo(outStream);
    baos.close();
    outStream.flush();
    outStream.close();
} catch (Exception e) {
    e.printStackTrace();
}
fc.responseComplete();
Can anyone suggest a method of data retrieval that limits the Tomcat memory footprint?
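What I had in mind is something along these lines, copying the Blob stream straight to the response instead of buffering it first (untested sketch; it reuses doc, mt, response and fc from the code above, and the 8KB buffer size is arbitrary):
Code:
try {
    SerializableBlob sblob = (SerializableBlob) doc.getFdata();
    Blob blob = sblob.getWrappedBlob();

    response.setContentType(mt.getMimetype());
    response.setHeader("Content-Disposition", "attachment; filename=" + doc.getFname());
    // take the length from the Blob itself rather than from an in-memory copy
    // (header form because blob.length() returns a long)
    response.setHeader("Content-Length", String.valueOf(blob.length()));

    InputStream inStream = blob.getBinaryStream();
    ServletOutputStream outStream = response.getOutputStream();
    try {
        byte[] buffer = new byte[8192];
        int length;
        // only one 8KB buffer is held in the web tier at any time,
        // assuming the JDBC driver actually streams the varbinary
        while ((length = inStream.read(buffer)) != -1) {
            outStream.write(buffer, 0, length);
        }
        outStream.flush();
    } finally {
        inStream.close();
        outStream.close();
    }
    fc.responseComplete();
} catch (Exception e) {
    e.printStackTrace();
}
Would that actually keep the heap usage down, or will the driver pull the whole varbinary(max) column into memory as soon as the entity is loaded anyway?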