


 Post subject: How to lazy load Blob fields?
PostPosted: Thu Jan 29, 2004 12:32 am 
Newbie

Joined: Wed Dec 03, 2003 11:42 pm
Posts: 7
Location: Singapore
Hi

Our application has a table that stores four large blob fields (about 12 MB each). We need to load multiple records from this table and display links that let the user download the blob objects as binary files.

Our problem is that Hibernate seems to load the actual contents of these blobs into memory when loading the objects. That means if I load 10 objects from the table, it takes 480 MB of memory. I think the best approach is to delay loading these blob fields until they are actually needed.

I read through the forum, and some suggested using helper JDBC code that actually does the reading/writing. Could anyone show me how to do that with Hibernate?

Many thanks in advance.


 Post subject:
PostPosted: Thu Jan 29, 2004 12:34 am 
Hibernate Team
Hibernate Team

Joined: Tue Aug 26, 2003 12:50 pm
Posts: 5130
Location: Melbourne, Australia
No, Hibernate does not do this, as long as you map it as type "blob".
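For reference, a property mapped with Hibernate's built-in "blob" type might look like the sketch below; the property and column names are just placeholders:

Code:
<property name="data" column="DATA" type="blob"/>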


 Post subject:
PostPosted: Thu Jan 29, 2004 12:38 am 
Newbie

Joined: Wed Dec 03, 2003 11:42 pm
Posts: 7
Location: Singapore
Thanks for your quick reply.

Do you have any suggestions on where I could find the helper JDBC code that does the lazy loading?

If these blob fields should not be mapped as "blob" in Hibernate, which type should I map them as?


 Post subject:
PostPosted: Thu Jan 29, 2004 12:41 am 
Hibernate Team
Hibernate Team

Joined: Tue Aug 26, 2003 12:50 pm
Posts: 5130
Location: Melbourne, Australia
Quote:
If these blob fields should not be mapped as "blob" in Hibernate, which type should I map them as?


You should map them as type blob.

Quote:
Do you have any suggestions on where I could find the helper JDBC code that does the lazy loading?


Ummm... in your JDBC driver.


 Post subject:
PostPosted: Thu Jan 29, 2004 12:50 am 
Newbie

Joined: Wed Dec 03, 2003 11:42 pm
Posts: 7
Location: Singapore
Do you mean I should map only the non-blob fields? For example, use Hibernate to load the non-blob fields, and then use a JDBC connection to load the blob later?


 Post subject:
PostPosted: Thu Jan 29, 2004 1:03 am 
Hibernate Team
Hibernate Team

Joined: Tue Aug 26, 2003 12:50 pm
Posts: 5130
Location: Melbourne, Australia
No.

You should go and read the javadoc for java.sql.Blob.


 Post subject:
PostPosted: Fri Jan 30, 2004 4:50 am 
Newbie

Joined: Wed Dec 03, 2003 11:42 pm
Posts: 7
Location: Singapore
I finally found a workaround that solves the lazy loading of Blobs without requiring the database server and JDBC driver to support Blob locators.

Although Hibernate does not allow lazy loading of a single column, lazy loading of a row is supported. Therefore, I moved the Blob columns into a new table, and the original table now only stores the key to the new table. I set up a many-to-one mapping from the original table to the new table, and made the new table lazily loaded in the Hibernate configuration.

Now I can display hundreds of links on the same web page, each pointing to about 12 MB of data, while keeping memory consumption low.

Hope this helps some Hibernate users.


 Post subject:
PostPosted: Fri Jan 30, 2004 11:09 am 
Hibernate Team
Hibernate Team

Joined: Tue Aug 26, 2003 12:50 pm
Posts: 5130
Location: Melbourne, Australia
Listen: a Blob, if implemented correctly by the JDBC driver, is by nature lazy. Please read the Javadoc!
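For illustration, here is a minimal sketch of streaming a Blob's contents without holding them all in memory; the helper class and method names are hypothetical, and this only behaves lazily when the driver hands back a locator rather than the full data:

Code:
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.SQLException;

// Hypothetical helper: copies a Blob to an OutputStream in small chunks.
public class BlobStreamer {
    public static void copy(Blob blob, OutputStream out)
            throws SQLException, IOException {
        // With a locator-based driver, getBinaryStream() fetches bytes on demand,
        // so only one buffer's worth of data is held in memory at a time.
        InputStream in = blob.getBinaryStream();
        try {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        } finally {
            in.close();
        }
    }
}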


 Post subject:
PostPosted: Sat Jan 31, 2004 12:51 am 
Newbie

Joined: Wed Dec 03, 2003 11:42 pm
Posts: 7
Location: Singapore
I understand this. However, MySQL 4.0.17 and Connector/J 3.0.10 do not support Blob locators yet. That means when I load a Blob, Connector/J returns the contents as well, rather than just a locator. That's why I ran into memory problems previously.

The workaround I mentioned is actually not lazy loading a blob; it is lazy loading a row that contains a blob field. It might sound strange, but it solves my problem.


 Post subject:
PostPosted: Wed Mar 24, 2004 5:55 pm 
Regular
Regular

Joined: Wed Dec 17, 2003 1:58 pm
Posts: 102
Cloudman, did you move that blob into a different Java object? Or how exactly did you accomplish the lazy loading?
Thanks,
David


 Post subject:
PostPosted: Wed Mar 24, 2004 10:17 pm 
Newbie

Joined: Wed Dec 03, 2003 11:42 pm
Posts: 7
Location: Singapore
You need to create a class that holds only the Blob data. It looks like the following:

Code:
import java.sql.Blob;

// Holds only the Blob column so it can be mapped and lazily loaded as a separate entity.
public class BinaryData {
    private Long id;
    private Blob data;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public Blob getData() {
        return data;
    }

    public void setData(Blob data) {
        this.data = data;
    }
}


In the data class that previously held the Blob fields, simply replace each Blob with a reference to the above class. You can use a many-to-one mapping from the host data object to this binary data object, and specify lazy loading on that association.
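For reference, a Hibernate 2.x-style XML mapping for this setup could look roughly like the sketch below (the Document class, table names, and column names are placeholders, and the usual DOCTYPE declaration is omitted). The proxy attribute on BinaryData is what lets Hibernate return a lazy proxy for the many-to-one association:

Code:
<hibernate-mapping>
    <!-- Blob data lives in its own table; the proxy attribute allows
         Hibernate to hand back a lazy proxy instead of loading the row -->
    <class name="BinaryData" table="BINARY_DATA" proxy="BinaryData">
        <id name="id" column="ID" type="long">
            <generator class="native"/>
        </id>
        <property name="data" column="DATA" type="blob"/>
    </class>

    <!-- Host object: the old Blob property is replaced by this association -->
    <class name="Document" table="DOCUMENT">
        <id name="id" column="ID" type="long">
            <generator class="native"/>
        </id>
        <many-to-one name="binaryData" class="BinaryData" column="BINARY_DATA_ID"/>
    </class>
</hibernate-mapping>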

Hope this helps.


 Post subject:
PostPosted: Mon Mar 29, 2004 12:19 am 
Regular
Regular

Joined: Wed Dec 17, 2003 1:58 pm
Posts: 102
Perfect thanks =)



