 Post subject: java.io.IOException: The stream is closed
PostPosted: Thu Jun 28, 2012 12:13 am 
I receive the following exception when getting a binary stream.
Code:
        java.io.IOException: The stream is closed.
   at com.microsoft.sqlserver.jdbc.BaseInputStream.checkClosed(SimpleInputStream.java:93)
   at com.microsoft.sqlserver.jdbc.PLPInputStream.read(PLPInputStream.java:237)
   at au.gov.vic.doi.tp.service.binary.dao.BinaryDAOImplTest.retrieveBinary(BinaryDAOImplTest.java:102)

I am using: Hibernate 4.1.1, MS SQL Server 2008 and the latest MS SQL JDBC4 driver.
I use the following custom mapping type, 'BlobToStreamUserType', to map from a SQL BLOB (varbinary(max)) column to an InputStream. Hibernate doesn't appear to provide a type that maps a SQL BLOB to a Java InputStream; MaterializedBlobType only maps a SQL BLOB to a Java byte[].

I am using 'rs.getBinaryStream' to return a true binary stream rather than a stream buffered in memory. The reason is that we have large images, and if multiple users accessed them concurrently, buffering them in memory would cause memory issues.
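
For reference, the standard mapping via MaterializedBlobType would look roughly like this (the getter and field names here are only illustrative) and would buffer the whole image in memory, which is exactly what I want to avoid:
Code:
    // Default @Lob mapping: Hibernate materializes the BLOB as a byte[] in memory
    @Lob
    @Column(name = "image")
    public byte[] getImageBytes()
    {
        return this.imageBytes;
    }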

According to Microsoft:
Quote:
The Microsoft driver follows the JDBC spec with respect to getBinaryStream =>

Note: All the data in the returned stream must be read prior to getting the value of any other column. The next call to a getter method implicitly closes the stream. Also, a stream may return 0 when the method InputStream.available is called whether there is data available or not.

So getBinaryStream is designed to return a true stream of bytes. We return a true stream of bytes that is NOT buffered in memory, it is a stream that is read right off the wire from the server response. You cannot close the contentDataResultSet until you fetch all the bytes for our implementation.
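
In other words, with plain JDBC the stream has to be consumed while the result set is still open, along these lines (just a sketch; the method name, 'connection', 'id' and the destination 'out' are mine for illustration):
Code:
    // Rough sketch only: consume the BLOB column while the result set is still
    // open. 'connection', 'id' and 'out' (the destination) are assumed to exist.
    private void copyImage(final Connection connection, final String id, final OutputStream out)
            throws SQLException, IOException
    {
        PreparedStatement statement = connection.prepareStatement("select image from Binary where id = ?");
        statement.setString(1, id);
        ResultSet resultSet = statement.executeQuery();
        if (resultSet.next())
        {
            InputStream in = resultSet.getBinaryStream(1);
            // All bytes must be read before calling another getter or closing
            // the result set, otherwise: "The stream is closed."
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = in.read(buffer)) != -1)
            {
                out.write(buffer, 0, bytesRead);
            }
        }
        resultSet.close();
        statement.close();
    }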


I have tried setting the Hibernate property 'hibernate.jdbc.use_streams_for_binary' to both true and false, but it makes no difference.
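
For completeness, the toggle I experimented with amounts to this (shown programmatically purely for illustration; it can equally be set in hibernate.cfg.xml or the Spring session factory properties):
Code:
// Illustration only: programmatic equivalent of the property being toggled
Configuration configuration = new Configuration();
configuration.setProperty("hibernate.jdbc.use_streams_for_binary", "true"); // also tried "false"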

Here is my code:

Image entity class showing mapping to custom user type.
Code:
@Entity
@DynamicInsert(false)
@DynamicUpdate(false)
@Table(name = "Binary")
public class Binary
{
.........
    @Lob
    @Column(name = "image")
    @Type(type = "my.package.BlobToStreamUserType")   
    public InputStream getImageStream()
    {
        return this.imageStream;
    }
}


Here is my custom mapping type:

Code:
import java.io.InputStream;
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;

import org.apache.commons.lang.ObjectUtils; // or org.apache.commons.lang3, whichever is on the classpath
import org.hibernate.HibernateException;
import org.hibernate.engine.spi.SessionImplementor;
import org.hibernate.usertype.UserType;

public class BlobToStreamUserType implements UserType
{

    public int[] sqlTypes()
    {
        return new int[]
        { Types.BLOB };
    }

    public Class returnedClass()
    {
        return InputStream.class;
    }

    public Object nullSafeGet(final ResultSet rs, final String[] names, final SessionImplementor session,
            final Object owner) throws HibernateException, SQLException
    {
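        // Note: this returns the driver's live stream, which (per the MS JDBC
        // driver) is only readable while the underlying ResultSet is still open.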
        return rs.getBinaryStream(names[0]);
    }

    public void nullSafeSet(final PreparedStatement st, final Object value, final int index,
            final SessionImplementor session) throws HibernateException, SQLException
    {

        if (value != null)
        {
            if (!InputStream.class.isAssignableFrom(value.getClass()))
            {
                throw new HibernateException(value.getClass().toString() + " cannot be cast to a java.io.InputStream");
            }
            InputStream inStream = (InputStream) value;
            st.setBinaryStream(index, inStream);
        }
        else
        {
            st.setBytes(index, null);
        }
    }

    public boolean equals(final Object x, final Object y) throws HibernateException
    {
        return ObjectUtils.equals(x, y);
    }

    public int hashCode(final Object x) throws HibernateException
    {
        assert (x != null);
        return x.hashCode();
    }

    public Object deepCopy(final Object value) throws HibernateException
    {
        return value;
    }

    public boolean isMutable()
    {
        return false;
    }

    public Object replace(final Object original, final Object target, final Object owner) throws HibernateException
    {
        return original;
    }

    public Serializable disassemble(final Object value) throws HibernateException
    {
        //disassemble() is only called when caching the data in the second-level cache
        // also safe for mutable objects
        throw new UnsupportedOperationException("Not supported yet.");
    }

    public Object assemble(final Serializable cached, final Object owner) throws HibernateException
    {
        throw new UnsupportedOperationException("Not supported yet.");
    }

}


And here is my DAO to retrieve the image stream:

Code:
public class BinaryDAOImpl extends AbstractSpringHibernateDAO implements BinaryDAO
{
   
    public Binary getBinary(final String amsAssetId, final String assetMaintainer, final String variant,
            final Integer sequence) throws GetException
    {

        try
        {
            Query query = super.getSession().createQuery(
                    "from Binary bin where bin.amsAssetId = :amsAssetId and bin.assetMaintainer = :assetMaintainer "
                            + "and bin.variant = :variant and bin.sequence = :sequence");
            query.setParameter("amsAssetId", amsAssetId);
            query.setParameter("assetMaintainer", assetMaintainer);
            query.setParameter("variant", variant);
            query.setParameter("sequence", sequence);
            return (Binary) query.uniqueResult();
        }


N.B. The following workaround, which bypasses Hibernate and retrieves the stream directly from the result set, works and allows me to read the stream without an exception. However, I would rather use Hibernate to map from a BLOB to a stream if possible.

Modified DAO method
Code:

    public Binary getBinary(final String amsAssetId, final String assetMaintainer, final String variant,
            final Integer sequence) throws GetException
    {
        try
        {
            Query query = super.getSession().createQuery(
                    "from Binary bin where bin.amsAssetId = :amsAssetId and bin.assetMaintainer = :assetMaintainer "
                            + "and bin.variant = :variant and bin.sequence = :sequence");
            query.setParameter("amsAssetId", amsAssetId);
            query.setParameter("assetMaintainer", assetMaintainer);
            query.setParameter("variant", variant);
            query.setParameter("sequence", sequence);
            Binary binary = (Binary) query.uniqueResult();

            final String id = binary.getId();
           
            // Necessary until Hibernate supports InputStream mapping
            binary.setImageStream(getSession().doReturningWork(new ReturningWork<InputStream>()
            {
                          public InputStream execute(final Connection connection) throws SQLException
                            {
                                PreparedStatement statement = connection.prepareStatement("select image from Binary where id=?",
                                        ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
           
                                statement.setString(1, id);
           
                                ResultSet resultSet = statement.executeQuery();
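                                // The statement and result set are deliberately not closed here,
                                // so the returned stream remains readable by the caller.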
           
                                return (resultSet.first() ? resultSet.getBinaryStream(1) : null);
                            }
                        }));

            return binary;

        }
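
With the workaround in place, reading the stream in the test then works, roughly like this (a sketch; the DAO reference and the target output stream are assumed):
Code:
Binary binary = binaryDAO.getBinary(amsAssetId, assetMaintainer, variant, sequence);
InputStream in = binary.getImageStream();
byte[] buffer = new byte[8192];
int bytesRead;
while ((bytesRead = in.read(buffer)) != -1)
{
    out.write(buffer, 0, bytesRead);
}
in.close();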

