 Post subject: First-level (Session) Caching Performance Question
PostPosted: Mon Aug 18, 2008 6:15 pm 
I seem to be running afoul of something in the way I am using Hibernate to read data from a read-only database. I am accessing roughly 2000 entities (two types: AnalysisImpl and FeatureImpl) via Session.get(), and then lazily accessing some collections on these. All in all, Hibernate loads around 70k objects into its caches. If I do this several times in one session, each pass takes roughly 20 seconds, and (of course) the second-level cache is never hit. However, if I explicitly flush and clear the first-level (session) cache between invocations, the second and third passes hit the second-level cache and run much faster.

Am I interpreting this correctly? If so, does it mean I need to clear the session cache more often (even within a single invocation) to get better performance? Is the second-level cache really faster here than the session cache? Is there a guideline for how many entities a session can hold before first-level cache performance degrades and a flush/clear is needed?
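
To illustrate what I mean by clearing within the invocation, this is roughly the pattern I am considering (a sketch only; the method name and the batch size of 500 are arbitrary, just for illustration):

Code:
   private static final int BATCH_SIZE = 500; // arbitrary, for illustration

   // Hypothetical variant of my fetch loop: flush/clear the session every
   // BATCH_SIZE lookups so the first-level cache never grows unbounded.
   private void fetchValuesInBatches(Session session, long[] analysisIds, long[] featureIds) {
      int featureCount = featureIds.length;
      for (int i = 0; i < analysisIds.length; i++) {
         Feature feature = (Feature) session.get(FeatureImpl.class, featureIds[i % featureCount]);
         Analysis analysis = (Analysis) session.get(AnalysisImpl.class, analysisIds[i]);
         // ... use feature and analysis ...
         if ((i + 1) % BATCH_SIZE == 0) {
            session.flush();  // nothing should be dirty on this read-only data
            session.clear();  // detach everything so later lookups go to the 2nd-level cache
         }
      }
   }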

The test code further down is a contrived version of my production code, but it shows the same symptoms as the production code.


Hibernate version: 3.2.5
EhCache: 1.3.0
Spring: 2.5.2


Mapping documents:
Here are the annotated classes for AnalysisImpl and FeatureImpl, as well as the Spring configuration for the metadataSessionFactory:

Code:
@Entity
@Table(name = "differential_analysis")
@Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
public class AnalysisImpl implements Analysis, Serializable {

   private static final long serialVersionUID = -263751583851735504L;

   private Dataset dataset;
   private long databaseId;
   private SampleProperty sampleProperty;
   private List<PredefinedClass> predefinedClasses;
   private AnalysisId id;
   private Collection<AnalysisPropertyValue> propertyValues;
   private TestType embeddedTestType; // backing field for setEmbeddedTestType() below

   public AnalysisImpl() {
      this.id = AnalysisId.NULL;
      this.predefinedClasses = new ArrayList<PredefinedClass>(0);
      this.propertyValues = new ArrayList<AnalysisPropertyValue>(0);
   }

   @Transient
   public AnalysisId getId() {
      return id;
   }

   @Id
   @Column(name = "id")
   public long getDatabaseId() {
      return databaseId;
   }

   @ManyToOne(targetEntity = SamplePropertyImpl.class, fetch = FetchType.LAZY)
   @JoinColumn(name = "sample_property_id")
   @Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
   public SampleProperty getSampleProperty() {
      return sampleProperty;
   }

   @ManyToOne(targetEntity = DatasetImpl.class, fetch = FetchType.LAZY)
   @JoinColumn(name = "dataset_id")
   @Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
   public Dataset getDataset() {
      return dataset;
   }

   @OneToMany(targetEntity = PredefinedClassImpl.class, fetch = FetchType.LAZY, mappedBy = "pk.analysis")
   @OrderBy(clause = "class_number DESC")
   @Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
   public List<PredefinedClass> getPredefinedClasses() {
      return predefinedClasses;
   }

   @ManyToMany(targetEntity = AnalysisPropertyValueImpl.class, fetch = FetchType.LAZY)
   @JoinTable(name = "analysis_property_assign", joinColumns = @JoinColumn(name = "analysis_id"), inverseJoinColumns = @JoinColumn(name = "analysis_property_value_id"))
   @Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
   public Collection<AnalysisPropertyValue> getAnalysisPropertyValues() {
      return propertyValues;
   }

   public void setDatabaseId(long databaseId) {
      this.databaseId = databaseId;
      this.id = AnalysisId.instance(databaseId);
   }

   public void setDataset(Dataset dataset) {
      this.dataset = dataset;
   }

   public void setSampleProperty(SampleProperty sampleProperty) {
      this.sampleProperty = sampleProperty;
   }

   public void setPredefinedClasses(List<PredefinedClass> predefinedClasses) {
      this.predefinedClasses = predefinedClasses;
   }

   public void setEmbeddedTestType(TestType testType) {
      this.embeddedTestType = testType;
   }

   public void setAnalysisPropertyValues(Collection<AnalysisPropertyValue> propertyValues) {
      this.propertyValues = propertyValues;
   }

   @Override
   public boolean equals(Object obj) {
      if (!(obj instanceof AnalysisImpl)) {
         return false;
      }
      AnalysisImpl that = (AnalysisImpl) obj;
      return this.databaseId == that.databaseId;
   }

   @Override
   public int hashCode() {
      return (int) this.databaseId;
   }

}

Code:
@Entity
@Table(name = "feature")
@Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
public class FeatureImpl implements Feature, Serializable {

   private static final long serialVersionUID = -3073287800453222177L;

   public static final Feature NULL = new FeatureImpl();

   private long databaseId;
   private Reporter reporter;
   private Platform platform;
   private FeatureId id;

   public FeatureImpl() {
      this.id = FeatureId.NULL;
   }

   @Id
   @Column(name = "id")
   public long getDatabaseId() {
      return this.databaseId;
   }

   @Transient
   public FeatureId getId() {
      return this.id;
   }

   @ManyToOne(targetEntity = ReporterImpl.class, fetch = FetchType.LAZY)
   @JoinColumn(name = "reporter_id")
   public Reporter getReporter() {
      return this.reporter;
   }

   @ManyToOne(targetEntity = PlatformImpl.class, fetch = FetchType.LAZY)
   @JoinColumn(name = "platform_id")
   public Platform getPlatform() {
      return this.platform;
   }

   public void setDatabaseId(long databaseId) {
      this.databaseId = databaseId;
      this.id = FeatureId.instance(databaseId);
   }

   void setReporter(Reporter reporter) {
      this.reporter = reporter;
   }

   void setPlatform(Platform platform) {
      this.platform = platform;
   }

   @Override
   public boolean equals(Object obj) {
      if (!(obj instanceof FeatureImpl)) {
         return false;
      }
      FeatureImpl that = (FeatureImpl) obj;
      return (this.databaseId == that.databaseId && this.reporter.equals(that.reporter));
   }

   @Override
   public int hashCode() {
      return (int) this.databaseId;
   }

}

Code:
   <bean id="metadataSessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean" depends-on="cacheManager">
      <property name="dataSource" ref="metadataDataSource"/>
      <property name="configurationClass" value="org.hibernate.cfg.AnnotationConfiguration"/>
      <property name="annotatedClasses">
         <list>
            <value>com.compendiabio.oncomine.model.hibernate.AnalysisImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.AnalysisImpl$TestType</value>
            <value>com.compendiabio.oncomine.model.hibernate.AnalysisPropertyImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.AnalysisPropertyValueImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.DatasetImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.DatasetPropertyImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.DatasetPropertyValueImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.FeatureImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.GeneImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.PlatformImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.PredefinedClassImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.OntologyCategoryImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.OntologyPropertyImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.OntologyPropertyImpl$PropertyType</value>
            <value>com.compendiabio.oncomine.model.hibernate.OntologyPropertyValueImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.OntologyPropertyValueSynonymImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.ReporterImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.SampleImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.SamplePropertyImpl</value>
            <value>com.compendiabio.oncomine.model.hibernate.SamplePropertyImpl$SamplePropertyType</value>
            <value>com.compendiabio.oncomine.model.hibernate.SamplePropertyValueImpl</value>
         </list>
      </property>
      <property name="hibernateProperties">
         <props>
            <prop key="hibernate.dialect">${hibernate.dialect}</prop>
            <prop key="hibernate.show_sql">${hibernate.show_sql}</prop>
            <prop key="hibernate.connection.pool_size">${hibernate.connection.pool_size}</prop>
            <prop key="hibernate.jdbc.batch_size">${hibernate.jdbc.batch_size}</prop>
            <prop key="hibernate.use_outer_join">${hibernate.use_outer_join}</prop>
            <prop key="hibernate.bytecode.use_reflection_optimizer">${hibernate.bytecode.use_reflection_optimizer}</prop>
            <prop key="hibernate.cache.provider_class">${hibernate.cache.provider_class}</prop>
            <prop key="hibernate.cache.use_second_level_cache">${hibernate.cache.use_second_level_cache}</prop>
            <prop key="hibernate.cache.use_query_cache">${hibernate.cache.use_query_cache}</prop>
            <prop key="hibernate.cache.use_structured_entries">${hibernate.cache.use_structured_entries}</prop>
            <prop key="hibernate.generate_statistics">${hibernate.generate_statistics}</prop>
            <prop key="hibernate.query.substitutions">${hibernate.query.substitutions}</prop>
         </props>
      </property>
      <property name="schemaUpdate" value="false"/>
   </bean>

with the following properties:

Code:
hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect
hibernate.show_sql=false
hibernate.connection.pool_size=5
hibernate.jdbc.batch_size=0
hibernate.use_outer_join=true
hibernate.bytecode.use_reflection_optimizer=true
hibernate.cache.provider_class=net.sf.ehcache.hibernate.SingletonEhCacheProvider
hibernate.cache.use_second_level_cache=true
hibernate.cache.use_query_cache=true
hibernate.cache.use_structured_entries=true
hibernate.generate_statistics=true
hibernate.query.substitutions=true 'Y', false 'N'
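
I have not included my ehcache.xml since I don't think it is the culprit, but the regions are configured along these lines (element counts here are illustrative, not my exact values):

Code:
<ehcache>
   <diskStore path="java.io.tmpdir"/>
   <defaultCache maxElementsInMemory="10000"
                 eternal="false"
                 timeToIdleSeconds="300"
                 timeToLiveSeconds="600"
                 overflowToDisk="false"/>
   <!-- read-only entities are held eternally in memory -->
   <cache name="com.compendiabio.oncomine.model.hibernate.AnalysisImpl"
          maxElementsInMemory="5000" eternal="true" overflowToDisk="false"/>
   <cache name="com.compendiabio.oncomine.model.hibernate.FeatureImpl"
          maxElementsInMemory="5000" eternal="true" overflowToDisk="false"/>
</ehcache>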



Code between sessionFactory.openSession() and session.close():
Code:
   public void testLoadObjects() throws Exception {
      long[] analysisIds = getAnalysisIds();
      int analysisCount = analysisIds.length;
      long[] featureIds = getFeatureIds();
      int featureCount = featureIds.length;
      System.out.println("Analysis Count = " + analysisCount);
      System.out.println("Feature Count = " + featureCount);

      Session session = this.metadataSessionFactory.openSession();

      printHibernateStats("Before:");

      TimingUtility timer = TimingUtility.instance(getName());

      fetchValues(session, analysisIds, featureIds, "Fetch #1");
      timer.timing("fetchValues 1");

      clearFirstLevelCache(session);

      fetchValues(session, analysisIds, featureIds, "Fetch #2");
      timer.timing("fetchValues 2");

      clearFirstLevelCache(session);

      fetchValues(session, analysisIds, featureIds, "Fetch #3");
      timer.timing("fetchValues 3");

      session.close();

      printHibernateStats("After:");

      timer.reportTimings(System.out);
   }

   @SuppressWarnings("unchecked")
   private void fetchValues(Session session, long[] analysisIds, long[] featureIds, String message) {
      int featureCount = featureIds.length;
      Set<Dataset> datasets = new HashSet<Dataset>();
      Set<Feature> features = new HashSet<Feature>();
      for (int i = 0; i < analysisIds.length; i++) {
         Feature feature = (Feature) session.get(FeatureImpl.class, featureIds[i % featureCount]);
         features.add(feature);
         Analysis analysis = (Analysis) session.get(AnalysisImpl.class, analysisIds[i]);
         datasets.add(analysis.getDataset());
      }

      for (Dataset dataset : datasets) {
         for (Sample sample : dataset.getSamples()) {
            for (SamplePropertyValue value : sample.getSamplePropertyValues()) {
               value.getSampleProperty();
            }
         }
      }
      printHibernateStats(message);
   }

   private void clearFirstLevelCache(Session session) {
      // clearFirstLevelCache is a boolean flag on the test class,
      // toggled between the two runs shown in the output below
      if (clearFirstLevelCache) {
         session.flush();
         session.clear();
      }
   }
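
printHibernateStats and TimingUtility are just small test helpers; printHibernateStats is essentially a dump of the SessionFactory statistics, along these lines:

Code:
   private void printHibernateStats(String message) {
      // org.hibernate.stat.Statistics; requires hibernate.generate_statistics=true (enabled above)
      Statistics stats = this.metadataSessionFactory.getStatistics();
      System.out.println(message);
      System.out.println("****** HIBERNATE STATISTICS *******");
      System.out.println("connectCount = " + stats.getConnectCount());
      System.out.println("sessionOpen  = " + stats.getSessionOpenCount());
      System.out.println("sessionClose = " + stats.getSessionCloseCount());
      System.out.println("cachePut     = " + stats.getSecondLevelCachePutCount());
      System.out.println("cacheHit     = " + stats.getSecondLevelCacheHitCount());
      System.out.println("cacheMiss    = " + stats.getSecondLevelCacheMissCount());
      System.out.println("***********************************");
   }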


Full stack trace of any exception that occurs:
No exceptions are generated.

Name and version of the database you are using:
MySQL 5.0.51a-community-nt

Output from test:

When run with clearFirstLevelCache set to false, I see:
Code:
Available memory: 252M / 266M (532M max)
Analysis Count = 1859
Feature Count = 281
Before:
****** HIBERNATE STATISTICS *******
connectCount = 0
sessionOpen  = 1
sessionClose = 0
cachePut     = 0
cacheHit     = 0
cacheMiss    = 0
***********************************
Fetch #1
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 0
cachePut     = 73955
cacheHit     = 0
cacheMiss    = 28022
***********************************
Fetch #2
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 0
cachePut     = 73955
cacheHit     = 0
cacheMiss    = 28022
***********************************
Fetch #3
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 0
cachePut     = 73955
cacheHit     = 0
cacheMiss    = 28022
***********************************
After:
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 1
cachePut     = 73955
cacheHit     = 0
cacheMiss    = 28022
***********************************
Timings for testLoadObjects [59,937 ms]:
   fetchValues 1:   21,891
   fetchValues 2:   19,062
   fetchValues 3:   18,984
Available memory: 186M / 266M (532M max)


When run with clearFirstLevelCache set to true, I get:
Code:
Available memory: 252M / 266M (532M max)
Analysis Count = 1859
Feature Count = 281
Before:
****** HIBERNATE STATISTICS *******
connectCount = 0
sessionOpen  = 1
sessionClose = 0
cachePut     = 0
cacheHit     = 0
cacheMiss    = 0
***********************************
Fetch #1
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 0
cachePut     = 73955
cacheHit     = 0
cacheMiss    = 28022
***********************************
Fetch #2
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 0
cachePut     = 73955
cacheHit     = 73955
cacheMiss    = 28022
***********************************
Fetch #3
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 0
cachePut     = 73955
cacheHit     = 147910
cacheMiss    = 28022
***********************************
After:
****** HIBERNATE STATISTICS *******
connectCount = 1
sessionOpen  = 1
sessionClose = 1
cachePut     = 73955
cacheHit     = 147910
cacheMiss    = 28022
***********************************
Timings for testLoadObjects [32,828 ms]:
   fetchValues 1:   21,625
   fetchValues 2:   5,875
   fetchValues 3:   5,328
Available memory: 137M / 266M (532M max)


Any guidance or answers to the above questions would be welcome! I have done quite a bit of googling and searching of the forums and have not found the same situation described.

jeff

