11

I've got a problem showing data from a database. If I update an object, sometimes I get the old data and sometimes the new. The update function works well (I can see the correct change in the DB), while the read function seems to return cached data. I've tried disabling both caches, and tried opening and closing the session during update/save, but it's still not working.
Both the User and Store beans use lazy fetching.

READ FUNCTION:

public static List<Store> getStoreByUser(User user)
        throws HibernateException {
    List<Store> result = null;
    Session session = sessionFactory.getCurrentSession();
    Transaction transaction = null;
    try {
        transaction = session.getTransaction();
        Criteria criteria = session.createCriteria(Store.class);
        criteria.add(Restrictions.eq("userUserID", user));
        result = criteria.list();
    } catch (HibernateException he) {
        logger.error("No Store found for user: " + user, he);
        throw he;
    }
    return result;
}

UPDATE/SAVE FUNCTION:

public static Integer insertOrUpdateStore(Store store)
        throws HibernateException {
    Integer id = null;
    Session session = sessionFactory.getCurrentSession();
    Transaction transaction = null;
    try {
        transaction = session.getTransaction();
        if (store.getStoreID() != null && store.getStoreID() != 0) {
            session.merge(store);
        } else {
            id = (Integer) session.save(store);
        }
        transaction.commit();
    } catch (HibernateException he) {
        if (transaction != null) {
            transaction.rollback();
        }
        throw he;
    }
    return id;
}
greybeard
Gore

4 Answers

6

I have the same problem: my `select *` query returns old data. I have turned off the second-level cache in my hibernate.cfg.xml file like this:

<property name="hibernate.cache.use_second_level_cache">false</property>
<property name="hibernate.cache.use_query_cache">false</property>
<property name="hibernate.c3p0.max_statements">0</property>

I tried adding session.flush() or session.clear() before/after transaction.commit(), but it didn't give a positive result.

mario
5

Usually, you have the isolation level "read committed". This lets your transaction see changes that have been committed by other transactions. The isolation level is implemented by the underlying DBMS, not by Hibernate.

You can't disable the first-level cache (except perhaps by using a stateless session, which shouldn't be used for general purposes). When executing a query, Hibernate always returns the value from the cache when it is found there, to ensure that you don't get the same DB record more than once in memory.

If this is an issue for you, you should switch to a higher isolation level, for instance repeatable read (which means what it says: when reading the same data several times, you always get the same result). There is still the chance of seeing changes from other transactions. Using isolation level serializable, you shouldn't have this kind of issue anymore.

Note: switching to another isolation level is a breaking change to most systems and should be planned carefully.
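
As a concrete illustration (this snippet is not from the original answer): plain Hibernate lets you set the JDBC isolation level globally via the standard hibernate.connection.isolation property in hibernate.cfg.xml. Its value is one of the java.sql.Connection constants.

```xml
<!-- hibernate.cfg.xml: JDBC transaction isolation level.
     1 = READ_UNCOMMITTED, 2 = READ_COMMITTED,
     4 = REPEATABLE_READ,  8 = SERIALIZABLE -->
<property name="hibernate.connection.isolation">4</property>
```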

Stefan Steinegger
  • I don't know what my default isolation value was, but I've switched it to "read committed" by adding `<property name="hibernate.connection.isolation">2</property>` in hibernate.cfg.xml, and now it works!! I've spent so much time finding a solution; your explanation was so clear! Thanks a lot Stefan! – Gore Aug 20 '12 at 12:59
  • After a few tries I've noticed that dirty data is still present, and sometimes it pops up... I probably need a fresh working example, because I don't know what I'm doing wrong. – Gore Aug 21 '12 at 10:24
  • Read my answer carefully: when using "read committed", it can still happen. You need to raise the isolation level. – Stefan Steinegger Aug 21 '12 at 11:34
  • If I set level 4 or 8 (serializable) I get an error: org.hibernate.exception.GenericJDBCException: Lock wait timeout exceeded; try restarting transaction. It's probably a deadlock. I think it's weird that I have this kind of issue; probably there's something wrong in how I read or update data. What about session and transaction? Should I open/close them every time I update/read? – Gore Aug 21 '12 at 13:04
  • You should keep the session and also the transaction for the whole business process (which should still be as short as possible). There must be a lot running in parallel on the same data. Timeouts and deadlocks depend on the implementation of serializable in the underlying DBMS: e.g. SQL Server does a lot of locking, Oracle is much better. I wonder what you are doing in these parallel transactions ... – Stefan Steinegger Aug 21 '12 at 13:15
  • So my update code should be OK? And I cannot use value 8 (serializable). Setting other values, I still get dirty data :/ – Gore Aug 22 '12 at 08:23
  • Helpful reference regarding isolation level, how to change it and what impact it may cause: [Transaction Isolation Level](http://stackoverflow.com/questions/16162357/transaction-isolation-levels-relation-with-locks-on-table) – Panini Luncher Nov 04 '14 at 19:39
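
For reference on the numeric values (2, 4, 8) mentioned in these comments: they are the standard JDBC isolation constants defined on java.sql.Connection, which is where hibernate.connection.isolation takes its values from. A minimal check:

```java
import java.sql.Connection;

public class IsolationLevels {
    public static void main(String[] args) {
        // The numeric values used for hibernate.connection.isolation are
        // the standard JDBC constants from java.sql.Connection:
        System.out.println("READ_UNCOMMITTED = " + Connection.TRANSACTION_READ_UNCOMMITTED); // 1
        System.out.println("READ_COMMITTED   = " + Connection.TRANSACTION_READ_COMMITTED);   // 2
        System.out.println("REPEATABLE_READ  = " + Connection.TRANSACTION_REPEATABLE_READ);  // 4
        System.out.println("SERIALIZABLE     = " + Connection.TRANSACTION_SERIALIZABLE);     // 8
    }
}
```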
4

You can open a new session in order to get "fresh" (updated from the database) data without old entities from the session cache. In the example below you can see how an entity is queried from the database. You can also use the same mechanism to return the entity instead of a boolean value, or call session.Refresh() (from your current session, of course) to reload the latest changes from the database:

/// <summary>
/// Gets whether an item exists in the database.
/// </summary>
/// <param name="Id">Item ID to check for existence in the database</param>
/// <returns>True if the <paramref name="Id"/> exists in the database, otherwise false.</returns>
public bool Exists<T>(object Id)
{
    using (var session = NHibernateSessionHelper.OpenSession())
    {
        using (var transaction = session.BeginTransaction())
        {
            // if the item is new, the entity will be null and does not exist
            return session.Get<T>(Id) != null;
            // you could also return the entire table from the database, or filter:
            //session.CreateCriteria<T>().List<T>();
        }
    }
}

public void Refresh(object entity)
{
    // reload the latest changes from the database (changes by other users/sessions, manual edits, etc.)
    NHibernateSessionHelper.CurrentSession.Refresh(entity);
}

I hope this helps you.

Carlos
  • In some scenarios, this one is not working for me. I already create a new session with every operation and close it afterwards, and I print the session to see for sure that it's all empty. – kommradHomer Jan 18 '21 at 07:59
3

You may evict old data to ensure that it is fetched from the DB and not from the L1 and L2 caches.

Also, you have to make sure you are not at the REPEATABLE_READ isolation level. In this isolation mode, it is guaranteed that two reads within the same transaction will always return the same result. Since the isolation level takes precedence over your cache eviction, the cache evict will not have any visible effect.

There is a way to work around this behavior: declare your transaction isolation level as READ_UNCOMMITTED or READ_COMMITTED.
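
A sketch of what such eviction could look like, using the session and sessionFactory from the question (the Store entity and storeID come from there; the calls are standard Hibernate 3.x API, shown here as a fragment rather than a complete program):

```java
// Evict a single entity instance from the first-level (session) cache,
// then re-read it from the database:
session.evict(store);
Store fresh = (Store) session.get(Store.class, store.getStoreID());

// Or refresh the managed instance in place:
session.refresh(store);

// Evict all cached Store entries from the second-level cache:
sessionFactory.evict(Store.class);
```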

SilverlightFox
  • "two reads within the same transaction will always return the same result". This is a bit misleading. `REPEATABLE READ` **will** show changed data as long as it was made within the same transaction. – Zyl Dec 15 '21 at 13:44