
I have an ASP.NET e-commerce application. It gets heavy traffic, so I have to use a caching system. I switched my cache mechanism from in-memory (memcache) to Redis (a distributed cache) because I have five servers.

There was no problem with memcache, but now, after switching to Redis, I have a problem. For example, I serialize a list and it takes about 4-5 MB; when I read it back (even from redis-cli) the fetch is very slow, about 5 seconds, and my application therefore raises a timeout exception.

I have read that you should not store big data in Redis. On the other hand, I also use Redis for MVC output caching and session caching, and there is no problem there. Should I use BinarySerializer, or something else? How can I store big objects in the Redis cache?

Thanks in advance.

    public List<ProdFilter> GetAll(int prodFeatureGroupId)
    {
        var cacheKey = string.Format("{0}_{1}_{2}",
                            "ProductRepository",
                            "GetAll",
                            prodFeatureGroupId);
        // Check the flag first so a disabled cache skips the Redis round trip.
        var isExists = Const.CacheIsActive && _cache.Contains(cacheKey);

        List<ProdFilter> obj;
        if (isExists)
        {
            obj = _cache.Get<List<ProdFilter>>(cacheKey);
        }
        else
        {
            obj = _db.ProdFilters
                    .Where(x => x.ProdFilterGroupId == prodFeatureGroupId)
                    .ToList();
            _cache.Add(cacheKey, obj);
        }

        return obj;
    }

    //my cache methods
    private void Create()
    {
        if (null == _db)
        {
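            // Note: StackExchange.Redis recommends creating a single ConnectionMultiplexer
            // and sharing it for the application's lifetime (e.g. via a static
            // Lazy<ConnectionMultiplexer>); reconnecting per instance adds latency.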
            ConfigurationOptions option = new ConfigurationOptions();
            option.Ssl = false;
            option.EndPoints.Add(_host, _port);
            var connect = ConnectionMultiplexer.Connect(option);

            _db = connect.GetDatabase();
        }
    }

    public void Add<T>(string key, T obj)
    {
        var data = Serialize(obj);
        _db.StringSet(key, data);
    }

    public void Set<T>(string key, T obj)
    {
        Add(key, obj);
    }

    public T Get<T>(string key)
    {
        var val = _db.StringGet(key);

        return Deserialize<T>(val);
    }
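
    // A possible single-round-trip variant (a hypothetical helper, sketched
    // against the same Serialize/Deserialize pair as above): StringGet returns
    // RedisValue.Null for a missing key, so the existence check and the fetch
    // collapse into one call instead of the two round trips of Contains + Get.
    public bool TryGet<T>(string key, out T value)
    {
        var val = _db.StringGet(key);
        if (val.IsNullOrEmpty)
        {
            value = default(T);
            return false;
        }

        value = Deserialize<T>(val);
        return true;
    }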

    // my serialize helper
    public static byte[] Serialize(object o)
    {
        if (o == null)
        {
            return null;
        }

        BinaryFormatter binaryFormatter = new BinaryFormatter();

        using (MemoryStream memoryStream = new MemoryStream())
        {
            binaryFormatter.Serialize(memoryStream, o);
            byte[] objectDataAsStream = memoryStream.ToArray();
            return objectDataAsStream;
        }
    }

    public static T Deserialize<T>(byte[] stream)
    {
        if (stream == null)
            return (default(T));

        BinaryFormatter binaryFormatter = new BinaryFormatter();

        using (MemoryStream memoryStream = new MemoryStream(stream))
        {
            T result = (T)binaryFormatter.Deserialize(memoryStream);
            return result;
        }
    }
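
If the serialized list has to remain in a single string key, compressing the BinaryFormatter output before `StringSet` is one way to shrink the payload and the transfer time. A minimal sketch, assuming `System.IO.Compression` is available (the `SerializeCompressed`/`DeserializeCompressed` names are illustrative):

    public static byte[] SerializeCompressed(object o)
    {
        if (o == null)
            return null;

        using (var output = new MemoryStream())
        {
            // leaveOpen: true so the MemoryStream survives the gzip dispose;
            // BinaryFormatter's repetitive type metadata compresses well.
            using (var gzip = new GZipStream(output, CompressionMode.Compress, true))
            {
                new BinaryFormatter().Serialize(gzip, o);
            }
            return output.ToArray();
        }
    }

    public static T DeserializeCompressed<T>(byte[] data)
    {
        if (data == null)
            return default(T);

        using (var input = new MemoryStream(data))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        {
            return (T)new BinaryFormatter().Deserialize(gzip);
        }
    }

(Note that newer .NET versions obsolete BinaryFormatter for security reasons, so a JSON or protobuf serializer would be the longer-term choice.)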
Yargicx
  • Can you provide the source for `do not store big data in redis`? Do you use the list type to store your object? Can you provide details about the object? – Ersoy Apr 25 '20 at 11:39
  • https://stackoverflow.com/a/41551058/3439554 – Yargicx Apr 25 '20 at 12:36
  • Yes, I use a list-type object. My entity has four properties: ProdFilterId (int), ProdFilterName (string), ProdFilterGroupId (int), IsSefLinkFilter (bool). And my list has about 300-500 items. – Yargicx Apr 25 '20 at 12:39
  • So it is not `big data in redis` but `large objects in redis`. What I would suggest is that you take a look at the hash data type to store each single object, using its properties as hash fields and their values as hash values (see the sketch after this thread). May I ask why you preferred a list for your objects? It would be great if you updated the post and provided example data for the list. – Ersoy Apr 25 '20 at 12:48
  • Large objects are discouraged because they are a serious problem for the network. You have to fetch the whole 5 MB object just to read one identifier, and you have to serialize/deserialize it; while updating, you have to send the whole object back. The hash data structure can give you a more dynamic, network-friendly, code-friendly, memory-friendly solution. – Ersoy Apr 25 '20 at 12:51
  • I've updated my question. I was still using a list object because I had been using it with the memory cache without any problem, and I thought I could use the same method with Redis. I think I have to change my method for Redis, as you say. – Yargicx Apr 25 '20 at 13:04
  • Honestly, I didn't want to spend time changing my DAL's code. I wanted to use it the same way as before, when I was using memcache. – Yargicx Apr 25 '20 at 13:05
  • As far as I can see, you are using the `string` data type to store serialized lists. There are keys (defined as strings) such as `ProductRepository_GetAll_1`, `ProductRepository_GetAll_n`, and each key contains a serialized `list` of 300~500 elements with a total size of 5 MB. You `get` the whole serialized string and parse it. The solutions I can suggest require a change of data type, meaning you would need to switch your string-related commands to sorted set/set/hash-related commands. – Ersoy Apr 25 '20 at 13:24
  • Your session or output cache's return values may not be as big as your serialized list values. That's the main problem: you wait for the network to transfer the full 5 MB of value whenever you execute a `get` command. `set` works the same way; you wait for your 5 MB to reach the Redis server. – Ersoy Apr 25 '20 at 13:27
  • I see, thanks a lot for your help. I'll change my code as you say. I think this will solve my problem. Thanks again. – Yargicx Apr 25 '20 at 14:25
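
To make Ersoy's suggestion concrete, here is a minimal sketch of the hash-based layout, assuming the same StackExchange.Redis `_db` as above and the ProdFilter shape described in the comments (the `prodfilter:`/`prodfiltergroup:` key scheme and the method names are illustrative): each filter becomes one hash, and a per-group set indexes the member ids, so a single filter can be read or updated without moving a 5 MB blob.

    // Requires System.Linq and StackExchange.Redis.
    public void AddFilter(ProdFilter f)
    {
        // One hash per filter: properties as hash fields instead of one blob.
        _db.HashSet("prodfilter:" + f.ProdFilterId, new[]
        {
            new HashEntry("ProdFilterName", f.ProdFilterName),
            new HashEntry("ProdFilterGroupId", f.ProdFilterGroupId),
            new HashEntry("IsSefLinkFilter", f.IsSefLinkFilter)
        });

        // Index the filter id under its group so a group can be listed cheaply.
        _db.SetAdd("prodfiltergroup:" + f.ProdFilterGroupId, f.ProdFilterId);
    }

    public ProdFilter GetFilter(int id)
    {
        var entries = _db.HashGetAll("prodfilter:" + id);
        if (entries.Length == 0)
            return null;

        var map = entries.ToDictionary(e => (string)e.Name, e => e.Value);
        return new ProdFilter
        {
            ProdFilterId = id,
            ProdFilterName = map["ProdFilterName"],
            ProdFilterGroupId = (int)map["ProdFilterGroupId"],
            IsSefLinkFilter = (bool)map["IsSefLinkFilter"]
        };
    }

    public List<ProdFilter> GetAllByGroup(int groupId)
    {
        // Only the ids travel over the network; each hash is fetched on demand
        // (a batch/pipeline would cut the per-item round trips further).
        return _db.SetMembers("prodfiltergroup:" + groupId)
                  .Select(id => GetFilter((int)id))
                  .ToList();
    }

Reads and writes now touch only the fields involved; with 300-500 filters per group, the per-item round trips in GetAllByGroup could be batched or pipelined if they become noticeable.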

0 Answers