
I have successfully implemented a combination of Redis and MySQL. In one section of my application I thought I would reduce the load on the MySQL server and serve the data from Redis until it changes. However, I observe that fetching the same data from MySQL is still faster than fetching it from Redis.

Here is the scenario.

User1: 10,000 records, with rarely more than a single one-off change in a day or so.

What I do is save the whole object holding these 10K records (a serialized object of about 20 MB) to Redis. The idea is that the subsequent 100 to 1,000 requests will just be page refreshes, so why not avoid MySQL hits and get this data from Redis?
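A minimal sketch of the pattern just described, assuming the phpredis extension and PDO (the cache key, DSN, query, and credentials are illustrative, not from the real application):

```php
<?php
// Hypothetical names throughout: cache key, DSN, and query are illustrative.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$cacheKey = 'user1:records';

$payload = $redis->get($cacheKey);
if ($payload === false) {
    // Cache miss: pull all 10K rows from MySQL once...
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $rows = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);

    // ...and store the whole result set as one ~20 MB serialized string.
    $redis->set($cacheKey, serialize($rows));
} else {
    $rows = unserialize($payload);
}
```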

However, I have observed that fetching this object from Redis takes more time than flushing Redis and hitting the MySQL query. I thought Redis would be faster, or at least match MySQL's speed, but here it is the opposite.
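For what it's worth, a rough way to time the two paths, reusing the hypothetical connections from the sketch above:

```php
<?php
// Redis path: pays for shipping ~20 MB over the socket plus unserialize().
$start = microtime(true);
$rows  = unserialize($redis->get('user1:records'));
printf("redis: %.3f s\n", microtime(true) - $start);

// MySQL path: the server streams rows and PHP builds the array incrementally.
$start = microtime(true);
$rows  = $pdo->query('SELECT * FROM records')->fetchAll(PDO::FETCH_ASSOC);
printf("mysql: %.3f s\n", microtime(true) - $start);
```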

Also, I have observed that while fetching the 20 MB object from Redis, PHP gives an "allowed memory size exhausted" error (which I know how to fix), but no such error occurs when the same data is fetched from MySQL.

Could it be that Redis is not fit for caching huge objects? Or is there something else?

Thanks

  • So you store all 10K records in one object; then when you want the data, you fetch this object, unserialize it, and extract the records you want. This is probably why it takes so long. Have you tried storing each record as a separate item in Redis instead? – Nigel Ren Jun 18 '20 at 10:37
  • I thought about that, but I have these table relationships and felt it would overcomplicate things enormously. Or am I wrong in thinking that? – Kashif Jun 18 '20 at 10:52

1 Answer


Redis is not designed for storing and fetching large objects. Getting and setting them will cost you in network bandwidth, and since Redis is single-threaded, every other request will also suffer while one of these large objects is being transferred.

As stated in the benchmark documentation:

Speed of RAM and memory bandwidth seem less critical for global performance especially for small objects. For large objects (>10 KB), it may become noticeable though. Usually, it is not really cost-effective to buy expensive fast memory modules to optimize Redis.

What you can do is redesign your cache layer with hashes or lists, depending on the query requirements of your application. You may check here for additional information. I also answered a similar question here.
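A minimal sketch of what that redesign could look like with phpredis, storing one small hash per record plus a list of ids (all key and field names are illustrative, and `$rows` is assumed to hold the joined result set from MySQL):

```php
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// Write path: one small hash per record instead of one 20 MB blob.
$redis->del('record:ids');                 // rebuild the id index from scratch
foreach ($rows as $row) {
    $redis->hMSet('record:' . $row['id'], [
        'name' => $row['product_name'],    // joined-in product name
        'type' => $row['product_type'],    // joined-in product type
    ]);
    $redis->rPush('record:ids', $row['id']);
}

// Read path: a page refresh fetches only the records it actually renders.
$ids     = $redis->lRange('record:ids', 0, 49);   // first 50 ids
$records = [];
foreach ($ids as $id) {
    $records[$id] = $redis->hGetAll('record:' . $id);
}
```

This keeps each Redis reply small, so neither the network transfer nor PHP's memory limit has to absorb the whole 20 MB at once.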

  • Thanks. I will go through these links now. The problem with storing these 10K items separately is that the table depends on two other tables for the product name and product type, so I thought the easiest approach would be to store the whole object at once. – Kashif Jun 18 '20 at 10:54
  • @Kashif if you are using Eloquent's caching then the size would be huge, but if you store JSON it will be smaller; you can also drop "unused" columns you don't need. If the data is going to be used on the front end, pagination would help, and you can paginate straight from Redis (sorted sets can be helpful; see the sketch below). – Ersoy Jun 18 '20 at 10:59
  • Thank you for your great insight and eloquent answers. :) – Kashif Jun 20 '20 at 14:28
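A sketch of the pagination idea from that comment, assuming the record id can double as the sorted-set score (key names are illustrative):

```php
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// Index each record as JSON in a sorted set, scored by its id.
foreach ($rows as $row) {
    $redis->zAdd('records:by_id', $row['id'], json_encode($row));
}

// Serve page 3 (50 records per page) entirely from Redis.
$perPage = 50;
$page    = 3;
$start   = ($page - 1) * $perPage;
$members = $redis->zRange('records:by_id', $start, $start + $perPage - 1);
$records = array_map('json_decode', $members);
```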