
I have a Redis cluster with `maxmemory_human: 6.05G` and `used_memory_human: 4.62M`.

I want to fill up this `used_memory_human` with dummy data so that I have 2G of `used_memory_human`.

How can I do that?

The One

2 Answers


There's a built-in debug command for that.

debug populate 2000000 testkey 1000

This will create 2 million string keys, each with a 1,000-byte value (about 2 GB of raw data).

> debug populate 2000000 testkey 1000
OK
(2.52s)
> scan 0
1) "65536"
2)  1) "testkey:1637732"
    2) "testkey:510112"
    3) "testkey:1313139"
    4) "testkey:34729"
    5) "testkey:734989"
    6) "testkey:996052"
    7) "testkey:223126"
    8) "testkey:1578003"
    9) "testkey:1335698"
   10) "testkey:1151100"
> info memory
# Memory
used_memory:2185489192
used_memory_human:2.04G
used_memory_rss:2247540736
used_memory_rss_human:2.09G
used_memory_peak:2185571088
used_memory_peak_human:2.04G
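As a sanity check on the numbers above: `used_memory` of 2,185,489,192 bytes across 2,000,000 keys works out to roughly 1,093 bytes per key, i.e. the 1,000-byte value plus ~93 bytes of per-key overhead. A quick estimate of how many keys reach a target size (the overhead figure is inferred from the `INFO` output above and varies by Redis version and encoding):

```python
# Estimate how many DEBUG POPULATE keys are needed to reach a target used_memory.
target = 2 * 1024**3          # 2 GiB target
value_size = 1000             # value length passed to DEBUG POPULATE
overhead = 93                 # assumed per-key overhead, inferred from the INFO output above
keys_needed = target // (value_size + overhead)
print(keys_needed)            # just under 2 million keys
```

This matches the transcript: populating 2,000,000 keys lands slightly above the 2G target, at 2.04G.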
nnog
  • Is there a matching command to clean it up, without flushing the whole database? – Imaskar Jun 08 '18 at 12:32
  • Ah, I think your server version must be quite old. You can run `INFO` command to check. – nnog Jun 11 '18 at 09:03
  • # Server redis_version:3.2.4 – The One Jun 12 '18 at 03:09
  • I went to the github tag, and version 3.2.4 has the `populate` subcommand of `DEBUG`, so I'm not sure where your ERR is coming from. – nnog Jun 12 '18 at 16:32
  • @Imaskar Deleting keys by pattern is answered here, you can use this if you don't want to `flushall`: https://stackoverflow.com/questions/4006324/how-to-atomically-delete-keys-matching-a-pattern-using-redis – nnog Jun 12 '18 at 16:36
  • @nnog yeah, I know how to do that. But debug populate is much faster than iterating, so I wondered if there is a matching fast deletion. – Imaskar Jun 12 '18 at 16:42
  • @Imaskar Not that I know of. You could populate a different empty DB (`SELECT`) then do `FLUSHDB` if you want to quickly test growing and shrinking. – nnog Jun 12 '18 at 16:54
  • @The One Just a thought: `DEBUG` and `CONFIG` etc. commands are often renamed to disable them on managed instances, so this is probably why you can't use it on your 3.2.4 instance – nnog Jun 12 '18 at 16:55
  • Got it. Thanks @nnog – The One Jun 18 '18 at 01:13
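Following up on the deletion discussion in the comments: one way to remove just the generated keys without `flushall` is to stream matches with `redis-cli --scan` and pipe them into `UNLINK`. A sketch, assuming a local instance and the `testkey:` prefix used above:

```shell
# Stream keys matching the test prefix and delete them in batches of 100.
# UNLINK (Redis >= 4.0) reclaims memory in a background thread; use DEL on older servers.
redis-cli --scan --pattern 'testkey:*' | xargs -L 100 redis-cli unlink
```

This still iterates the keyspace, so it is much slower than the populate step; as noted in the comments, populating a separate DB and running `FLUSHDB` remains the fastest way to grow and shrink repeatedly.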

Populate

eval "for i=0,(1024*1024*20) do redis.call('set','testData:'..i,'1234567890') end" 0

used_memory_human:1.81G

Clean

eval "for i=0,(1024*1024*20) do redis.call('del','testData:'..i) end" 0

used_memory_human:574.41K
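For scale: the loop above writes about 21 million 10-byte values (~200 MB of raw data), yet `used_memory` grows to 1.81G, i.e. roughly 90 bytes of per-key overhead (key name, hash-table entry, object header). A rough check, using the figures reported above:

```python
keys = 1024 * 1024 * 20 + 1      # the Lua loop runs i = 0 .. 20*1024*1024 inclusive
raw = keys * 10                  # each value is the 10-byte string '1234567890'
used = int(1.81 * 1024**3)       # used_memory_human reported after populating
per_key = used / keys            # total bytes consumed per key, value included
print(round(per_key))            # roughly 93 bytes per key
```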

Imaskar
  • It runs for about 45 sec on my machine. – Imaskar Jun 08 '18 at 08:43
  • 1
    Note: this is only valid for non-clustered deployments – Itamar Haber Jun 08 '18 at 11:17
  • How to keep these cache data for like 5 hours? – The One Jun 11 '18 at 06:21
  • @TheOne Change the call part to `redis.call('setex','testData:'..i,5*60*60,'1234567890')` – Imaskar Jun 11 '18 at 06:49
  • Hey @Imaskar, somehow I got an error after executing the command. It said: Connection refused. The odd thing is: I tried this command about 50 times, and it only worked once. – The One Jun 12 '18 at 06:06
  • I meant to modify eval with this call. It would be `eval "for i=0,(1024*1024*20) do redis.call('setex','testData:'..i,5*60*60,'1234567890') end" 0` – Imaskar Jun 12 '18 at 15:08
  • There is also a limit on how long Lua can execute before it times out. When you get the error, do you wait a long time before it appears? If so, run `config set lua-time-limit 120000` – Imaskar Jun 12 '18 at 20:16
  • @Imaskar yup, of course I added `end" 0` at the end of `redis.call('setex','testData:'..i,5*60*60,'1234567890')`, so it was exactly the same as your command above. I'll give the lua-time-limit change a try. – The One Jun 13 '18 at 03:39