
I have more than 15,000 POCO elements stored in a Redis List. I'm using ServiceStack in order to save and get them. However, I'm not pleased with the response times I get when loading them into a grid. As I read, it would be better to store these objects in a hash - but unfortunately I could not find any good example for my case :(
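
The closest I could put together for the hash approach is something like the sketch below (I'm not sure it's right for my case); it uses StoreAsHash/GetFromHash, which store each POCO as its own Redis hash keyed by Id, and the field values are just placeholders:

using (var redis = redisManager.GetClient())
{
    var booking = new BookingRequestModel { Id = 1 };

    // Writes the POCO as a Redis hash (roughly: HMSET urn:bookingrequestmodel:1 ...)
    redis.StoreAsHash(booking);

    // Reads it back by Id
    var loaded = redis.GetFromHash<BookingRequestModel>(1);
}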

This is the method I use in order to get them into my grid:

public IEnumerable<BookingRequestGridViewModel> GetAll()
{
    try
    {
        var redisManager = new RedisManagerPool(Global.RedisConnector);
        using (var redis = redisManager.GetClient())
        {
            var redisEntities = redis.As<BookingRequestModel>();
            var result = redisEntities.Lists["BookingRequests"].GetAll().Select(z => new BookingRequestGridViewModel
            {
                CreatedDate = z.CreatedDate,
                DropOffBranchName = z.DropOffBranch != null ? z.DropOffBranch.Name : string.Empty,
                DropOffDate = z.DropOffDate,
                DropOffLocationName = z.DropOffLocation != null ? z.DropOffLocation.Name : string.Empty,
                Id = z.Id.Value,
                Number = z.Number,
                PickupBranchName = z.PickUpBranch != null ? z.PickUpBranch.Name : string.Empty,
                PickUpDate = z.PickUpDate,
                PickupLocationName = z.PickUpLocation != null ? z.PickUpLocation.Name : string.Empty
            }).OrderBy(z => z.Id);
            return result;
        }
    }
    catch (Exception ex)
    {
        return null;
    }
}

Note that I use redisEntities.Lists["BookingRequests"].GetAll(), which is causing the performance issues. (I would like to use just redisEntities.Lists["BookingRequests"], but then I lose the latest updates from the grid after editing.)
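
For reference, this is roughly how an edit happens today (remove the old element, then add the updated one back); oldBooking and updatedBooking are placeholders for the element before and after the edit:

using (var redis = redisManager.GetClient())
{
    var redisEntities = redis.As<BookingRequestModel>();
    var list = redisEntities.Lists["BookingRequests"];

    // "Editing" = remove the old element and append the updated one.
    // The grid then shows stale data until the whole list is re-read.
    list.Remove(oldBooking);
    list.Add(updatedBooking);
}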

I would like to know if saving them into a list is a good approach, as it's very important for me to have a fast grid (paging currently takes 1 second, which is huge).

Please advise!

  • what is the *width* of the items, and how big is the network connection? basically, there are 3 places I would start looking here - a) is it bandwidth limited, i.e. are you saturated with 15k*{width}? if so... without changing serialization format, I don't see any options; b) is it latency limited, which would *mostly* only matter if this is a "chatty" API - but I'm *guessing* it is using `lrange`, in which case this shouldn't matter, or c) is it actually serialization performance over 15k elements that is the bottleneck? – Marc Gravell Apr 10 '19 at 08:09
  • random question: why do you need 15k elements in a list? that is almost never useful to users... and would some kind of delayed loading be useful, i.e. a list in "virtual mode", where you load in sub-ranges of the list as the user scrolls? – Marc Gravell Apr 10 '19 at 08:09
  • @Marc Gravell - because I'm planning to replace my SQL database data with Redis, and this is what I have after importing it into Redis. Also, each element is a JSON object with around 20 properties, some of them being other small JSON objects – user_1856538_ Apr 10 '19 at 08:12
  • k; now, I'm very familiar with redis, but I haven't used ServiceStack much (if at all); if you use `monitor` on the server, does it issue a single `lrange`? maybe `lrange BookingRequests 0 -1` ? (the point here being: to determine whether this is "b" from my previous list) – Marc Gravell Apr 10 '19 at 08:17
  • "LRANGE" "BookingRequests" "0" "-1" Indeed – user_1856538_ Apr 10 '19 at 08:22
  • great, that rules out "b" then, and leaves "bandwidth" (=network) or "serializer" (=CPU); can you try `debug object BookingRequests`? in particular, I'm hoping it is going to tell us a `serializedlength` that will give us a hint as to what volume of data we're talking about here – Marc Gravell Apr 10 '19 at 08:33
  • Value at:00007FE12A0B3C90 refcount:1 encoding:linkedlist serializedlength:10082666 lru:11381569 lru_seconds_idle:65 – user_1856538_ Apr 10 '19 at 08:39
  • that's great; that tells me that your data is going to be *in the region of* 10MiB over the network (not exactly the same number, but close enough); 10MiB is big enough to at least be cautious of bandwidth problems, but... it should usually be OK. Now: you say "I have now 1 second at paging which is huge" - can you clarify: do you mean "it takes 1 second to load the grid initially"? or do you mean "it takes 1 second every time the user scrolls"? If the latter... I have a hunch what the problem is; could you try adding `.ToList()` to the end of your query? – Marc Gravell Apr 10 '19 at 08:43
  • 1 second is the loading time for each page. The grid has pagination with 40 items per page, so each time I click on a page it takes 1 second, even after adding ToList() at the end of the query – user_1856538_ Apr 10 '19 at 08:48
  • if you're *loading* the page each time you click... why are you grabbing all 15k of them? that makes no sense... just load the page/range you need! If the problem is the sorting: then @mythz has made a good point about sorted sets that should be useful there – Marc Gravell Apr 10 '19 at 08:50
  • That's what I've done before, instead of using .GetAll(). But I had a problem: after editing, the record disappeared from my list. Editing in my case means removing the element and adding it back afterwards, but the list was not refreshed. I could not find a better way to edit an element in the list – user_1856538_ Apr 10 '19 at 08:55

1 Answer


Firstly, you should not create a new Redis Client Manager like RedisManagerPool each time; there should only be a singleton RedisManagerPool instance in your App from which all clients are resolved.
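
A minimal sketch of what that can look like (assuming an app-level static holder named RedisConfig here; adapt this to your IOC if you use one):

public static class RedisConfig
{
    // Single shared pool for the whole App; created once at startup.
    public static IRedisClientsManager RedisManager { get; } =
        new RedisManagerPool(Global.RedisConnector);
}

// Then resolve clients from the shared pool wherever you need them:
using (var redis = RedisConfig.RedisManager.GetClient())
{
    var redisEntities = redis.As<BookingRequestModel>();
    // ...
}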

But otherwise I would rethink your data access strategy: downloading 15K items in a batch is not an ideal strategy. You can create indexes by storing ids in Sets, or you could store items in a Sorted Set with a value that you can page against, like an incrementing id, e.g:

var redisEntities = redis.As<BookingRequestModel>();
var bookings = redisEntities.SortedSets["bookings"];

// Add each booking to the Sorted Set using its Id as the score
// (new BookingRequestModel[0] is just a placeholder for your real collection):
foreach (var item in new BookingRequestModel[0])
{
    redisEntities.AddItemToSortedSet(bookings, item, item.Id);
}

That way you will be able to fetch them in batches, e.g:

var batch = bookings.GetRangeByLowestScore(fromId, toId, skip, take);
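
As a rough sketch of how the grid paging could then look (page, pageSize and the score bounds are illustrative, assuming positive Ids are used as the score and the shared pool from above):

// Fetch only one grid page worth of items (e.g. 40) instead of all 15K:
int page = 1, pageSize = 40;

using (var redis = RedisConfig.RedisManager.GetClient())
{
    var redisEntities = redis.As<BookingRequestModel>();
    var bookings = redisEntities.SortedSets["bookings"];

    var pageOfBookings = bookings.GetRangeByLowestScore(
        0, double.MaxValue,                 // score range covering all ids
        (page - 1) * pageSize, pageSize);   // skip/take for the requested page

    // map pageOfBookings to BookingRequestGridViewModel as before
}
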
mythz
  • What if I want to order by a custom field? I ended up using concurrent dictionaries but still have some data inconsistency. I have a service that saves the data from the concurrent collection to Redis from time to time by checking the booking date and its last run time, but I'm still not sure this is a proper way to solve the problem... – user_1856538_ Feb 08 '20 at 20:37