
The problem I have right now is that my webapp needs to generate a spreadsheet covering all accounts under a specific user, then email that spreadsheet as a report. Everything is stored in a MongoDB database and accessed with Mongoid queries. The code is set up like this at the moment:

def some_function
  @user = # Mongoid query
  # use Axlsx to create a worksheet in a workbook
  @user.get_accounts.each do |account|
    # code to add account information to row
    sheet.add_row(row)
  end
  # return sheet
end

This simple loop works great for most users. However, I have a few users with 100k+ accounts, and as you can imagine, this exhausts system memory and the spreadsheet is never created or sent.

I wanted to know if you guys had any suggestions for handling users with this many accounts. The general architecture I had in mind was to chunk the work across Sidekiq workers, each running a loop of more manageable size (for example, 100 workers each processing 1,000 accounts, or 1,000 workers each processing 100). Would this be a proper use of Sidekiq, and if so, is there any documentation I could read to figure this out? If not, is there a more efficient way?
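For what it's worth, the fan-out I'm imagining looks roughly like this. It's only a sketch: `AccountChunkWorker` is a hypothetical Sidekiq job, the enqueue call is left as a comment, and only the chunking arithmetic actually runs here.

```ruby
# Split the user's account ids into fixed-size chunks, one Sidekiq job per
# chunk. A real implementation still needs a way to merge the partial
# results back into a single spreadsheet.
account_ids = (1..100_000).to_a  # stand-in for the user's account ids

jobs = account_ids.each_slice(1_000).map do |chunk|
  # AccountChunkWorker.perform_async(user_id, chunk)  # hypothetical enqueue
  chunk.size
end

jobs.length  # 100 jobs of 1,000 accounts each
```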

manestay
1 Answer


You need to paginate through the accounts. Right now that code loads all 100k+ accounts into memory at once. ActiveRecord provides the find_each method for automatic batched pagination.
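The idea behind find_each is that records are processed in fixed-size batches, so only one batch is in memory at a time. A minimal sketch of that batching, using a plain array to stand in for the query result (with ActiveRecord you would write something like `Account.where(user: user).find_each(batch_size: 1000)`):

```ruby
# Stand-in for a 10k-record query result; each_slice plays the role of
# find_each's batching so memory holds at most one batch of 1,000 rows.
accounts = (1..10_000).to_a

rows = 0
accounts.each_slice(1_000) do |batch|
  batch.each do |account|
    # sheet.add_row(...) would go here
    rows += 1
  end
end

rows  # every account processed, 1,000 at a time
```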

Mike Perham
  • I see that paginating through the accounts in the query would be more straightforward than my envisioned method. However, I am using Mongoid, not ActiveRecord. [Would this thread](http://stackoverflow.com/questions/7041224/finding-mongodb-records-in-batches-using-mongoid-ruby-adapter) be what I'm looking for? – manestay Jun 17 '16 at 18:09
  • Right, I don't know how to paginate with Mongoid but that looks relevant, good luck! – Mike Perham Jun 17 '16 at 22:42