
I'm trying to build a page that ranks users by how many views their profile has. Since that page might get a lot of hits, I cache the ordered users and invalidate that cache each time a profile gets a view. With only 9 users and ~200 page views a day, my app has exceeded Heroku's 512 MB memory cap. New Relic confirmed this, and showed the user listing taking an inordinate amount of time:

Slowest Components          Duration        %
-----------------------------------------------
UsersController#index       2,125 ms        80%
users/index.html.erb        506 ms          19%
Memcache get                20 ms           1%
User#find                   18 ms           1%
layouts/_header.html.erb    1 ms            0%
User#find_by_sql            0 ms            0%
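
For reference, the invalidation side of this is just a cache delete in whatever action records a profile view; something along these lines (the controller name, action, and increment call here are only a sketch of what I described above, not my exact code):

class ProfilesController < ApplicationController
  def show
    @user = User.find(params[:id])

    # record the view, then drop the cached ranking so the next
    # listing request rebuilds it with fresh counts
    @user.increment!(:views)
    Rails.cache.delete("users")
  end
end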

Some reading suggested that ActiveRecord apparently does not hand memory back to the OS after a request. Looking at UsersController#index, I can see how this might cause a problem if the memory allocated by User.order were never freed.

UsersController#index:

require 'will_paginate/array'

class UsersController < ApplicationController
  PER_PAGE = 20

  def index
    @users = Rails.cache.read("users")
    if @users.nil?
      # .all is used to force the query to happen now, so that the result set is cached instead of the query
      @users = User.order("views DESC").all
      Rails.cache.write("users", @users)
    end

    @users = @users.paginate(page: params[:page], per_page: PER_PAGE)

    if params[:page].nil? || params[:page] == "1"
      @rank = 1
    else
      @title = "Page #{params[:page]}"
      @rank = (params[:page].to_i - 1) * PER_PAGE + 1
    end
  end
end

users/index.html.erb:

<% @users.each do |user| %>
  <%= image_tag user.profile_picture, alt: user.name %>
  <h3><%= @rank %></h3>
  <p><%= user.name %></p>
  <p><%= user.views %></p>
  <% @rank += 1 %>
<% end %>
<%= will_paginate %>

I don't know how I'm supposed to get around this, though. I thought of pulling only one page of users into memory at a time, but with only 9 users that wouldn't really change anything, since at most 20 are listed per page anyway. Do I have to manually clear @users from memory after every request, or is my approach in UsersController#index just wrong?
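
For reference, that page-at-a-time idea would look roughly like this (just a sketch; will_paginate would add the LIMIT/OFFSET to the query instead of loading every row):

class UsersController < ApplicationController
  PER_PAGE = 20

  def index
    # load only one page of users from the database per request
    @users = User.order("views DESC")
                 .paginate(page: params[:page], per_page: PER_PAGE)
  end
end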

Any help would be greatly appreciated.

user1650177
  • It isn't that ActiveRecord doesn't release memory to the OS, it is that *Ruby* does not release memory back to the OS. But then most programs/allocators do not .. this is not generally considered a "memory leak", however. Ruby, being a GC environment, will reclaim (*internal*) memory for objects at some point after said objects are no longer strongly reachable. Unless the process memory usage *grows without bounds* .. –  Sep 09 '12 at 22:09
  • Remember that variables != objects, and that @instanceVariables are associated with an *instance*. –  Sep 09 '12 at 22:16
  • So if I'm understanding this correctly, memory occupied by all my instance variables should automatically be reclaimed by ruby after the request? The memory my app is using has still been growing linearly though (100mb -> 300mb). Could storing @users in cache somehow prevent it from being de-allocated by Ruby? – user1650177 Sep 09 '12 at 22:35
  • Side note, use Rails.cache.fetch (see the sketch after these comments). – apneadiving Sep 09 '12 at 23:12
  • Variables are not objects. A variable can maintain a *strong reference* to an object; if there is a *strong reference* to the object that has that variable and so on until a *root* then the object referred to by the variable is *strongly reachable*. So if the *only* strong reference to said object(s) in users is, users, then when the object for which users is an instance variable of is no longer strongly reachable, by extension, the objects in users are no longer strongly reachable. (This came out sounding more complicated than it is ..) –  Sep 10 '12 at 00:50
  • Do you really mean to be loading all the users from your db on every request? – Frederick Cheung Sep 10 '12 at 21:34
  • pst - yeah, I get that, but I don't really see how that would apply to the problem I'm having. Guess I'll go read some more rails stuff. – user1650177 Sep 11 '12 at 03:54
  • Frederick Cheung - Yeah, I was planning on refactoring that later. But that really shouldn't matter unless I add users to the point where loading all of them into memory at once would be problematic, right? – user1650177 Sep 11 '12 at 03:58
  • Wish I could close this! Issue was unrelated to my code but with the newrelic monitoring tool I was using. My other post where I'd narrowed down the problem: http://stackoverflow.com/questions/12397208/memory-grows-indefinitely-in-an-empty-rails-app – user1650177 Aug 05 '15 at 15:29
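
The Rails.cache.fetch call suggested in the comments would collapse the read/nil-check/write in the index action into a single call; roughly like this (a sketch with the same behaviour, rank/title handling omitted):

require 'will_paginate/array'

class UsersController < ApplicationController
  PER_PAGE = 20

  def index
    # fetch returns the cached value when present; otherwise it runs
    # the block, writes the result to the cache, and returns it
    @users = Rails.cache.fetch("users") do
      User.order("views DESC").all
    end

    @users = @users.paginate(page: params[:page], per_page: PER_PAGE)
    # rank/title handling stays the same as in the original action
  end
end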
