This each loop takes over a second with around 20,000 elements, which seems longer than it should at first glance. The time presumably comes from the attribute lookups on ping and holder. Aside from pulling holder.created_at out of the loop to reduce lookups, is there anything I can do to make this faster?
This code block is given a list from the database of all elements created in the last X time, and in this case I want to keep roughly one data point per six hours (21,600 seconds, matching the constant in the code).
I do need to check each data point because I will eventually do some more work within the each block that will require looking at each data point, so I can't pull less information from the database.
temp = @modem.pings.where('created_at >= ?', 30.days.ago)
holder = temp.first          # most recent baseline ping (lowercase temp, not the constant Temp)
@pings = []
temp.each do |ping|
  # keep only pings at least 21,600 s (6 h) after the last kept one
  if ping.created_at - holder.created_at >= 21_600
    @pings << ping
    holder = ping
  end
end
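For reference, the variant with holder.created_at hoisted out of the loop (the change described in the first edit below) can be simulated in plain Ruby, with no Rails involved. The Ping Struct and the fabricated one-every-three-minutes timestamps are stand-ins for the real model, not the actual schema:

```ruby
# Simulation of the thinning loop with the created_at lookup cached.
# Ping here is a hypothetical stand-in for the ActiveRecord model.
Ping = Struct.new(:created_at)
pings = Array.new(20_000) { |i| Ping.new(Time.at(i * 180)) } # fake ping every 3 min

selected = []
holder_time = pings.first.created_at   # look up created_at once per kept ping
pings.each do |ping|
  t = ping.created_at
  if t - holder_time >= 21_600         # Time#- returns seconds as a Float
    selected << ping
    holder_time = t                    # update the cached timestamp
  end
end
p selected.size # => 166
```

With 3-minute spacing, one ping is kept every 120 records, so 20,000 fake pings thin down to 166.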
Edit: I've removed the holder.created_at call from the loop, which should cut the number of lookups by a little less than half, but there has been no decrease in response time. Am I looking in the wrong place? Does the temp.each do |ping| line itself have a high enough cost that running it 20,000 times takes a while?
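One way to sanity-check that suspicion without Rails: time a bare each over the same number of plain elements using the standard-library Benchmark module. The array contents here are arbitrary stand-ins:

```ruby
require 'benchmark'

# Quick pure-Ruby check: does calling `each` 20,000 times itself cost
# anywhere near a second? The iteration should be far cheaper than that;
# the expensive part must be what runs inside the block.
arr = Array.new(20_000) { |i| i }

elapsed = Benchmark.realtime do
  arr.each { |x| x }
end
puts format('bare each over %d elements: %.4f s', arr.size, elapsed)
```

On any recent Ruby this should report a small fraction of a second, which would point the blame at the per-row work (attribute reads, Time casting) rather than the iteration itself.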
Edit 2: I've learned about code profiling. The actual SQL statement doesn't seem to take long: if I run the query and don't do anything with the returned data, it's very fast. But if I do something with the returned data, I spend about a third of the time in ActiveRecord::AssociationRelation#exec_queries. After exec_queries finishes, most of the remaining time is spent in Time-related functions.
Is there a way to avoid the Time work entirely? I believe created_at is stored as a number in the database, so is there a way to fetch that raw value and compare it directly, rather than having Rails convert it to a Time object for every row?
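One possibility (an assumption, not verified against this schema or database) is to have the database return epoch seconds directly, e.g. via pluck with a raw SQL expression such as UNIX_TIMESTAMP(created_at) on MySQL or strftime('%s', created_at) on SQLite, and then do plain integer math in the loop. A sketch of the integer-only loop, with fabricated epochs standing in for the pluck result:

```ruby
# Hypothetical: in Rails the epochs might come from something like
#   epochs = @modem.pings.where('created_at >= ?', 30.days.ago)
#                 .pluck(Arel.sql('UNIX_TIMESTAMP(created_at)'))
# (MySQL-specific; an assumption). Faked here with plain integers.
epochs = (0...20_000).map { |i| i * 180 } # one fake ping every 3 minutes

INTERVAL = 21_600 # 6 hours in seconds
kept = []
last = epochs.first
epochs.each do |e|
  if e - last >= INTERVAL   # pure Integer subtraction, no Time objects built
    kept << e
    last = e
  end
end
p kept.size # => 166
```

Comparing integers avoids instantiating a Time object per row, though whether that removes the exec_queries cost in a real Rails app would need to be confirmed by profiling.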