
I have a rake task, and when I run it in the console, it gets killed. The task operates on a table of circa 40,000 rows, so I guess the problem may be Out of Memory.

Also, I believed that the query I use is optimized for dealing with long tables:

MyModel.where(:processed => false).pluck(:attribute_for_analysis).find_each(:batch_size => 100) do |a|
  # deal with 40,000 rows, and only the attribute `attribute_for_analysis`
end
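Note that `pluck` returns a plain Array, which has no `find_each` method, so the chain above is not doing what it looks like. The memory-friendly pattern is to keep batching on the relation and select only the needed column, so that only one batch of rows is resident at a time. Here is a sketch of that idea in plain Ruby (the model and column names are the ones from the question; the commented ActiveRecord form assumes Rails 3.2):

```ruby
# The ActiveRecord equivalent would be roughly:
#   MyModel.where(:processed => false)
#          .select(:id, :attribute_for_analysis)
#          .find_each(:batch_size => 100) { |m| m.attribute_for_analysis }
#
# Below, a plain-Ruby simulation of the same batching pattern:
# 40,000 fake rows, processed 100 at a time, touching only one column.

rows = Array.new(40_000) { |i| { :id => i, :attribute_for_analysis => i * 2, :processed => false } }

processed = 0
batches   = 0
rows.each_slice(100) do |batch|                              # mirrors :batch_size => 100
  values = batch.map { |r| r[:attribute_for_analysis] }      # only the needed attribute
  processed += values.size
  batches   += 1
end

puts "batches=#{batches} rows=#{processed}"
# => batches=400 rows=40000
```

The point of the pattern is that peak memory is bounded by the batch size, not by the table size.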

This task will not be run on a regular basis in the future, so I want to avoid process-monitoring solutions like God etc., but I am considering background jobs, e.g. a Resque job.

I work with Ubuntu, Ruby 2.0 and Rails 3.2.14.

> My free memory is as follows: 


Mem:                total      used       free     shared    buffers     cached
                    3891076    1901532    1989544  0         1240        368128
-/+ buffers/cache:             1532164    2358912
Swap:               4035580    507108     3528472

QUESTIONS:

  1. How to investigate why the rake task is always killed (answered)
  2. How to make this rake task run (not answered - it is still killed)
  3. What is the difference between total-vm, anon-rss, file-rss (not answered)

UPDATE 1

Can someone explain the difference between these?

  • total-vm
  • anon-rss
  • file-rss
$ grep "Killed process" /var/log/syslog 

Dec 25 13:31:14 Lenovo-G580 kernel: [15692.810010] Killed process 10017 (ruby) total-vm:5605064kB, anon-rss:3126296kB, file-rss:988kB
Dec 25 13:56:44 Lenovo-G580 kernel: [17221.484357] Killed process 10308 (ruby) total-vm:5832176kB, anon-rss:3190528kB, file-rss:1092kB
Dec 25 13:56:44 Lenovo-G580 kernel: [17221.498432] Killed process 10334 (ruby-timer-thr) total-vm:5832176kB, anon-rss:3190536kB, file-rss:1092kB
Dec 25 15:03:50 Lenovo-G580 kernel: [21243.138675] Killed process 11586 (ruby) total-vm:5547856kB, anon-rss:3085052kB, file-rss:1008kB

UPDATE 2

  • I modified the query like this and the rake task is still killed:
    MyModel.where(:processed => false).find_in_batches do |group|
      p system("free -k")
      group.each do |row|
        # process
      end
    end
  • could you insert a counter into the loop and some memory monitor with the `system` method? – SergeyKutsko Dec 25 '13 at 11:03
  • `MyModel.where(:processed => false).pluck(:attribute_for_analysis).find_in_batches(:batch_size => 1000) do |group| p system("free -k") group.each { |model| #your stuff } end` – SergeyKutsko Dec 25 '13 at 11:49
  • Try to find in the logs whether it really was the OOM killer - http://stackoverflow.com/questions/624857/finding-which-process-was-killed-by-linux-oom-killer – cutalion Dec 25 '13 at 13:39
  • Also, I don't think your code does what you want. `pluck` returns an array, which doesn't have `find_each` method. – cutalion Dec 25 '13 at 13:41
  • @cutalion I updated the question with my `log`... hm, but I am not sure what the message means ... –  Dec 25 '13 at 14:50
  • print output of Update 2 – SergeyKutsko Dec 25 '13 at 15:30
  • there is no output; the rake task is killed (because of memory). Now `cached` is circa 1304M –  Dec 25 '13 at 15:42
  • @maro30, could you post all the code? Have you tried run it on smaller amount of records? `limit(1)`, `limit(10)`, `limit(100)`, `limit(1000)`. How soon task gets killed after start? – cutalion Dec 25 '13 at 18:19
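The counter-plus-memory-probe idea the commenters suggest above can be sketched like this. This is a plain-Ruby simulation, with `each_slice` standing in for `find_in_batches` and hypothetical fake rows; the `free -k` probe assumes a Linux host, so it is guarded:

```ruby
# 5,000 fake rows, processed in batches of 1,000, logging memory per batch
# so you can see in which batch memory starts to grow.
rows = Array.new(5_000) { |i| { :value => i } }

batch_no = 0
rows.each_slice(1000) do |group|              # stands in for find_in_batches(:batch_size => 1000)
  batch_no += 1
  # Per-batch memory probe; free(1) exists on Linux only, so fall back gracefully.
  mem = `free -k 2>/dev/null`.lines.grep(/^Mem:/).first
  puts "batch #{batch_no}: #{group.size} rows, #{mem ? mem.strip : 'free(1) unavailable'}"
  group.each { |row| row[:value] * 2 }        # placeholder for the real per-row processing
end
```

If the logged batch number stops increasing before the task dies, the batch size or per-row work is the place to look; if memory climbs steadily across batches, something is being retained between batches.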

0 Answers