Even though I have a database connection pool of 50 in my Rails application, I also wrote a script that handles tasks periodically, using the popular daemons gem.
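(For context, the daemonized entry point looks roughly like the sketch below; the daemon name and app path are placeholders, not my real ones.)

    require 'daemons'

    Daemons.run_proc('redis_listener') do
      # boot the Rails environment so the ActiveRecord models are available
      require '/path/to/app/config/environment'
      # ... the script below runs here ...
    end

The script itself looks like this: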
    require 'redis'

    class Responder
      def initialize
        @queue = Queue.new
      end

      # add to queue
      def produce(msg)
        @queue << msg
      end

      # take from queue: poll once a second in a background thread
      def consume
        Thread.new do
          loop do
            sleep(1)
            unless @queue.empty?
              data = @queue.pop
              process(data)
            end
          end
        end
      end
    end
    class EmailResponder < Responder
      def process(message)
        Alert.where(id: message[:id]).send_mail
      end
    end

    class GeocodeResponder < Responder
      def process(message)
        Report.where(id: message[:id]).geocode_data
      end
    end
    class RedisListener
      def initialize(host, port)
        @host = host
        @port = port
        @email_sms = EmailResponder.new
        @geocode = GeocodeResponder.new
        # zero timeout so we block waiting for messages forever
        @redis = Redis.new(:host => @host, :port => @port, :timeout => 0)
      end

      # subscribe in a background thread and fan each message out to both queues
      def start_producers
        Thread.new do
          @redis.subscribe('juggernaut') do |on|
            on.message do |event, msg|
              @email_sms.produce(msg)
              @geocode.produce(msg)
            end
          end
        end
      end

      def start_consumers
        @email_sms.consume
        @geocode.consume
      end
    end
    listener = RedisListener.new('127.0.0.1', 6379)
    producer = listener.start_producers
    listener.start_consumers
    producer.join # keep the main thread alive so the worker threads keep running
The problem is that a lot of items come through Redis, so the queues build up and the daemon uses more and more database connections, until it crashes with PostgreSQL's "max connections reached" error. I don't want to limit the size of the queue, because then I'd risk losing data that arrives through Redis on the fly. I'd rather let the queue grow and grow and instead limit the database connections themselves. How can I limit database connections in this Rails daemon, so that when I use ActiveRecord objects like Alert.where(...) or Report.where(...), the call simply blocks until a connection is free?
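From reading the ActiveRecord docs, I'm picturing something like wrapping each database touch in connection_pool.with_connection, which as I understand it checks a connection out just for the block and makes the thread wait when the pool is exhausted (raising ActiveRecord::ConnectionTimeoutError if nothing frees up within the checkout timeout). A sketch of what I mean, untested, reusing my EmailResponder from above:

    class EmailResponder < Responder
      def process(message)
        # hold a pooled connection only for the duration of this block;
        # if the pool is exhausted, wait here until a connection is returned
        ActiveRecord::Base.connection_pool.with_connection do
          Alert.where(id: message[:id]).send_mail
        end
      end
    end

Is that the right tool here, or is there a cleaner way to cap connections for the whole script?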
I tried adding this to the script:
ActiveRecord::Base.configurations['production']['pool'] = 10
But it seems to have no effect.
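My guess is that this is because the pool is sized when the connection is first established during Rails boot, so mutating the configuration hash afterwards does nothing. Would re-establishing the connection with an explicit pool size be the right direction? A sketch of what I mean (untested):

    # assumption: establish_connection rebuilds the pool using the merged config
    config = ActiveRecord::Base.configurations['production']
    ActiveRecord::Base.establish_connection(config.merge('pool' => 10))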