I am using delayed_job to perform a simple task: open a CSV file (it contains a list of emails) and print every line to a debug log.
Since the file is big, that should take some time.
I enqueue the job with CSVJob.perform_later("test")
This is the Active Job class:
class CSVJob < ApplicationJob
  queue_as :default

  def perform(job_name)
    job_logger = Logger.new("#{Rails.root}/log/delayed_job.log")
    [..code omitted...]
    CSV.foreach(file_path, headers: true, col_sep: ",") do |row|
      sleep 0.05
      job_logger.debug "------------------------------->"
      job_logger.debug row.inspect
    end
  end
end
Now, if I log into the Rails console and run:
>Delayed::Job.all
I can see the job running.
But when I delete the job with Delayed::Job.delete_all, even though no rows remain in the delayed_jobs table, the job keeps running.
Alternatively, I can find the job's id and call a simple job.destroy!, and the job is deleted from the table.
But(!!) the job is still running! I can see the debug output still being written with a simple
tail -f delayed_job.log
The job only stops when it has gone through the whole CSV file.
The only way to really "kill" the job is to restart the delayed_job worker. But that defeats the whole purpose of controlling the job's execution at every point.
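One workaround I have been considering (this is my own sketch, not anything delayed_job provides out of the box) is cooperative cancellation: have the perform loop periodically poll whether the job should keep going, e.g. by checking whether its row still exists in delayed_jobs, and bail out if not. The helper name, the check_every parameter, and the cancelled lambda below are all hypothetical; in the real job the lambda could wrap something like a Delayed::Job existence check.

    # Hypothetical sketch of cooperative cancellation inside a long loop.
    # Every `check_every` rows, poll a `cancelled` predicate; if it returns
    # true, stop early and report how many rows were processed.
    def process_rows(rows, check_every: 100, cancelled: -> { false })
      processed = 0
      rows.each do |row|
        # Poll the cancellation flag periodically, not on every row,
        # to keep the overhead of the check (e.g. a DB query) low.
        return processed if (processed % check_every).zero? && cancelled.call
        # ... handle row here (e.g. job_logger.debug row.inspect) ...
        processed += 1
      end
      processed
    end

The trade-off is that cancellation latency is bounded by check_every rows plus the per-row work, and the job must be written to tolerate stopping partway through the file.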
Is there something that I am doing wrong?
Thank you in advance