I have a Rails app that parses a long JSON file; while parsing, it sends off requests to an external API, which seem to take forever. My question is: is it as simple as sticking each request into a separate thread to speed it up? I don't see why I couldn't have, say, 3 or 4 threads running concurrently sending off requests. The only issue I see is that the method currently displays an errors page if any errors are present at the end of parsing, and that check couldn't work while other threads might still be running, could it? Eventually I want to turn this method into a rake task, run it via a cron job, and have it render an HTML page, so that should work better, correct?
def load_movies
  @errors = []
  mutex = Mutex.new          # guards @errors, since Array is not thread-safe
  movie_list = get_data

  threads = movie_list.first(50).map do |movie_data|
    Thread.new do
      movie = Movie.new
      movie.construct_movie(movie_data)
      if movie.valid?
        movie.save
      else
        mutex.synchronize { @errors << movie }
      end
    end
  end

  threads.each(&:join)       # wait for every thread before checking errors

  redirect_to root_path if @errors.empty?
end
Is it as simple as doing what's shown above to have multiple threads sending the requests (which can be assumed to happen inside construct_movie)?
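To cap concurrency at the 3 or 4 threads mentioned above rather than spawning one thread per item, one common pattern is a small worker pool fed from a Queue. Below is a minimal, generic sketch (not taken from the question's code): `process_concurrently` is a hypothetical helper name, and the block stands in for the real per-movie work (construct_movie plus save). Failures are collected under a mutex and returned only after every worker has been joined, which also resolves the "errors page while threads are still running" concern.

```ruby
# A bounded worker pool: `workers` threads pull items off a shared queue,
# run the given block on each, and collect any raised errors under a mutex.
# Returns the list of [item, exception] pairs after all workers finish.
def process_concurrently(items, workers: 4, &process)
  queue = Queue.new
  items.each { |item| queue << item }

  errors = []
  mutex  = Mutex.new

  threads = workers.times.map do
    Thread.new do
      loop do
        item = begin
          queue.pop(true) # non-blocking pop; raises ThreadError when empty
        rescue ThreadError
          break           # queue drained, this worker is done
        end
        begin
          process.call(item)
        rescue => e
          mutex.synchronize { errors << [item, e] }
        end
      end
    end
  end

  threads.each(&:join)    # wait for all workers before inspecting errors
  errors
end
```

Usage in the controller or rake task would look roughly like `errors = process_concurrently(movie_list.first(50), workers: 4) { |data| ... }`, with the errors page rendered only when the returned array is non-empty. Note that in a Rails app each thread doing database writes checks out its own connection from ActiveRecord's connection pool, so the pool size should be at least the worker count.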