I'm trying to allow large uploads when running Unicorn on Heroku with Rails, but I've realised that a large upload might take longer than the timeout period for a Unicorn worker. When that happens (and I've seen it happen), the Unicorn master process kills the worker handling the upload and the request times out with a 503 error.

Without removing or massively increasing the timeout for my server, is there any way to let the uploading worker hang on while the upload completes? Or am I completely misunderstanding, and it's most likely something else causing my uploads to time out?
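For context, the worker timeout in question is the one set in Unicorn's configuration file. A minimal sketch (the values and path here are illustrative, not recommendations):

```ruby
# config/unicorn.rb -- illustrative values only
worker_processes 3

# A worker that spends longer than this many seconds on a single request
# is killed by the master process, and the client sees an error.
timeout 30
```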

seadowg

2 Answers

If you're using nginx as a reverse proxy in front of your unicorns, you can use the Upload Module. When configured, nginx handles the upload itself and stores it in a /tmp directory; your unicorn then receives request params telling it where the uploaded asset is and its content type. No more worker tied up receiving the upload.
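A rough sketch of what that nginx configuration might look like (directive names are from the third-party nginx-upload-module; the paths, location names, and upstream here are placeholders, so check the module's docs for your version):

```nginx
location /upload {
    upload_pass   @app;          # hand the rewritten request to the backend
    upload_store  /tmp/uploads;  # nginx writes the body to disk here

    # Pass the backend form fields describing the stored file
    # instead of the file body itself
    upload_set_form_field $upload_field_name.name "$upload_file_name";
    upload_set_form_field $upload_field_name.content_type "$upload_content_type";
    upload_set_form_field $upload_field_name.path "$upload_tmp_path";

    # Remove the temp file if the backend responds with an error
    upload_cleanup 400 404 500-505;
}

location @app {
    proxy_pass http://unicorn;   # assumes an "unicorn" upstream defined elsewhere
}
```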

If you don't really want the upload on the same server as your web service, but would rather store it in S3, you should follow @Neil Middleton's suggestion and set things up so the upload goes directly there.

dbenhur

If you're uploading to S3, then you can "simply" have the user upload files direct to S3 instead of via your dynos, and get pinged when the upload is complete.

For significantly more information than this, check out something like CarrierWaveDirect.
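If you want to see the mechanics a gem like CarrierWaveDirect wraps, here is a rough stdlib-only sketch of building the signed policy document a browser form POSTs straight to S3 (AWS Signature Version 2 style; the helper name, bucket, key prefix, and credentials are all made up for illustration):

```ruby
require 'base64'
require 'json'
require 'openssl'

# Hypothetical helper: builds the hidden form fields a browser needs to
# upload a file directly to an S3 bucket, bypassing your dynos entirely.
def s3_upload_fields(bucket:, key:, access_key:, secret_key:, expires_at:)
  # The policy document states what the signed form is allowed to upload.
  policy = Base64.strict_encode64(JSON.generate(
    'expiration' => expires_at.utc.strftime('%Y-%m-%dT%H:%M:%SZ'),
    'conditions' => [
      { 'bucket' => bucket },
      ['starts-with', '$key', key],
      { 'acl' => 'private' }
    ]
  ))

  # The signature proves the policy was issued by someone holding the secret.
  signature = Base64.strict_encode64(
    OpenSSL::HMAC.digest('sha1', secret_key, policy)
  )

  {
    'key'            => key,
    'acl'            => 'private',
    'AWSAccessKeyId' => access_key,
    'policy'         => policy,
    'signature'      => signature
  }
end
```

Your Rails action would render these fields into a form whose `action` points at the bucket URL; S3 can then redirect (or your JavaScript can ping your app) when the upload finishes, so no Unicorn worker ever holds the connection open.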

Neil Middleton
  • I am using CarrierWaveDirect, but I still had to increase the timeout of the unicorn workers, because they will crash if the file upload takes too long, even if you are using CarrierWaveDirect. – cantonic Feb 27 '13 at 21:12