8

The backups are 250 MB. I don't think that's very big, but the problem seems to get worse as the backup size grows.

Log from the Backup gem below.

Note the time span: about 37 minutes into the upload I get the connection reset.

[2015/10/30 09:20:40][message] Storage::S3 started transferring '2015.10.30.09.20.01.myapp_postgres.tar' to bucket 'myapp-backups'.
[2015/10/30 09:57:06][error]   ModelError: Backup for Back up PostgreSQL (myapp_postgres) Failed!
[2015/10/30 09:57:06][error]   An Error occured which has caused this Backup to abort before completion.
[2015/10/30 09:57:06][error]   Reason: Excon::Errors::SocketError
[2015/10/30 09:57:06][error]   Connection reset by peer
oma

2 Answers

4

Have you tried the error handling options, which retransmit the portions of the file that failed?

store_with S3 do |s3|
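  # retry a failed part up to 10 times, waiting 30 seconds between attempts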
  s3.max_retries = 10
  s3.retry_waitsec = 30
end

Also keep the chunk size small:

store_with S3 do |s3|
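  # smaller parts mean a failed part costs less to re-send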
  s3.chunk_size = 5 # MiB
end

You may also want to use the Splitter options.
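For reference, a minimal sketch of the Splitter in a model definition (the model name is taken from the log above; the file path and the 50 MB chunk size are assumptions for illustration):

# ~/Backup/models/myapp_postgres.rb (path assumed)
Model.new(:myapp_postgres, 'Back up PostgreSQL') do
  # Split the final archive into 50 MB chunks so each upload is small
  # and a failed part can be retried without re-sending the whole file.
  split_into_chunks_of 50

  store_with S3 do |s3|
    s3.max_retries   = 10
    s3.retry_waitsec = 30
    s3.chunk_size    = 5 # MiB
  end
end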

Renaud Kern
  • I've got tests running now with the retry options. I had also set the chunk size to 4000, thinking it was a count rather than a size in MiB. – oma Nov 08 '15 at 18:51
  • NoMethodError: undefined method `max_retries=' for # – oma Nov 09 '15 at 11:44
  • Which backup gem version do you have? – Renaud Kern Nov 09 '15 at 11:52
  • I need to work more on it, updating etc., and I'm too busy right now (what a cliché). Giving you the bounty, Renaud, but I won't accept any answer until it's proven, so I don't mislead anybody. Thanks for trying to help :) – oma Nov 09 '15 at 11:53
  • Renaud, I had to fight a lot with Chef, throwing knives and cookbooks around. I'm not a master chef, to be honest :D I finally got the Backup gem upgraded. Preliminary results are very positive; I'll keep monitoring and update here for other readers too. – oma Nov 11 '15 at 00:48
  • Thanks oma for your feedback. – Renaud Kern Nov 11 '15 at 08:11
1

For now I would suggest using ruby-xz to compress the backup into a smaller file, so there is less to upload. As a temporary patch, also try whether

Excon.defaults[:write_timeout] = 500

or an even higher value does the trick.
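A rough sketch of how both ideas could be wired into the Backup configuration. Instead of calling ruby-xz directly, this uses the Backup gem's Custom compressor to shell out to the xz binary (assuming xz is installed on the host); the 500-second write timeout is only a starting value:

# e.g. at the top of ~/Backup/config.rb, before the models are loaded
require 'excon'
Excon.defaults[:write_timeout] = 500 # give the slow upload more time before Excon gives up

# ... and inside the model definition:
compress_with Custom do |c|
  c.command   = 'xz -6'  # higher presets compress more but run slower
  c.extension = '.xz'
end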

Luca Bruzzone