
I'm writing code in Laravel 5 to periodically back up a MySQL database. My code thus far looks like this:

    $filename = 'database_backup_'.date('G_a_m_d_y').'.sql';
    $destination = storage_path() . '/backups/';

    $database = \Config::get('database.connections.mysql.database');
    $username = \Config::get('database.connections.mysql.username');
    $password = \Config::get('database.connections.mysql.password');

    $sql = "mysqldump $database --password=$password --user=$username --single-transaction >$destination" . $filename;

    $result = exec($sql, $output); // TODO: check $result

    // Copy database dump to S3

    $disk = \Storage::disk('s3');

    // ????????????????????????????????
    //  What goes here?
    // ????????????????????????????????

I've seen solutions online that would suggest I do something like:

    $disk->put('my/bucket/' . $filename, file_get_contents($destination . $filename));

However, for large files, isn't it wasteful to use file_get_contents()? Are there any better solutions?

clone45
  • This is a great question and is my goal right now too. I'll now look into https://tuts.codingo.me/laravel-backup-amazon-s3 (which looks promising) and also the suggestion below from @user4603841. – Ryan May 26 '17 at 14:14

6 Answers


There is a way to copy files without loading the whole file into memory, using Flysystem's MountManager.

You will also need to import the following:

    use League\Flysystem\MountManager;

Now you can copy the file like so:

    $mountManager = new MountManager([
        's3'    => \Storage::disk('s3')->getDriver(),
        'local' => \Storage::disk('local')->getDriver(),
    ]);

    // The prefixes determine the direction of the copy; this example pulls a file
    // down from S3 to the local disk. Swap the prefixes to push a file up to S3.
    $mountManager->copy('s3://path/to/file.txt', 'local://path/to/output/file.txt');
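
For the backup in the question the direction is reversed; a minimal sketch, assuming the dump file ends up under the local disk root and reusing the question's $filename (the backups/ directories are only illustrative):

    // Sketch: push the freshly written dump from the local disk up to S3.
    // Flysystem streams the copy, so the whole file is never held in memory.
    $mountManager->copy('local://backups/' . $filename, 's3://backups/' . $filename);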
theHarvester

You can always use a file resource to stream the file (advisable for large files) by doing something like this:

    Storage::disk('s3')->put('my/bucket/' . $filename, fopen('path/to/local/file', 'r+'));

An alternative suggestion uses Laravel's Storage facade to read the file as a stream. The basic idea is something like this:

    $inputStream = Storage::disk('local')->getDriver()->readStream('path/to/file');

    // The destination is a path relative to the S3 disk root and must include the file name.
    $destination = 'my/bucket/' . $filename;
    Storage::disk('s3')->getDriver()->putStream($destination, $inputStream);
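
Tied back to the question's backup, a minimal sketch, assuming a Laravel 5.x version whose put() accepts a stream resource and reusing $destination and $filename from the question (the backups/ prefix on S3 is only illustrative):

    // Sketch: stream the dump straight from disk to S3 instead of loading it into memory.
    $stream = fopen($destination . $filename, 'r');
    Storage::disk('s3')->put('backups/' . $filename, $stream);

    // Flysystem may leave the handle open, so close it defensively.
    if (is_resource($stream)) {
        fclose($stream);
    }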
user4603841
  • I only get "A non well formed numeric value encountered". Can't find a solution yet. –  Jun 16 '17 at 16:23

You can try this code:

    $contents = Storage::get($file);
    Storage::disk('s3')->put($newfile, $contents);

According to the Laravel documentation, this is the easiest way I found to copy data between two disks.

vipmaa

Laravel now has putFile and putFileAs methods that will stream the file for you.

Automatic Streaming

If you would like Laravel to automatically manage streaming a given file to your storage location, you may use the putFile or putFileAs method. This method accepts either an Illuminate\Http\File or Illuminate\Http\UploadedFile instance and will automatically stream the file to your desired location:

    use Illuminate\Http\File;
    use Illuminate\Support\Facades\Storage;

    // Automatically generate a unique ID for file name...
    Storage::putFile('photos', new File('/path/to/photo'));

    // Manually specify a file name...
    Storage::putFileAs('photos', new File('/path/to/photo'), 'photo.jpg');
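
Applied to the backup in the question, a minimal sketch for Laravel versions that ship these methods (5.3+), reusing $destination and $filename from the question (the backups directory on S3 is only illustrative):

    use Illuminate\Http\File;
    use Illuminate\Support\Facades\Storage;

    // Sketch: let Laravel stream the local dump file to the S3 disk under the given name.
    Storage::disk('s3')->putFileAs('backups', new File($destination . $filename), $filename);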

Link to doc: https://laravel.com/docs/5.8/filesystem (Automatic Streaming)

Hope it helps

itod

Looking at the documentation, the only way is to use the put method, which needs the file contents. There is no method to copy a file between two filesystems, so the solution you gave is probably the only one at the moment.

If you think about it, when copying a file from the local filesystem to S3 you ultimately need the file's contents in order to put them on S3, so in my opinion it's not that wasteful.

Marcin Nabiałek
    Thanks Marcin. I felt that it might be wasteful because file_get_contents will pull the entire file into memory before it is sent to S3. I was hoping there would be a solution where the file would get streamed to S3 from the local file. This isn't a big deal if your file is only 1 or 2 megs, but I could see memory running out for larger files. All of this is speculation on my part, so take it with a grain of salt. – clone45 Apr 09 '15 at 18:38
  • There is a way to do this via Flysystem on Laravel 5 using streams I believe. It may be worth hitting up Frank de Jonge on how to do that (author of Flysystem). Basically, you open up a file via a stream, pull the content in and at the same time push that content to a file on S3. This saves having to load the entire file in memory, which is great for many reasons. – Oddman Oct 06 '15 at 17:44
  • @MarcinNabiałek I didn't mean to insult you and I apologize if that was bad etiquette. I updated the "accepted answer" so that developers who see this posting focus on itod's answer, which is now probably the best approach. – clone45 Jan 22 '20 at 21:24
  • @clone45 Yes, I understand, but at the time of asking Laravel 5.8 probably didn't exist yet and this method didn't work. If it changes again in Laravel 15 and someone answers, will you change the accepted answer to that one too? – Marcin Nabiałek Jan 22 '20 at 22:03
  • @MarcinNabiałek Please feel free to email me at my clone45 gmail address if you want to talk this through. I see arguments on both sides online. – clone45 Jan 23 '20 at 23:04

I solved it in the following way:

    $contents = \File::get($destination);
    \Storage::disk('s3')->put($s3Destination, $contents);

Sometimes Storage::get($file) doesn't return the data, because Storage paths are resolved relative to the configured disk root; in that case, pass the absolute path of the file to Laravel's File facade instead of a Storage-relative path.
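
To make the path difference concrete for the question's backup, a short sketch assuming the default filesystem configuration (where the local disk is rooted at storage/app) and reusing $destination and $filename from the question:

    // The question writes the dump to storage_path() . '/backups/', i.e. storage/backups/...
    // The default 'local' Storage disk is rooted at storage/app, so
    // \Storage::disk('local')->get('backups/' . $filename) would look in storage/app/backups/
    // and miss the file. \File::get() takes the absolute path and finds it:
    $contents = \File::get($destination . $filename);
    \Storage::disk('s3')->put('backups/' . $filename, $contents);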

Rajkumar R