Here's an example script I incorporated into a seed.rb file for one of my projects.
I'm sure it can be improved, but it provides a good working example.
All the assets I'm pulling are stored under app/assets/images, and their filenames match the names of my Info objects (downcased, with spaces replaced by underscores).
Yes, it does sound inefficient, but short of putting those assets on an FTP server somewhere, it's the best solution I found for letting my remote server upload the files straight to S3 using Carrierwave and Fog.
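For context, this relies on a fairly standard Carrierwave + Fog initializer along these lines (the bucket name, region, and environment variable names below are placeholders, not my actual values):

# config/initializers/carrierwave.rb -- rough sketch, placeholders only
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider:              'AWS',
    aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],      # placeholder env var
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],  # placeholder env var
    region:                'us-east-1'                    # placeholder region
  }
  config.fog_directory = 'my-app-bucket'                  # placeholder bucket name
end

The uploader itself selects Fog storage (shown in the model sketch below).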
My Info model has a has_one association to a Gallery model, which has a has_many association to a Photo model. The Carrierwave uploader is mounted on the 'file' (string) column of the Photo model.
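To make that concrete, the relevant pieces look roughly like this (PhotoUploader is an illustrative name; only the associations and the mounted column matter for the script):

# Sketch of the models described above; use ActiveRecord::Base instead of
# ApplicationRecord on older Rails versions.
class Info < ApplicationRecord
  has_one :gallery
end

class Gallery < ApplicationRecord
  belongs_to :info
  has_many :photos
end

class Photo < ApplicationRecord
  belongs_to :gallery
  mount_uploader :file, PhotoUploader   # Carrierwave uploader on the 'file' string column
end

class PhotoUploader < CarrierWave::Uploader::Base
  storage :fog   # push stored files to S3 via Fog
end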
Info.all.each do |info|
  # Match the asset directory naming convention: downcased, underscored Info name
  info_name = info.name.downcase.gsub(' ', '_')
  directory = File.join(Rails.root, "app/assets/images/infos/stock/#{info_name}")

  # Only seed a gallery for Infos that actually have a matching stock directory
  if File.directory?(directory)
    gallery = info.create_gallery

    Dir.foreach(directory) do |item|
      next if item == '.' || item == '..'

      # Create a photo record and push the file through the mounted uploader,
      # which stores it on S3 via Fog
      image = Photo.create!(gallery_id: gallery.id)
      image.file.store!(File.open(File.join(directory, item)))
      gallery.photos << image
    end

    info.save!
  end
end
This works flawlessly for me, but ideally I wouldn't have to package the files that I'm uploading to S3 within the assets folder. I'm more than open to suggestions & improvements.