
How do I list all the files in a specific S3 "directory" using Fog?

I know that S3 doesn't store files in folders, but I need a way to limit the returned files to a specific "folder" instead of retrieving the entire bucket listing.

Gerry Shaw

1 Answer


Use the `prefix` option on the `directories.get` method. Example:

require 'fog'  # the fog gem, which provides Fog::Storage

def get_files(path, options)
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: options[:key],
    aws_secret_access_key: options[:secret]
  )
  # Only objects whose keys begin with `path` are listed.
  connection.directories.get(options[:bucket], prefix: path).files.map do |file|
    file.key
  end
end
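A call might look like the following sketch; the bucket name, prefix, and environment variable names here are illustrative, not from the original answer:

```ruby
# Hypothetical usage: list keys under the "images/2015/" prefix of "my-bucket".
keys = get_files('images/2015/',
                 bucket: 'my-bucket',
                 key: ENV['AWS_ACCESS_KEY_ID'],
                 secret: ENV['AWS_SECRET_ACCESS_KEY'])
keys.each { |k| puts k }
```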
Gerry Shaw
    It's probably worth noting that `prefix` is _actually_ suffix, at least structurally speaking. If the path to your nested bucket is 'foo/bar', then your method call would be: `.get('foo', prefix: 'bar')`. – pdoherty926 Jul 17 '15 at 02:19
  • pdoherty926’s note is a bit confusing if you don’t think of the bucket name as being part of the path. – Amir Apr 25 '16 at 14:21
  • There is an edge case here: calling `.map` will not return ALL the files, only a single page as returned by the AWS API. Calling `.each` on the files lets Fog manage pagination and memory consumption, since there could be a lot of files. – rposborne Jun 14 '16 at 19:14
    @rposborne good point. I think in the case of a very large folder using .each and passing a block for what you would like to do with the file would be the best pattern to handle this scenario. – Gerry Shaw Jun 14 '16 at 21:14
  • Hi, I know this is old, but does anyone know how to control the sort order? I am using UUIDs for my file names and realizing it comes back alphabetical, not by time uploaded :( – chrisallick Nov 14 '16 at 19:07
  • Make sure to omit the leading `/` from the `prefix` path - doing so returns a seemingly empty directory. – Ollie Bennett Jan 02 '19 at 17:48
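The pagination point raised in the comments can be sketched as follows: iterating with `.each` lets Fog fetch additional pages from the S3 API as needed, instead of stopping at the first page. This is a minimal sketch, assuming the same `fog` gem and credentials as the answer above; `each_file_key` is a hypothetical helper name:

```ruby
require 'fog'  # the fog gem, which provides Fog::Storage

# Hypothetical helper: yields every key under `path`, one at a time.
# Fog's Enumerable #each pages through the full listing for us, so
# memory use stays bounded even for very large "folders".
def each_file_key(path, options)
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: options[:key],
    aws_secret_access_key: options[:secret]
  )
  directory = connection.directories.get(options[:bucket], prefix: path)
  directory.files.each do |file|
    yield file.key
  end
end

# Usage sketch (illustrative bucket/credentials):
# each_file_key('logs/2016/', bucket: 'my-bucket',
#               key: ENV['AWS_ACCESS_KEY_ID'],
#               secret: ENV['AWS_SECRET_ACCESS_KEY']) do |key|
#   puts key
# end
```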