
I'm trying to run this aws s3 ls command:

aws s3 ls s3://path/to/my/bucket/12434 --recursive --human-readable --summarize

with this python:

import subprocess

command = 'aws s3 ls s3://path/to/my/bucket/12434 --recursive --human-readable --summarize'
s3_folder_data = subprocess.check_output(command, stderr=subprocess.STDOUT, shell=True)
print(s3_folder_data)

But it's failing with this error:

subprocess.CalledProcessError: Command 'aws s3 ls s3://path/to/my/bucket/12434 --recursive --human-readable --summarize' returned non-zero exit status 1

The command itself works when I run it. The python script is being called by the same user on the same machine. What gives?

midori
BarFooBar
  • there is a python api for aws called boto2, use it instead – midori Oct 25 '16 at 17:17
  • here is the link for s3 - http://boto.cloudhackers.com/en/latest/ref/s3.html – midori Oct 25 '16 at 17:19
  • Thanks, I'm aware. But this was supposed to be a quick and dirty script. I don't want to go through the trouble of configuring boto if I can just use the CLI. – BarFooBar Oct 25 '16 at 17:21
  • there is no trouble, that's the thing, it's pretty simple to use, btw check out the other question i linked – midori Oct 25 '16 at 17:22
  • Also, that question tells me nothing about why that aws command would return with a nonzero status when it works after running it in the same environment as the same user. – BarFooBar Oct 25 '16 at 17:23
  • it tells everything and shows how to check it deeper but it's up to you to use it – midori Oct 25 '16 at 17:25
  • Boto does not have an equivalent command to `s3 ls`. Getting byte size of buckets is non-trivial. – BarFooBar Oct 25 '16 at 17:29
  • are you kidding me? get_bucket() and list it – midori Oct 25 '16 at 17:30
  • @midori: the list call on the boto s3 module has a limit of 1,000 (delete as well) – Shankar May 25 '20 at 13:25
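To reproduce the byte total from `s3 ls --summarize` with boto3, as the comments suggest, paginate `list_objects_v2` and sum the object sizes. A sketch, with the boto3 calls left commented since they need the library installed and credentials configured; the summing helper itself is plain Python:

```python
def total_bytes(pages):
    """Sum object sizes across list_objects_v2 result pages."""
    return sum(obj['Size'] for page in pages
               for obj in page.get('Contents', []))

# With boto3 (assumed installed and configured; bucket/prefix are examples):
# import boto3
# paginator = boto3.client('s3').get_paginator('list_objects_v2')
# pages = paginator.paginate(Bucket='my-bucket', Prefix='12434')
# print(total_bytes(pages))
```

The paginator handles the 1,000-key-per-call limit mentioned above, so the sum covers the whole prefix.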

2 Answers


As suggested by others, use the Boto3 S3 library to get what you want. But if you insist on subprocess, try:

subprocess.check_output(['aws', 's3', 'ls', 's3://path/to/my/bucket/12434', '--recursive', '--human-readable', '--summarize'])

or

subprocess.call(['aws', 's3', 'ls', 's3://path/to/my/bucket/12434', '--recursive', '--human-readable', '--summarize'])

and build on it.
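If the list form still exits non-zero, the exception's `output` attribute tells you why (bad credentials, a different PATH, etc.). A sketch; the failing `ls` call at the bottom is just a stand-in for your aws command:

```python
import subprocess

def run_capturing(cmd):
    """Run cmd; return (exit status, combined stdout+stderr as text)."""
    try:
        out = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
        return 0, out.decode()
    except subprocess.CalledProcessError as e:
        # e.output holds everything the command printed, including the
        # error message that explains the non-zero exit status.
        return e.returncode, e.output.decode()

# Stand-in for the aws call: a command guaranteed to fail.
status, text = run_capturing(['ls', '/no/such/path'])
```

Printing `text` here surfaces the same message you'd see running the command by hand.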

helloV

New in Python 3.5, you can also use subprocess.run().

subprocess.run(['aws', 's3', 'ls', 's3://path/to/my/bucket/12434', '--recursive', '--human-readable', '--summarize'])
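For example, `check=True` makes `run()` raise on a non-zero exit, and `stdout=subprocess.PIPE` captures the listing. A sketch; the aws usage line is commented out because it assumes the CLI is on PATH with credentials configured:

```python
import subprocess

def run_checked(cmd):
    """Run cmd; return its stdout, raising CalledProcessError on failure."""
    result = subprocess.run(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, check=True)
    return result.stdout.decode()

# Usage (assumes aws is on PATH and credentials are configured):
# listing = run_checked(['aws', 's3', 'ls', 's3://path/to/my/bucket/12434',
#                        '--recursive', '--human-readable', '--summarize'])
```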
jkdev