
I'm new to Python and trying to recursively upload the .gz contents of all of the tar files in a directory (and the directories inside of it) to S3. That's around 1,350,000 tar files in total.

I don't have the space to untar all of the files at once, so I am doing them one at a time.
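Roughly, what I'm going for per tar file is this (a stripped-down sketch, not the real script; bucket, local_bucket and s3path are the same names used in the full code further down):

import os
import tarfile

# bucket, local_bucket and s3path are set up earlier in the full script
for dirpath, subdirs, files in os.walk(local_bucket):
    for fname in files:
        if not fname.endswith('.tar'):
            continue
        tarpath = os.path.join(dirpath, fname)
        # extract only this one tar so just one batch of .gz files is on disk
        with tarfile.open(tarpath, 'r') as tarball:
            tarball.extractall(path=dirpath)
            members = tarball.getmembers()
        # upload each extracted .gz to S3, then delete it to free the space
        for member in members:
            local_file = os.path.join(dirpath, member.name)
            key = bucket.new_key(s3path + '/' + member.name)
            key.set_contents_from_filename(local_file)
            os.remove(local_file)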

Originally my script worked, but once it hit an error (a corrupt tar file) it ended. I added a bunch of try/except clauses to record these errors and continue past them, and now my script doesn't seem to be uploading the files even though I am getting output like the following:

('iter: ', '/srv/nfs/storage/bucket/2015/11/20/KFWS/NWS_NEXRAD_NXL2DP_KFWS_20151120130000_20151120135959.tar')
KFWS20151120_130030_V06.gz
('singlepart', <Bucket: bucket>, '/srv/nfs/storage/bucket/2015/11/20/KFWS/KFWS20151120_130030_V06.gz', '2015/11/20/KFWS/KFWS20151120_130030_V06.gz')
('single_part: ', '2015/11/20/KFWS/KFWS20151120_130030_V06.gz', '/srv/nfs/storage/bucket/2015/11/20/KFWS/KFWS20151120_130030_V06.gz')
KFWS20151120_131000_V06.gz
('iter: ', '/srv/nfs/storage/bucket/2015/11/20/KFWS/NWS_NEXRAD_NXL2DP_KFWS_20151120110000_20151120115959.tar')
KFWS20151120_110630_V06.gz
('singlepart', <Bucket: bucket>, '/srv/nfs/storage/bucket/2015/11/20/KFWS/KFWS20151120_110630_V06.gz', '2015/11/20/KFWS/KFWS20151120_110630_V06.gz')
('single_part: ', '2015/11/20/KFWS/KFWS20151120_110630_V06.gz', '/srv/nfs/storage/bucket/2015/11/20/KFWS/KFWS20151120_110630_V06.gz')
KFWS20151120_111601_V06.gz

It shows that it is getting to single_part, which, to me, means it is at least running the singlept function and trying to upload an object, but neither Zimport_errors.list nor Znoaa_nexrad_files.list is ever created and I do not see any new objects in the bucket.

The full code is below (sorry in advance for how gross it is; I'm trying to teach myself Python and am only a few weeks in):

http://pastebin.com/X56FHDaa

Here is the main block, though:

def singlept(bucket, keyname, local_file):
    retries = 0
    key_size = 0
    local_size = os.path.getsize(local_file)
    while retries <= 4 and local_size != key_size:
        local_md5 = md5file(local_file=local_file)
        print('single_part: ', keyname, local_file)
        try:
            key = bucket.new_key(keyname)
        except Exception:
            print('couldn\'t create key: ', keyname)
            pass
        try:
            key.set_contents_from_filename(local_file)
            key_size = key.size
            with open(successfile, 'ab') as f:
                f.write('\n')
                f.write(str(local_file + ',' + keyname + ',' + str(key_size) + ',' + str(local_size)))
        except Exception:
            print('couldn\'t upload file: ', local_file, ' as key: ', keyname)
            with open(errorfile, 'ab') as f:
                f.write('\n')
                f.write(str(local_file + ',' + keyname + ',' + str(key_size) + ',' + str(local_size)))
            pass


for dir, subdir, files in os.walk(local_bucket):
    s3path = "/".join(str(dir).split('/')[5:])
    local_path = str(local_bucket + '/' + s3path)
    for fname in files:
        if fname.endswith("tar"):
            fullpath = local_path + '/' + fname
            if (debug):
                print('iter: ',fullpath)
            with tarfile.open(fullpath, 'r') as tarball:
                zips = tarball.getmembers()
                try:
                    tarball.extractall(path=local_path)
                except Exception:
                    with open(errorfile, 'ab') as f:
                        f.write('\n')
                        f.write(str(fullpath + ',' + str(os.path.getsize(fullpath))))
                    continue
            for zip in zips:
                if (debug):
                    print(zip.name)
                local_file = local_path + '/' + zip.name
                keyname = s3path + '/' + zip.name
                try:
                    if zip.size >= 1073741824:
                        if (debug):
                            print('multipart',bucket, local_file, keyname)
                        multipt(bucket, local_file, keyname)
                    else:
                        if (debug):
                            print('singlepart',bucket, local_file, keyname)
                        singlept(bucket, keyname, local_file)
                except Exception:
                    with open(errorfile, 'ab') as f:
                        f.write('\n')
                        f.write(str(local_file + "," + keyname))
                    continue
                if local_file.endswith("gz"):
                    try:
                        os.remove(local_file)
                    except Exception:
                        print('couldn\'t remove file: ', local_file)
                        continue

Thank you so much in advance for any help! I'm pulling my hair out!

Edit -- added the code directly and hopefully fixed the indents! It looks right in Atom but isn't pasting correctly. :-/

Lookcrabs

1 Answer


except Exception only catches exceptions that derive from Exception - things like KeyboardInterrupt and SystemExit derive from BaseException instead and slip straight past it (see https://stackoverflow.com/a/18982726/264822). You should try:

    try:
        key = bucket.new_key(keyname)
    except:
        print('couldn\'t create key: ', keyname)
        pass
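For example, here is a quick standalone snippet (unrelated to the S3 code, just to illustrate the difference): KeyboardInterrupt derives from BaseException rather than Exception, so only the bare except clause catches it.

    # "except Exception" misses exceptions that derive only from BaseException,
    # such as KeyboardInterrupt and SystemExit; a bare "except:" catches them too.
    try:
        raise KeyboardInterrupt
    except Exception:
        print('caught by "except Exception"')  # never reached
    except:
        print('caught by bare "except"')       # this is what runs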
parsley72