My question is related to copying files from one AWS S3 bucket/folder to another S3 folder while keeping the deepest sub-folder name, using Python on Databricks.
Is it possible to have a user upload a folder structure (all of its folders/files) from their local machine to a temporary location, modify the object key names/paths (move, copy, delete?), and then permanently upload the result to an S3 bucket? I'm mainly looking for better resources to read and learn from.
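For context, here is roughly the workflow I have in mind, as a minimal sketch using boto3. The bucket names, prefixes, and the finalize function are placeholders I made up, not an existing design:

    import boto3

    s3 = boto3.client("s3")

    STAGING_BUCKET = "my-staging-bucket"   # hypothetical temporary landing area
    FINAL_BUCKET = "my-permanent-bucket"   # hypothetical permanent bucket

    def finalize(staging_prefix: str, final_prefix: str) -> None:
        """Copy every object under staging_prefix to final_prefix, then delete the originals.
        S3 has no real move/rename, so each 'move' is a copy to the new key plus a delete."""
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=STAGING_BUCKET, Prefix=staging_prefix):
            for obj in page.get("Contents", []):
                old_key = obj["Key"]
                # Rewrite the key: swap the staging prefix for the user's chosen final prefix.
                new_key = final_prefix + old_key[len(staging_prefix):]
                # Note: copy_object handles objects up to 5 GB; larger ones need a multipart copy.
                s3.copy_object(
                    Bucket=FINAL_BUCKET,
                    Key=new_key,
                    CopySource={"Bucket": STAGING_BUCKET, "Key": old_key},
                )
                s3.delete_object(Bucket=STAGING_BUCKET, Key=old_key)

    # e.g. finalize("uploads/session-123/", "user-chosen-category/")

Is this copy-then-delete pattern the right mental model, or is there a better-supported AWS feature for this?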
I'm developing a web application, with a third party, using Python, where a user can upload a single file, a single folder of files, or a single folder of folders/files. They can then place these uploads into user-defined categories/subcategories via metadata.
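For the metadata part, I'm assuming something like boto3's put_object with a user-defined Metadata dict; the bucket, key, and metadata keys below are just placeholders:

    import boto3

    s3 = boto3.client("s3")

    with open("report.pdf", "rb") as body:
        s3.put_object(
            Bucket="my-permanent-bucket",
            Key="Folder Name A/report.pdf",
            Body=body,
            # User-defined metadata is stored with the object as x-amz-meta-* headers.
            Metadata={"category": "invoices", "subcategory": "2023"},
        )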
Regarding single file(s): I understand S3 will create a 'folder' in the object path (really just a key prefix). That main root 'folder' name will need to be changed in all three upload scenarios.
Single File(s) Example:
Folder Name A/File A,B,C
-Rename Folder Name A.
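My rough understanding is that this "rename" is really just rewriting the leading segment of each object key before (or while) copying; a tiny sketch of that rewrite, with a made-up function name:

    def rename_root_folder(key: str, new_name: str) -> str:
        """'Folder Name A/File A' -> '<new_name>/File A'."""
        _, _, rest = key.partition("/")
        return f"{new_name}/{rest}"

    # rename_root_folder("Folder Name A/File B", "Invoices")  ->  "Invoices/File B"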
Regarding a single folder of folders/files: the main root folder needs to be eliminated, leaving the first-level folders as individual single folders when they move into the permanent S3 bucket. This is a bulk option for uploading many folders at once, versus the 'single folder' case. The folder structure downstream of the first-level folders needs to stay intact and does not need to be altered.
Upload single folder of folders/files example:
Local Root Folder/Local Folder Name A/.../File A,B,C
Local Root Folder/Local Folder Name 1/.../File 1,2,3
Local Root Folder/Local Folder Name X/.../File X,Y,Z
-Eliminate *Local Root Folder* and rename *Local Folder Names A & 1 & X*
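Similarly, my assumption is that dropping the root folder and renaming the first-level folders is just string surgery on the keys before the final copy; a small sketch with made-up function names:

    def drop_root_folder(key: str) -> str:
        """'Local Root Folder/Local Folder Name A/sub/File A' -> 'Local Folder Name A/sub/File A'."""
        _, _, rest = key.partition("/")
        return rest

    def rename_first_level_folder(key: str, new_name: str) -> str:
        """After dropping the root, rename the (now) first-level folder; everything below it is untouched."""
        rest = drop_root_folder(key)
        _, _, tail = rest.partition("/")
        return f"{new_name}/{tail}"

    # rename_first_level_folder("Local Root Folder/Local Folder Name A/sub/File A", "Category X")
    # -> "Category X/sub/File A"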
Any pointers to documentation or AWS features that could help explain the fundamentals behind this challenge would be much appreciated! Thanks!