# intelligent-s3-upload

Upload files or folders (including subfolders) to Amazon S3 in a fully automated way, taking advantage of:

- Amazon S3 Multipart Upload: Uploaded files are transparently processed in parts, improving throughput and enabling quick recovery from network issues.
- Resilient Retry System: Intelligent S3 Upload detects errors during the upload process and retries whenever necessary.
- User-Friendly Interface: Just check the demo to see with your own eyes how the upload process is performed.
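The multipart and retry behavior described above can be sketched roughly as follows. Note that `plan_parts`, `with_retries`, and the 8 MiB part size are illustrative assumptions, not the tool's actual internals:

```python
import math
import time

def plan_parts(file_size: int, part_size: int = 8 * 1024 * 1024) -> int:
    # Number of parts a multipart upload would split the file into.
    # The 8 MiB default is an assumption, not the tool's actual value.
    return max(1, math.ceil(file_size / part_size))

def with_retries(action, attempts: int = 5, base_delay: float = 1.0):
    # Retry `action` with exponential backoff, as a resilient uploader might
    # do after a transient network failure.
    for attempt in range(attempts):
        try:
            return action()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

For example, a 20 MiB file would be uploaded as three parts under the assumed part size, and a part that fails twice due to network issues would still succeed on the third attempt.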
## Installation

Clone the repository:

```shell
$ git clone https://github.com/polius/intelligent-s3-upload.git
```

Install the dependencies:

```shell
$ pip3 install boto3 --user
```
## Setup

Before running Intelligent S3 Upload, modify the credentials.json file:

```json
{
    "aws_access_key_id": "",
    "aws_secret_access_key": "",
    "region_name": "",
    "bucket_name": "",
    "bucket_prefix": "",
    "storage_class": "",
    "skip_s3_existing_files": true
}
```
- aws_access_key_id | aws_secret_access_key: Credentials generated by Amazon IAM.
- region_name: The AWS region code where the bucket is located.
- bucket_name: The name of the bucket created in Amazon S3.
- bucket_prefix: (Optional) The bucket path in which to store the uploaded objects.
- storage_class: The storage class to use for the uploaded objects. These are the possible values:

| storage_class       |
|---------------------|
| STANDARD            |
| REDUCED_REDUNDANCY  |
| STANDARD_IA         |
| ONEZONE_IA          |
| INTELLIGENT_TIERING |
| GLACIER             |
| DEEP_ARCHIVE        |
- skip_s3_existing_files: Skip uploading objects that already exist in S3. Possible values: [ true | false ]
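As a sketch of what a minimal sanity check on credentials.json could look like before uploading, the following helper validates the required keys and the storage class. This `validate_credentials` function is illustrative and not part of the tool:

```python
import json

# Keys the uploader needs; bucket_prefix is optional per the docs above.
REQUIRED_KEYS = {
    "aws_access_key_id",
    "aws_secret_access_key",
    "region_name",
    "bucket_name",
    "storage_class",
    "skip_s3_existing_files",
}

# Storage classes listed in the table above.
VALID_STORAGE_CLASSES = {
    "STANDARD", "REDUCED_REDUNDANCY", "STANDARD_IA", "ONEZONE_IA",
    "INTELLIGENT_TIERING", "GLACIER", "DEEP_ARCHIVE",
}

def validate_credentials(raw: str) -> dict:
    # Parse the JSON and check the required keys and the storage class value.
    config = json.loads(raw)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    if config["storage_class"] not in VALID_STORAGE_CLASSES:
        raise ValueError(f"invalid storage_class: {config['storage_class']}")
    return config
```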
## Execution

```shell
$ python3 upload.py --path "{PATH}"
```

Replace {PATH} with the absolute path of the file or folder to upload to Amazon S3.
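When a folder is uploaded, each file's S3 object key is typically derived from its path relative to the upload root, placed under bucket_prefix if one is set. A hypothetical sketch of that mapping (not the tool's exact logic):

```python
from pathlib import Path, PurePosixPath

def build_s3_key(base_path: str, file_path: str, bucket_prefix: str = "") -> str:
    # Derive an S3 object key from the file's path relative to the upload
    # root, optionally under bucket_prefix. S3 keys always use "/" as the
    # separator, hence PurePosixPath.
    relative = Path(file_path).relative_to(Path(base_path))
    key = PurePosixPath(*relative.parts)
    if bucket_prefix:
        return str(PurePosixPath(bucket_prefix) / key)
    return str(key)
```

For example, uploading the folder `/data` with bucket_prefix `backups` would store `/data/sub/a.txt` under the key `backups/sub/a.txt`.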