
Can someone please help me with authentication while moving data from HDFS to S3? To connect to S3, I generate session-based credentials with aws_key_gen (an access key, a secret key, and a session token).

I have tested this: distcp works fine with a permanent access/secret key pair, but fails with the session-based keys. I also tested the session-based credentials from Python and was able to list the bucket's contents.

The code is attached below (keys and bucket name changed).

-----------------------------------
-- Below Python code works fine
-----------------------------------
import boto3
session = boto3.Session(
    aws_access_key_id='ASIA123456789G7MPE3N',
    aws_secret_access_key='Nsz7d05123o456o789o0UdVRQWa7y7i3MED2L6/u',
    aws_session_token='FQoGZXIvYXdzEPr//////////wEa123o345o567ohytzmnAAj7YnHgnjfhAmrsdUmTFRZSoZmgeIKKF3dY+/ZzQadteRRSitmq+/llvnWBlA1WHfneMaN/yOawAAO2aSLjkLXZsC2G0Gtt+dcmS9zhy8ye+FfDppODc3yiBoYfpmOuXfMqbyDt3nnYe3Hlq44DWS7wqIb72X+s2ebiNghNWxyD1VJM1qT68/OIUYrjarNDGWhDCKRU21Sjqk4FWgwSUX5f5cIoTwvnhAkFwwD8TIRt5sFgMEfDrBjIj22oILF5xrfaDRr3hc3dLKb7jZUxMWWSCbQZXA5sGE78/UazA8ufEAKPVkWdYi+q39RvR9K2mjrWD1jc6cCrj+ScWCJ+CfWcoVev/QtHqu4WHYfORfinuZUEHLOTIwU/Gz83UdQ1KMvi39wF'
)
s3 = session.resource('s3')
my_bucket = s3.Bucket('mybucket')
# List every object in the bucket to confirm the credentials work
for obj in my_bucket.objects.all():
    print(obj)


-----------------------------------------
-- Below distcp gives a 403 Forbidden error
-----------------------------------------

AWS_ACCESS_KEY_ID='ASIA123456789G7MPE3N'
AWS_SECRET_ACCESS_KEY='Nsz7d05123o456o789o0UdVRQWa7y7i3MED2L6/u'
AWS_SESSION_TOKEN='FQoGZXIvYXdzEPr//////////wEa123o345o567ohytzmnAAj7YnHgnjfhAmrsdUmTFRZSoZmgeIKKF3dY+/ZzQadteRRSitmq+/llvnWBlA1WHfneMaN/yOawAAO2aSLjkLXZsC2G0Gtt+dcmS9zhy8ye+FfDppODc3yiBoYfpmOuXfMqbyDt3nnYe3Hlq44DWS7wqIb72X+s2ebiNghNWxyD1VJM1qT68/OIUYrjarNDGWhDCKRU21Sjqk4FWgwSUX5f5cIoTwvnhAkFwwD8TIRt5sFgMEfDrBjIj22oILF5xrfaDRr3hc3dLKb7jZUxMWWSCbQZXA5sGE78/UazA8ufEAKPVkWdYi+q39RvR9K2mjrWD1jc6cCrj+ScWCJ+CfWcoVev/QtHqu4WHYfORfinuZUEHLOTIwU/Gz83UdQ1KMvi39wF'
AWS_CREDENTIALS_PROVIDER='org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider'
# Attempt 1: pass the session token only
hadoop distcp -Dfs.s3a.access.key="${AWS_ACCESS_KEY_ID}" -Dfs.s3a.secret.key="${AWS_SECRET_ACCESS_KEY}" -Dfs.s3a.session.token="${AWS_SESSION_TOKEN}" 1.csv s3a://mybucket/temp
# Attempt 2: also set the temporary-credentials provider
hadoop distcp -Dfs.s3a.access.key="${AWS_ACCESS_KEY_ID}" -Dfs.s3a.secret.key="${AWS_SECRET_ACCESS_KEY}" -Dfs.s3a.session.token="${AWS_SESSION_TOKEN}" -Dfs.s3a.aws.credentials.provider="${AWS_CREDENTIALS_PROVIDER}" 1.csv s3a://mybucket/temp
Manu Batham

1 Answer

  1. Session key support only went in with Hadoop 2.8, so if you are using an earlier version: no joy.
  2. On Hadoop 2.8+ it should work. Try cloudstore and the hadoop fs commands before worrying about distcp, as distcp adds extra trouble.
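That smoke test might look roughly like this (a sketch only: the bucket name and credential values are placeholders, and it assumes Hadoop 2.8+ with the S3A connector on the classpath — `hadoop fs` accepts `-D` generic options the same way distcp does):

```shell
# Short-lived credentials from aws_key_gen (placeholder values)
export AWS_ACCESS_KEY_ID='ASIA...'
export AWS_SECRET_ACCESS_KEY='...'
export AWS_SESSION_TOKEN='...'

# Simple read test before attempting distcp.
# TemporaryAWSCredentialsProvider tells the S3A connector to send the
# session token along with the access/secret key pair; without it the
# token is ignored and S3 returns 403 Forbidden.
hadoop fs \
  -Dfs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider \
  -Dfs.s3a.access.key="${AWS_ACCESS_KEY_ID}" \
  -Dfs.s3a.secret.key="${AWS_SECRET_ACCESS_KEY}" \
  -Dfs.s3a.session.token="${AWS_SESSION_TOKEN}" \
  -ls s3a://mybucket/
```

If that `-ls` works, the same four `-D` properties should carry over to the distcp command unchanged.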
stevel