
Being a beginner with AWS and S3, I am trying to upload a Sample.txt file from my local SFTP server to S3 using Apache Camel:

<route>
    <from uri="sftp://testuser@localhost?password=test&amp;delete=true" />
    <setHeader name="CamelAwsS3Key">
        <constant>test</constant>
    </setHeader>
    <to uri="aws-s3://myTestBucket?accessKey=******&amp;secretKey=RAW(******)&amp;deleteAfterWrite=false&amp;region=AP_SOUTH_1" />
</route>

This works, but the file is always uploaded with the name `test`, and the content type is also not shown. I have tried multiple approaches. Any suggestions would be helpful.

codeforHarman
  • Well, you set the filename to be `test` (the `CamelAwsS3Key` header). About the type - you mean the Content-Type object property? – gusto2 Sep 01 '21 at 07:41
  • @gusto2 I added the header key because it was shown as a mandatory value for the route. Type, yes. I thought the system would auto-identify the type. Is there any alternative I could provide? – codeforHarman Sep 01 '21 at 07:42
  • You can set the header using the `simple` expression to the file name (I'm not sure where the SFTP component stores the file name), and you may try to set the `CamelAwsS3ContentType` header too: http://people.apache.org/~dkulp/camel/aws-s3.html – gusto2 Sep 01 '21 at 07:48

1 Answer


The issue was resolved by setting the key with a `simple` expression that uses the file name provided by SFTP:

<route>
    <from uri="sftp://testuser@localhost?password=test&amp;delete=true" />
    <setHeader name="CamelAwsS3Key">
        <simple>${in.header.camelFileName}</simple>
    </setHeader>
    <to uri="aws-s3://myTestBucket?accessKey=******&amp;secretKey=RAW(******)&amp;deleteAfterWrite=false&amp;region=AP_SOUTH_1" />
</route>

Thanks to @gusto2 for the support.
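
For the content-type part, as gusto2 suggested in the comments, you can also set the `CamelAwsS3ContentType` header before the `to` endpoint. A minimal sketch, assuming the uploaded files are plain text (adjust the MIME type to match your files):

<setHeader name="CamelAwsS3ContentType">
    <!-- assumed value; use the MIME type that matches your files -->
    <constant>text/plain</constant>
</setHeader>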

Also, as an addition, you can upload the file into a specific folder by changing the key to:

<simple>{foldername}/${in.header.camelFileName}</simple>
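
Note that S3 has no real folders: the `{foldername}/` prefix simply becomes part of the object key, so the file will show up under that path in the S3 console.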
codeforHarman