
I have an iOS project using Amplify as a backend. I have also incorporated Amplify Video in the hope of supporting video-on-demand. After adding Amplify Video to the project, an "Input" and an "Output" bucket are generated. These appear outside of my project environment when visualised via the Amplify Console, and can only be accessed by navigating to the AWS S3 console. My question is: how do I upload my videos via Swift to the "Input" bucket via Amplify (or do I not)? The code I have below uploads the video to the S3 bucket within the project environment. There is next to no support for Amplify Video for iOS (Amplify Video Documentation)

if let vidData = self.convertVideoToData(from: srcURL) {
    let key = "myKey"
    // let options = StorageUploadDataRequest.Options(accessLevel: .protected)
    Amplify.Storage.uploadData(key: key, data: vidData) { progress in
        print(progress.fractionCompleted)
    } resultListener: { result in
        switch result {
        case .success:
            print("upload success!")
        case .failure(let error):
            print(error.errorDescription)
        }
    }
}
johnDoe

1 Answer


I'm facing the same issue. As far as I can tell, the iOS Amplify library's amplifyconfiguration.json is limited to one storage spec under S3TransferUtility.
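
For context, that storage entry looks roughly like this (bucket name and region are placeholders), and there's only one Default slot, so only one bucket:

"S3TransferUtility": {
    "Default": {
        "Bucket": "my-project-bucket",
        "Region": "us-east-1"
    }
}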

I'm in the process of solving this issue myself, but the quick solution is to modify the created AWS video resources to run off the same bucket for both input and output. Be warned: I'm an iOS engineer, not a backend one, and only getting familiar with AWS.

Solution as follows:

  • The input bucket the amplify video plugin created has 4 event notifications under the Properties tab. Each of these kicks off a VOD-inputWatcher lambda function. Copy these 4 notifications to your original bucket (a sketch of the JSON shape follows this list)
  • The output bucket has two event notifications; copy those to the original bucket as well
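
If you'd rather script the copy than click through the console, each notification looks roughly like this via the S3 notification-configuration API. The Id, ARN, and suffix below are placeholders - presumably the plugin registers one entry per video extension it watches, so check what yours actually contains:

{
  "LambdaFunctionConfigurations": [
    {
      "Id": "vod-input-watcher-mp4",
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:VOD-inputWatcher",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [
            { "Name": "suffix", "Value": ".mp4" }
          ]
        }
      }
    }
  ]
}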

Try the process now: drop a video into your bucket manually. It will fail, but we'll see progress - the MediaConvert job is kicked off, but it will tell you it failed because it didn't have permission to read the files in your bucket, something like Unable to open input file, Access Denied. Let's solve this:

  • Go to the input lambda function and add this function:
async function enableACL(eventObject) {
  console.log(eventObject);
  const objectKey = eventObject.object.key;
  const bucketName = eventObject.bucket.name;
  const params = {
    Bucket: bucketName,
    Key: objectKey,
    ACL: 'public-read',
  };
  console.log(`params: ${JSON.stringify(params)}`);
  try {
    // Use the promise form so the async handler actually waits for the
    // ACL call to finish before the lambda returns.
    const data = await s3.putObjectAcl(params).promise();
    console.log("successfully set acl");
    console.log(data);
  } catch (err) {
    console.log("failed to set ACL");
    console.log(err);
  }
}

Now call it from the event handler, and don't forget to add const s3 = new AWS.S3({}); at the top of the file:

// At the top of the file, alongside the existing requires:
// const s3 = new AWS.S3({});

exports.handler = async (event) => {
  // Set the region
  AWS.config.update({ region: event.awsRegion });
  console.log(event);
  if (event.Records[0].eventName.includes('ObjectCreated')) {
    await enableACL(event.Records[0].s3);
    await createJob(event.Records[0].s3); // createJob already exists in the template
    const response = {
      statusCode: 200,
      body: JSON.stringify(`Transcoding your file: ${event.Records[0].s3.object.key}`),
    };
    return response;
  }
};

Try the process again. The lambda will fail; you can see it in the lambda's CloudWatch logs: failed to set ACL. INFO AccessDenied: Access Denied at Request.extractError. To fix this we need to give the input lambda function S3 permissions.

Do that by navigating to the lambda function's Configuration / Permissions and finding the Role. Open it in IAM and add full S3 access. Not ideal, but again, I'm just trying to make this work; it would be better to allow only the exact bucket and the correct actions. Any help regarding proper roles greatly appreciated :)

Repeat the same for the output lambda function's role, giving it the right S3 permissions; a scoped policy sketch follows.
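
If you'd rather not grant full S3 access, an inline policy along these lines should cover the S3 calls these lambdas make - a sketch only, with a placeholder bucket name; I haven't audited every call the template makes, so treat it as a starting point:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}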

Try uploading a file again. At this point, if you run into this error: failed to set ACL. INFO NoSuchKey: The specified key does not exist. at Request.extractError, it's because the objects in your bucket are under the protected folder. Use the public folder instead (in the iOS lib you'll have to use StorageAccessLevel.guest permissions to access it).
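
On the iOS side that means uploading with .guest access so the object lands under public/. A minimal sketch based on the question's code (convertVideoToData(from:) and srcURL are the asker's own helpers; the key is hypothetical):

if let vidData = self.convertVideoToData(from: srcURL) {
    // .guest writes to public/ instead of protected/{identityId}/
    let options = StorageUploadDataRequest.Options(accessLevel: .guest)
    Amplify.Storage.uploadData(key: "myVideo.mp4", data: vidData, options: options) { progress in
        print(progress.fractionCompleted)
    } resultListener: { result in
        switch result {
        case .success:
            print("upload success!")
        case .failure(let error):
            print(error.errorDescription)
        }
    }
}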

Now drop a file into the public folder. You should see the MediaConvert job kick off again. It will still fail (check MediaConvert / Jobs), saying it doesn't have permission to write to the S3 bucket: Unable to write to output file .. . You can fix this by going back to the input lambda function; this is the code that hands a role to the MediaConvert job:

const jobParams = {
  JobTemplate: process.env.ARN_TEMPLATE,
  Queue: queueARN,
  UserMetadata: {},
  Role: process.env.MC_ROLE,
  Settings: jobSettings,
};
await mcClient.createJob(jobParams).promise();

Go to the input lambda function's Configuration / Environment Variables. The MC_ROLE variable holds the name of the role the function passes to the MediaConvert job. Copy that role name, look it up in IAM, and add the right S3 access to your bucket.
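
Scoped down, the MediaConvert role needs to read the input file and write the transcoded output - again a sketch with a placeholder bucket name, not a vetted policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
    }
  ]
}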

If you try it one more time, the output should appear right next to your input file.

In order to be able to read the s3://public/{userIdentityId}/{videoName}/{videoName}{quality}..m3u8 file using the current Amplify.Storage.downloadFile(key: {key}, ...) function in iOS, you'll probably have to attach the right path to the key and remove the .mp4 extension. Let me know if you're facing any problems; I'm sorting this out now also. A rough sketch of the download follows.
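
A sketch, not verified end-to-end: build the manifest key from what you actually see in your bucket. videoName and the quality suffix below are hypothetical - check the exact names MediaConvert produced for you:

// Hypothetical names: uploaded as public/myVideo.mp4, transcoded with a
// "_720" rendition; mirror whatever layout appears in your bucket.
let videoName = "myVideo"
let quality = "_720"
let key = "\(videoName)/\(videoName)\(quality)..m3u8"

let options = StorageDownloadDataRequest.Options(accessLevel: .guest)
Amplify.Storage.downloadData(key: key, options: options) { progress in
    print(progress.fractionCompleted)
} resultListener: { result in
    switch result {
    case .success(let manifest):
        print("got manifest: \(manifest.count) bytes")
    case .failure(let error):
        print(error.errorDescription)
    }
}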

Dharman