I am trying to automate Fargate AWS Batch jobs via AWS CloudWatch Events. So far, so good. I want to run the same job definition with different configurations. I am able to set the Batch job as a CloudWatch event target, and I have learned how to use the Constant (JSON text) configuration to set a parameter of the job. With that, I can set the name parameter successfully and the job runs. However, I am not able to also set the memory and vCPU settings in the CloudWatch event. I would like to use a larger machine for a bigger port such as Singapore, without changing the job definition. As it stands, the job still uses the default vCPU and memory settings of the job definition. This is the Constant (JSON text) I am using:

{
    "Parameters": {"name": "wilhelmshaven"},
    "ContainerOverrides": {
        "Command": ["upload_to_day.py", "-port_name", "Ref::name"],
        "resourceRequirements": [
            {"type": "MEMORY", "value": "4096"},
            {"type": "VCPU", "value": "2"}
        ]
    }
}
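
As far as I can tell, the Constant (JSON text) box is simply what ends up in the event target's Input field, so the same configuration should be expressible through the PutTargets API. Here is a boto3 sketch of what I believe the console produces; the rule name, target Id, and role ARN are placeholders I made up, while the ARNs and the payload match the ones above:

import boto3
import json

events = boto3.client("events", region_name="eu-central-1")

# The Constant (JSON text) payload from the console, as a Python dict.
payload = {
    "Parameters": {"name": "wilhelmshaven"},
    "ContainerOverrides": {
        "Command": ["upload_to_day.py", "-port_name", "Ref::name"],
        "resourceRequirements": [
            {"type": "MEMORY", "value": "4096"},
            {"type": "VCPU", "value": "2"},
        ],
    },
}

events.put_targets(
    Rule="upload-to-day-schedule",  # placeholder rule name
    Targets=[
        {
            "Id": "upload-to-day-target",  # placeholder target Id
            "Arn": "arn:aws:batch:eu-central-1:123666072061:job-queue/upload-raw-to-day-vtexplorer",
            "RoleArn": "arn:aws:iam::123666072061:role/events-batch-role",  # placeholder role
            "BatchParameters": {
                "JobDefinition": "arn:aws:batch:eu-central-1:123666072061:job-definition/upload-to-day:2",
                "JobName": "run-wilhelmshaven",
            },
            # This Input string is what the Constant (JSON text) box holds.
            "Input": json.dumps(payload),
        }
    ],
)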

Does anyone know how to set the Constant (JSON text) configuration or the input transformer correctly?

Edit: When I try the same thing with the AWS CLI, I can achieve exactly what I want:

aws batch submit-job \
        --job-name "run-wilhelmshaven" \
        --job-queue "arn:aws:batch:eu-central-1:123666072061:job-queue/upload-raw-to-day-vtexplorer" \
        --job-definition "arn:aws:batch:eu-central-1:123666072061:job-definition/upload-to-day:2" \
        --container-overrides '{"command": ["upload_to_day.py", "-port_name","wilhelmshaven"], "resourceRequirements": [{"value": "2", "type": "VCPU"}, {"value": "4096", "type": "MEMORY"}]}'
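
For completeness, the boto3 equivalent of that CLI call should behave identically, since it hits the same SubmitJob API (same ARNs as above; the client setup is plain boto3):

import boto3

batch = boto3.client("batch", region_name="eu-central-1")

response = batch.submit_job(
    jobName="run-wilhelmshaven",
    jobQueue="arn:aws:batch:eu-central-1:123666072061:job-queue/upload-raw-to-day-vtexplorer",
    jobDefinition="arn:aws:batch:eu-central-1:123666072061:job-definition/upload-to-day:2",
    containerOverrides={
        "command": ["upload_to_day.py", "-port_name", "wilhelmshaven"],
        # Per-job Fargate sizing; values are strings and must form a
        # supported vCPU/memory combination.
        "resourceRequirements": [
            {"type": "VCPU", "value": "2"},
            {"type": "MEMORY", "value": "4096"},
        ],
    },
)
print(response["jobId"])

So SubmitJob clearly accepts per-job resourceRequirements overrides; I just cannot find the input shape that makes the CloudWatch Events target pass them through.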