My program creates folders under "~/" when the OS is Linux (which is the case here, since it runs on an ECS instance):
import os
import pathlib
from sys import platform

if platform == "linux" or platform == "linux2":
    appdata = "~/"  # on Linux (the ECS case), write under the home directory
else:
    appdata = os.getenv("APPDATA")  # on Windows, fall back to %APPDATA%

# parsed_args comes from argparse elsewhere in the program
log_path = appdata + f"/concil/{parsed_args.acquirer}/logs"
cache_path = appdata + f"/concil/{parsed_args.acquirer}/cache"
pathlib.Path(log_path).mkdir(parents=True, exist_ok=True)
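For reference, here is a small debugging sketch (not part of the actual code; the log_path variable comes from the snippet above) that I could drop in right after the mkdir call to see where the directory really ends up, since pathlib does not expand "~" on its own:

import pathlib

# pathlib treats "~" as a literal directory name, so a relative "~/..." path
# resolves against the current working directory (e.g. /app/~/... inside the container)
resolved = pathlib.Path(log_path).resolve()
expanded = pathlib.Path(log_path).expanduser().resolve()
print(f"raw path:        {log_path}", flush=True)
print(f"resolved path:   {resolved}", flush=True)
print(f"with ~ expanded: {expanded}", flush=True)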
When I connect to the ECS instance via SSH, even though the task definition is running, I can't find any of these files (and the software should be creating them normally). What I find odd is that in my Dockerfile:
FROM python:3
COPY requirements.txt /tmp
WORKDIR /tmp
RUN pip install -r requirements.txt --default-timeout=100
COPY . /app
RUN make /app
WORKDIR /app
ENV AWS_ACCESS_KEY_ID=??????????????????
ENV AWS_SECRET_ACCESS_KEY=????????????????????????????????????
ENV AWS_DEFAULT_REGION=us-west-1
CMD [ "python", "./InputDataController/main.py", "--acquirer", "adyen", "-all" ]
I copy the project contents into an "/app" folder, which doesn't exist anywhere on the ECS instance itself. I can't see any Docker-related logs, and I can't see my program's output, yet the task appears to be running normally. How can I check the files being saved inside the container?
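In case it matters, this is a minimal sketch (the helper name log_startup_info is just illustrative, not something that exists in my code today) of the startup logging I'm considering adding to main.py so the runtime environment at least shows up on stdout:

import os
import pathlib
import sys

def log_startup_info() -> None:
    """Illustrative helper: report the runtime environment on stdout at startup."""
    print(f"python:   {sys.version.split()[0]}", flush=True)
    print(f"platform: {sys.platform}", flush=True)
    print(f"cwd:      {os.getcwd()}", flush=True)
    print(f"home:     {pathlib.Path.home()}", flush=True)

if __name__ == "__main__":
    log_startup_info()

Since these prints go to stdout with flush=True, I'd expect them to appear in the container's log stream (whatever log driver the task uses) rather than anywhere on the instance's filesystem.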