I am running a Java app hosted on a Tomcat server inside a Docker container on AWS EC2 (ECS). Recently the service keeps getting restarted after running for a while. Checking the ECS agent log shows an OutOfMemoryError.
Logs:
Level: INFO
Message: DockerGoClient: Process within container 02118442aba47dbe536651b90ca1cd64c43ec419ea949dbe8ca3797f0df1dd71 (name: "my-java-app10-f8e5abadedfbd0c9d901") died due to OOM
Module: docker_client.go

Level: INFO
Message: Handling container change event
Task: 3825f1c6-0d3d-4771-bed7-15332dab4ad9
Container: service-layer
RuntimeID: 02118442aba47dbe536651b90ca1cd64c43ec419ea949dbe8ca3797f0df1dd71
Status: STOPPED

Level: WARN
Message: Error stopping the container; marking it as stopped anyway
Task: 3825f1c6-0d3d-4771-bed7-15332dab4ad9
Container: service-layer
RuntimeID: 02118442aba47dbe536651b90ca1cd64c43ec419ea949dbe8ca3797f0df1dd71
ErrorName: OutOfMemoryError
Error: Container killed due to memory usage
I am not able to figure out which Java thread/process is consuming the memory. I tried the following:
docker inspect <containerName>
docker ps --all (the container shows as Exited (137))
I also checked the application logs, but could not find any specific calls that stand out.
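I was also planning to watch the container's memory usage over time; the commands below are just my sketch and assume docker stats / docker exec behave the same on the ECS instance and that ps is available in the image:

# live memory usage of the running container, to see whether it climbs steadily
docker stats <containerName>

# confirm the hard memory limit ECS applied to the container (in bytes)
docker inspect --format '{{.HostConfig.Memory}}' <containerName>

# list processes inside the container sorted by resident memory
docker exec <containerName> ps -eo pid,rss,comm --sort=-rss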
Can anyone give me a clue how I should check which Java thread/process is running constantly and consuming memory? Note that the container's max memory limit is 1 GB, and the app was working fine with this limit earlier.
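For context, this is roughly what I was planning to try next, but have not verified yet: cap the JVM below the container limit and enable the built-in diagnostics. The flag values are my guess for a 1 GB container, and jcmd/jmap assume a full JDK inside the image with the JVM running as PID 1:

# proposed JAVA_OPTS / CATALINA_OPTS for the Tomcat container (values are my assumption for a 1 GB limit)
-Xmx768m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp -XX:NativeMemoryTracking=summary

# once the app is running, inspect memory from inside the container
docker exec <containerName> jcmd 1 VM.native_memory summary
docker exec <containerName> jcmd 1 GC.class_histogram
docker exec <containerName> jmap -histo 1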