
I'm using the AWS CloudWatch PutLogEvents API to store client-side logs in CloudWatch, following this reference:

https://docs.aws.amazon.com/AmazonCloudWatchLogs/latest/APIReference/API_PutLogEvents.html

My implementation is based on the sample below, and I use a for-each loop to build the list of InputLogEvents that gets pushed to CloudWatch:

https://docs.aws.amazon.com/code-samples/latest/catalog/javav2-cloudwatch-src-main-java-com-example-cloudwatch-PutLogEvents.java.html

After continuously pushing logs for about one or two hours, the service crashes with an out-of-memory error. When I run the application locally, it doesn't crash and runs for more than one or two hours, although I haven't run it locally long enough to pin down a crash time. I'm wondering whether there is a memory leak in my AWS CloudWatch implementation, because before this implementation the service ran for months without any heap issues.

Here is my service implementation; I call this putLogEvents method through the controller. My only suspect area is the for-each loop over cwLogDtos, where I assign inputLogEvent to null so it should be garbage collected. In any case, I have tested this without sending the list of DTOs and I still get the same OOM error.

@Override
public TransportDto putLogEvents(List<CWLogDto> cwLogDtos, UserType userType) throws Exception {
    TransportDto transportDto = new TransportDto();
    CloudWatchLogsClient logsClient = CloudWatchLogsClient.builder().region(Region.of(region))
            .build();
    PutLogEventsResponse putLogEventsResponse = putCWLogEvents(logsClient, userType, cwLogDtos);

    logsClient.close();
    transportDto.setResponse(putLogEventsResponse.sdkHttpResponse());
    return transportDto;
}

private PutLogEventsResponse putCWLogEvents(CloudWatchLogsClient logsClient, UserType userType, List<CWLogDto> cwLogDtos) throws Exception{

    DateTimeFormatter formatter = DateTimeFormatter.ofPattern(logStreamPattern);
    String streamName = LocalDateTime.now().format(formatter);
    String logGroupName = logGroupOne;
    if (userType.equals(UserType.TWO))
        logGroupName = logGroupTwo;

    log.info("Total Memory before (in bytes): {}" , Runtime.getRuntime().totalMemory());
    log.info("Free Memory before (in bytes): {}" , Runtime.getRuntime().freeMemory());
    log.info("Max Memory before (in bytes): {}" , Runtime.getRuntime().maxMemory());

    DescribeLogStreamsRequest logStreamRequest = DescribeLogStreamsRequest.builder()
            .logGroupName(logGroupName)
            .logStreamNamePrefix(streamName)
            .build();
    DescribeLogStreamsResponse describeLogStreamsResponse = logsClient.describeLogStreams(logStreamRequest);

    // Assume a single stream is returned, since a specific stream name was
    // specified in the previous request; if none exists, create a new stream.
    String sequenceToken = null;
    if(!describeLogStreamsResponse.logStreams().isEmpty()){
        sequenceToken = describeLogStreamsResponse.logStreams().get(0).uploadSequenceToken();
        describeLogStreamsResponse = null;
    }
    else{
        CreateLogStreamRequest request = CreateLogStreamRequest.builder()
                .logGroupName(logGroupName)
                .logStreamName(streamName)
                .build();
        logsClient.createLogStream(request);
        request = null;
    }

    // Build an input log message to put to CloudWatch.
    List<InputLogEvent> inputLogEventList = new ArrayList<>();
    for (CWLogDto cwLogDto : cwLogDtos) {
        InputLogEvent inputLogEvent = InputLogEvent.builder()
                .message(new ObjectMapper().writeValueAsString(cwLogDto))
                .timestamp(System.currentTimeMillis())
                .build();
        inputLogEventList.add(inputLogEvent);
        inputLogEvent = null;
    }

    log.info("Total Memory after (in bytes): {}" , Runtime.getRuntime().totalMemory());
    log.info("Free Memory after (in bytes): {}" , Runtime.getRuntime().freeMemory());
    log.info("Max Memory after (in bytes): {}" , Runtime.getRuntime().maxMemory());


    // Specify the request parameters.
    // Sequence token is required so that the log can be written to the
    // latest location in the stream.
    PutLogEventsRequest putLogEventsRequest = PutLogEventsRequest.builder()
            .logEvents(inputLogEventList)
            .logGroupName(logGroupName)
            .logStreamName(streamName)
            .sequenceToken(sequenceToken)
            .build();
    inputLogEventList = null;
    logStreamRequest = null;

    return logsClient.putLogEvents(putLogEventsRequest);
}
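
As the author's later comment notes, the code above builds a new CloudWatchLogsClient on every call (and closes it only on the success path), and reusing one injected client fixed the leak. Below is a minimal sketch of that create-once/reuse pattern; `ExpensiveClient` is a hypothetical stand-in for `CloudWatchLogsClient`, used here so the example is self-contained without the AWS SDK:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for CloudWatchLogsClient: expensive to build,
// holds pooled connections, and should be closed exactly once at shutdown.
class ExpensiveClient implements AutoCloseable {
    static final AtomicInteger CREATED = new AtomicInteger();
    ExpensiveClient() { CREATED.incrementAndGet(); }  // e.g. builds connection pools
    String put(String payload) { return "ok:" + payload; }
    @Override public void close() { }
}

// Create the client once per service instance and reuse it for every call,
// instead of building and closing a new one inside each request.
class LogService implements AutoCloseable {
    private final ExpensiveClient client = new ExpensiveClient();

    String putLogEvents(String payload) {
        return client.put(payload);  // no per-call builder().build()/close()
    }

    @Override public void close() { client.close(); }
}
```

In a Spring application the equivalent would be declaring `CloudWatchLogsClient` as a singleton `@Bean` and injecting it, so the same instance (with its internal HTTP connection pool) serves every request and is closed once on application shutdown.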

CWLogDto

@Data
@JsonInclude(JsonInclude.Include.NON_NULL)
public class CWLogDto {
   private Long userId;
   private UserType userType;
   private LocalDateTime timeStamp;
   private Integer logLevel;
   private String logType;
   private String source;
   private String message;
}
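
A side note on pushing the whole DTO list in one call: per the PutLogEvents API reference, a single batch is limited to 10,000 events and 1,048,576 bytes, where each event counts as its message's UTF-8 length plus 26 bytes of overhead. If the incoming list can grow unbounded, it eventually has to be split. A rough sketch of such partitioning (`LogBatcher` is a hypothetical helper, not part of the code above):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

class LogBatcher {
    static final int MAX_EVENTS = 10_000;       // PutLogEvents per-batch event limit
    static final int MAX_BYTES = 1_048_576;     // PutLogEvents per-batch byte limit
    static final int PER_EVENT_OVERHEAD = 26;   // fixed overhead counted per event

    // Splits serialized messages into batches that respect both limits.
    static List<List<String>> partition(List<String> messages) {
        List<List<String>> batches = new ArrayList<>();
        List<String> current = new ArrayList<>();
        int bytes = 0;
        for (String m : messages) {
            int size = m.getBytes(StandardCharsets.UTF_8).length + PER_EVENT_OVERHEAD;
            // Flush the current batch before this event would overflow it.
            if (!current.isEmpty()
                    && (current.size() == MAX_EVENTS || bytes + size > MAX_BYTES)) {
                batches.add(current);
                current = new ArrayList<>();
                bytes = 0;
            }
            current.add(m);
            bytes += size;
        }
        if (!current.isEmpty()) batches.add(current);
        return batches;
    }
}
```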

Heap dump summary:

[Heap dump summary image]

Any help will be greatly appreciated.

Adek
  • Can you attach a memory profiler? – Thorbjørn Ravn Andersen Jan 27 '22 at 22:03
  • Hi @ThorbjørnRavnAndersen Heap dump added – Adek Jan 29 '22 at 10:02
  • You store a lot of hashmaps somewhere causing you to run out of memory. Investigate what they are and where they are stored - you most likely forgot to tell something that you were done using it. – Thorbjørn Ravn Andersen Jan 29 '22 at 10:34
  • 1
    You might find https://stackoverflow.com/questions/6470651/how-can-i-create-a-memory-leak-in-java?rq=1 interesting. – Thorbjørn Ravn Andersen Jan 29 '22 at 11:13
  • Hi @ThorbjørnRavnAndersen I updated the question with the code. And I'm not using any hashmap implementation in this code base. – Adek Jan 29 '22 at 13:37
  • Then something you use does. Time to learn how to use the memory profiler to identify rogue objects. – Thorbjørn Ravn Andersen Jan 29 '22 at 14:06
  • Hi @ThorbjørnRavnAndersen I don't think so, because I have run the application without this CW implementation and the dump was decent. I created a very basic implementation as in their documentation and am continuously calling that method to push logs to the CW side. Could it be something to do with the CW internal implementation? In my heap dump, HashMap$Node[]s are taking lots of memory and I see lots of AWS connection-related references. Updated the question. – Adek Jan 31 '22 at 14:53
  • Find out where the memory goes. Then you have a good bet at why. Good luck. – Thorbjørn Ravn Andersen Jan 31 '22 at 15:15
  • I was able to fix this issue. The problem was with the logsClient: I hadn't autowired it, so a new client was created every time I invoked the method, and that led to the high memory utilization. It's better to implement a multi-threading approach as well. Will add the full implementation as an answer. – Adek Mar 13 '22 at 14:49
  • so you HAD a memory leak. If you did not use a profiler to find this out, I would suggest using your current situation to _learn_ to use a profiler as it will save you time next time you have a similar situation. – Thorbjørn Ravn Andersen Mar 13 '22 at 16:29

0 Answers