24

I am trying to parse log entries which are a mix of text and JSON. The first line is a text representation and the following lines are the JSON payload of the event. One possible example:

2016-07-24T21:08:07.888Z [INFO] Command completed lessonrecords-create
{
  "key": "lessonrecords-create",
  "correlationId": "c1c07081-3f67-4ab3-a5e2-1b3a16c87961",
  "result": {
    "id": "9457ce88-4e6f-4084-bbea-14fff78ce5b6",
    "status": "NA",
    "private": false,
    "note": "Test note",
    "time": "2016-02-01T01:24:00.000Z",
    "updatedAt": "2016-07-24T21:08:07.879Z",
    "createdAt": "2016-07-24T21:08:07.879Z",
    "authorId": null,
    "lessonId": null,
    "groupId": null
  }
}

For these records I am trying to define a Log Metric Filter to (a) match records and (b) select data or dimensions if possible.

According to the AWS docs, the JSON pattern should look like this:

{ $.key = "lessonrecords-create" }

However, it does not match anything. My guess is that this is because of the mix of text and JSON in a single log entry.
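
For what it's worth, the behaviour reproduces with boto3's test_metric_filter (a minimal sketch, with the example entry above collapsed into a single event):

import boto3

logs = boto3.client("logs")

# The text line and the JSON payload arrive as a single log event.
event = (
    "2016-07-24T21:08:07.888Z [INFO] Command completed lessonrecords-create\n"
    '{"key": "lessonrecords-create", "result": {"status": "NA"}}'
)

resp = logs.test_metric_filter(
    filterPattern='{ $.key = "lessonrecords-create" }',
    logEventMessages=[event],
)
print(resp["matches"])  # [] -- the JSON pattern does not match the mixed entry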

So, the questions are:

  1. Is it possible to define a pattern that will match this log format?
  2. Is it possible to extract dimensions or values from such a log format?
  3. Can you help me with a pattern to do this?

Nmk
Mike Chaliy

6 Answers

1

If you set up the metric filter in the way you have defined, the test will not register any matches (I have also had this issue); however, when you deploy the metric filter it will still register matches (at least mine did). Just keep in mind that there is no way (as far as I am aware) to run this metric filter backwards, i.e. it only captures data from the moment it is created. If you're trying to get stats on past data, you're better off using Logs Insights queries.

I am currently experimenting with different parse statements to try to extract data (mine is also a mix of JSON and text); this thread MAY help you (it didn't for me): Amazon CloudWatch Logs Insights with JSON fields.

UPDATE! I have found a way to parse the text, but it's a little bit clunky. If you export your CloudWatch logs to SumoLogic using a Lambda function, their search tool allows for MUCH better log manipulation and lets you parse JSON fields (if you treat the entire entry as text). SumoLogic is also really helpful because you can extract your search results as a CSV. For my purposes, I parse the entire log message in SumoLogic, extract all the logs as a CSV, and then use regex in Python to filter through and extract the values I need.
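
For illustration, the last step looks roughly like this (a sketch; the CSV column name and the regex are assumptions based on my own export):

import csv
import re

# "_raw" is a hypothetical column name in the SumoLogic CSV export.
pattern = re.compile(r'"key":\s*"([^"]+)".*?"status":\s*"([^"]+)"', re.DOTALL)

with open("sumologic_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        match = pattern.search(row["_raw"])
        if match:
            key, status = match.groups()
            print(key, status)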

David A
1

Let's say you have the following log

2021-09-29 15:51:18,624 [main] DEBUG com.company.app.SparkResources - AUDIT : {"user":"Raspoutine","method":"GET","pathInfo":"/analysis/123"}

You can parse it like this to be able to handle the part after "AUDIT : " as JSON:

fields @message
| parse @message "* [*] * * - AUDIT : *" as timestamp, thread, logLevel, clazz, msg
| filter ispresent(msg)
| filter method = "GET" # You can filter on fields contained in the JSON string of 'msg'. Do not use 'msg.method' but directly 'method'

The fields contained in your isolated/parsed JSON field are automatically added as fields usable in the query.
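
If you want to run the same query outside the console, here is a rough boto3 sketch (the log group name is a placeholder):

import time
import boto3

logs = boto3.client("logs")

query = '''fields @message
| parse @message "* [*] * * - AUDIT : *" as timestamp, thread, logLevel, clazz, msg
| filter ispresent(msg)
| filter method = "GET"'''

query_id = logs.start_query(
    logGroupName="/your/log/group",     # placeholder
    startTime=int(time.time()) - 3600,  # last hour
    endTime=int(time.time()),
    queryString=query,
)["queryId"]

results = logs.get_query_results(queryId=query_id)
while results["status"] in ("Scheduled", "Running"):
    time.sleep(1)
    results = logs.get_query_results(queryId=query_id)
print(results["results"])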

Comencau
0

You can use CloudWatch Events for this purpose (aka Subscription Filters). What you will need to do is define a CloudWatch rule that uses an expression statement to match your logs. Here, I will let you do all the reading:

https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html

https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/Create-CloudWatch-Events-Scheduled-Rule.html
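
As a rough boto3 sketch (all names and ARNs below are placeholders), creating a subscription filter looks like this:

import boto3

logs = boto3.client("logs")

logs.put_subscription_filter(
    logGroupName="/your/log/group",            # placeholder
    filterName="lessonrecords-create-filter",  # placeholder
    # A plain-text term still matches when the event mixes text and JSON.
    filterPattern='"lessonrecords-create"',
    destinationArn="arn:aws:lambda:us-east-1:123456789012:function:process-logs",
)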

:)

Aye_baybae
0

Split the message into 3 fields, and the 3rd field will be valid JSON. I think in your case it would be:

fields @timestamp, @message 
| parse @message '[*] * {"*"}' as field1, field2, field3
| limit 50

field3 is the valid JSON. [INFO] will be the first field.
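
The same split is easy to sanity-check locally (a sketch that cuts the message at the first '{' and parses the remainder):

import json

message = (
    "2016-07-24T21:08:07.888Z [INFO] Command completed lessonrecords-create\n"
    '{"key": "lessonrecords-create", "result": {"status": "NA"}}'
)

head, _, body = message.partition("{")
payload = json.loads("{" + body)
print(payload["key"])  # lessonrecords-create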

vsingh
0

You can search on the JSON string representation, which is not as powerful.

For your example,

instead of { $.key = "lessonrecords-create" }

try "\"key\":\"lessonrecords-create\"".

This filter is not semantically identical to your requirement, though: it will also match events where "key" is not at the root of the JSON.
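
You can see both behaviours with test_metric_filter (a sketch; the second event nests "key" below the root and still matches):

import boto3

logs = boto3.client("logs")

events = [
    '{"key":"lessonrecords-create","status":"NA"}',  # key at the root
    '{"outer":{"key":"lessonrecords-create"}}',      # key nested deeper
]

resp = logs.test_metric_filter(
    filterPattern='"\\"key\\":\\"lessonrecords-create\\""',
    logEventMessages=events,
)
print([m["eventNumber"] for m in resp["matches"]])  # both events match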

Shashank Kapoor
-2

You can use the Fluentd agent to send logs to CloudWatch and create a custom grok pattern matching your log format.

Steps:

  • Install the Fluentd agent on your server
  • Install the fluent-plugin-cloudwatch-logs and fluent-plugin-grok-parser plugins
  • Write your custom grok pattern based on your log format (see the sketch below)

Please refer to this blog for more information.
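
A grok pattern is essentially a named regex; as a rough sketch (the field names are my own), the pattern for the first line of the example log would encode something like:

import re

# Roughly what a grok pattern such as
# %{TIMESTAMP_ISO8601:timestamp} \[%{LOGLEVEL:level}\] %{GREEDYDATA:message}
# compiles down to; the field names are illustrative.
LINE = re.compile(r"(?P<timestamp>\S+) \[(?P<level>\w+)\] (?P<message>.*)")

m = LINE.match("2016-07-24T21:08:07.888Z [INFO] Command completed lessonrecords-create")
if m:
    print(m.groupdict())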

Suganya G