For the purpose of publishing metrics to AWS CloudWatch, I would like to count the occurrences of certain keywords (e.g., Error, Exception) within the last minute (relative to the current system time) in my application logs.
Following are the commands I have tried so far, based on the answers from a related thread (Filter log file entries based on date range):
awk -vDate=`date -d'now-1 minutes' +["%Y-%m-%d %H:%M:%S"` '($1 FS $2) > Date {print $3}' application.log | grep "ERROR" | uniq -c
awk -vDate=`date -d'now-1 minutes' +["%Y-%m-%d %H:%M:%S"` '{if ($1 > Date) {print $3}}' application.log | grep "ERROR" | uniq -c
awk -vDate=`date -d'now-1 minutes' +["%Y-%m-%d %H:%M:%S"` '{if ($1 == $Date) {print $3}}' application.log | grep "ERROR" | uniq -c
But when I run them, I get an error like this:
awk: cmd. line:1: 13:06:17
awk: cmd. line:1: ^ syntax error
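(For context, this error appears to come from the unquoted backtick substitution: the timestamp produced by date contains a space, so the shell splits it into two words, and awk then tries to parse the bare time-of-day as program text. A minimal demonstration of the splitting, assuming GNU date:)

```shell
# Unquoted substitution: the space in "YYYY-mm-dd HH:MM:SS" splits the result
# into two shell words, so awk would see "-vDate=2016-02-05" and then
# "12:10:48" as (part of) its program -- hence the "^ syntax error".
set -- `date -d 'now-1 minutes' +"%Y-%m-%d %H:%M:%S"`
echo $#        # prints 2: date and time arrive as separate words

# Quoting the substitution keeps the timestamp as a single word.
set -- "$(date -d 'now-1 minutes' +"%Y-%m-%d %H:%M:%S")"
echo $#        # prints 1
```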
Following is the format of my log file:
2016-02-05 12:10:48,761 [INFO] from org.xxx
2016-02-05 12:10:48,761 [INFO] from org.xxx
2016-02-05 12:10:48,763 [INFO] from org.xxx
2016-02-05 12:10:48,763 [INFO] from org.xxx
2016-02-05 12:10:48,763 [ERROR] from org.xxx
2016-02-05 12:10:48,763 [INFO] from org.xxx
2016-02-05 12:10:48,764 [INFO] from org.xxx
2016-02-05 12:10:48,773 [WARN] from org.xxx
2016-02-05 12:10:48,777 [INFO] from org.xxx
2016-02-05 12:10:48,778 [INFO] from org.xxx
I've been stuck on this for quite a while. Thanks for the help!
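(For reference, here is a corrected form of the first command as a sketch: it quotes the command substitution so the space in the timestamp survives, drops the stray "[" from the format string, and matches the [ERROR] field directly. It assumes GNU date and the log format shown above; a temporary sample log stands in for application.log so the snippet is self-contained.)

```shell
# Build a small sample log with timestamps inside the last minute
# (stand-in for application.log).
log=$(mktemp)
now=$(date +"%Y-%m-%d %H:%M:%S")
printf '%s,761 [INFO] from org.xxx\n'  "$now" >> "$log"
printf '%s,763 [ERROR] from org.xxx\n' "$now" >> "$log"
printf '%s,764 [ERROR] from org.xxx\n' "$now" >> "$log"

# Quote the substitution so the timestamp stays one word. Comparing
# "date time" strings lexically works because this format sorts
# chronologically. Keep only [ERROR] lines and count them.
Date=$(date -d 'now-1 minutes' +"%Y-%m-%d %H:%M:%S")
count=$(awk -v Date="$Date" '($1 " " $2) >= Date && $3 == "[ERROR]"' "$log" | wc -l)
echo "$count"

rm -f "$log"
```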