Exporting AWS CloudWatch log streams to a local file


I love AWS, but when I’m debugging issues I prefer the Linux command line over CloudWatch Logs Insights. Numerous AWS services store their logs in CloudWatch, which presents a small challenge since my tooling (ripgrep, awk, jq, sed, etc.) can’t directly access CloudWatch logs. The aws command line has a nifty get-log-events subcommand which can solve this problem. It allows you to export logs from a log stream, and has several options to control what gets exported.

The following example shows how to export logs from the stream named LOG_STREAM_NAME, but only events that occurred in the past five minutes:

$ aws --profile la --region us-east-1 logs get-log-events --log-group-name LOG_GROUP_NAME --log-stream-name LOG_STREAM_NAME --start-time $(date "+%s%N" -d "5 minutes ago" | cut -b1-13) > /tmp/log
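The --start-time value here is milliseconds since the Unix epoch: GNU date's %N prints nanoseconds, and cutting to 13 characters truncates that to milliseconds. As a sketch (assuming GNU coreutils; BSD/macOS date lacks -d and %N), you can also ask date for milliseconds directly:

```shell
# Epoch milliseconds for "5 minutes ago", two equivalent ways (GNU date):
via_cut=$(date -d "5 minutes ago" "+%s%N" | cut -b1-13)
direct=$(date -d "5 minutes ago" "+%s%3N")   # %3N truncates nanoseconds to 3 digits
echo "$via_cut"
echo "$direct"
```

Both print a 13-digit millisecond timestamp, which is exactly what --start-time expects.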

The export will contain a JSON object with an events array, similar to the following:

{
    "events": [
        {
            "timestamp": 1580312457000,
            "message": "I0129 15:40:57.094248       1 controller.go:107] OpenAPI AggregationController: Processing item v1beta1.metrics.k8s.io",
            "ingestionTime": 1580312464749
        }
    ]
}
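Once the events are on disk, jq can flatten them back into plain text so grep, awk, and friends can treat the output like an ordinary log file. A minimal sketch, using an inlined sample in the same shape as the get-log-events output above (the messages here are placeholders):

```shell
# Sample get-log-events output, inlined for illustration.
cat <<'EOF' > /tmp/log
{
  "events": [
    {"timestamp": 1580312457000, "message": "first line", "ingestionTime": 1580312464749},
    {"timestamp": 1580312458000, "message": "second line", "ingestionTime": 1580312464749}
  ]
}
EOF

# Extract just the message field of each event, one per line.
jq -r '.events[].message' /tmp/log
```

From there, `jq -r '.events[].message' /tmp/log | rg PATTERN` works exactly like grepping a local log file.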

The aws logs command also provides a separate create-export-task subcommand, which can be used to export logs from a log group to an S3 bucket. This can be useful for archiving logs for offline debugging. The aws CLI is super useful, and the more I work with it the more I like it!
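As a hedged sketch of that S3 export (the bucket name and prefix below are placeholders, --from and --to are again epoch milliseconds, and the bucket needs a policy that lets CloudWatch Logs write to it):

```shell
# Export the last hour of LOG_GROUP_NAME to s3://MY_LOG_BUCKET/exports/.
# Requires AWS credentials and a bucket policy granting CloudWatch Logs
# write access; MY_LOG_BUCKET and "exports" are placeholder names.
aws --profile la --region us-east-1 logs create-export-task \
    --log-group-name LOG_GROUP_NAME \
    --from $(date -d "1 hour ago" "+%s%N" | cut -b1-13) \
    --to $(date "+%s%N" | cut -b1-13) \
    --destination MY_LOG_BUCKET \
    --destination-prefix exports
```

The command returns a task ID; the export runs asynchronously, so the objects appear in the bucket a short while later.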

This article was posted on 2020-01-29 11:51:31 -0500