tools/c7n_logexporter cloud watch log group exporter (cloud-custodian…
kapilt authored May 4, 2017
1 parent b75247e commit 57ad1d1
Showing 8 changed files with 659 additions and 0 deletions.
45 changes: 45 additions & 0 deletions tools/c7n_logexporter/Makefile
@@ -0,0 +1,45 @@

region=us-east-1
stack_name=custodian-log-exporter
s3_bucket=custodian-log-archive
s3_prefix=build
period=1h

install:
virtualenv .venv
.venv/bin/pip install -r requirements.txt
.venv/bin/pip install awscli boto3 awslogs
.venv/bin/python setup.py develop

package:
cp -Rf c7n_logexporter stage
cp -Rf .venv/lib/python2.7/site-packages/pkg_resources stage
.venv/bin/pip install -r requirements.txt -t stage --no-deps
cp config.yml stage
find stage -name \*.py -delete

clean:
rm -Rf .venv

destroy:
.venv/bin/aws cloudformation delete-stack --stack-name $(stack_name)

deploy: package
AWS_DEFAULT_REGION=$(region) .venv/bin/aws cloudformation package \
--template-file cfn.yml \
--s3-bucket $(s3_bucket) \
--s3-prefix $(s3_prefix) \
--output-template-file stage/built-api.yml
AWS_DEFAULT_REGION=$(region) .venv/bin/aws cloudformation deploy \
--stack-name=$(stack_name) \
--capabilities CAPABILITY_IAM \
--template-file stage/built-api.yml
rm -Rf stage

logs:
awslogs get -w /aws/lambda/$(shell aws cloudformation describe-stacks --stack-name $(stack_name) --query "Stacks[0].Outputs[?OutputKey==\`Function\`].OutputValue | [0]") \
-s$(period)

error-logs:
awslogs get /aws/lambda/$(shell aws cloudformation describe-stacks --stack-name $(stack_name) --query "Stacks[0].Outputs[?OutputKey==\`Function\`].OutputValue | [0]") \
-s$(period) -f Traceback
97 changes: 97 additions & 0 deletions tools/c7n_logexporter/README.md
@@ -0,0 +1,97 @@
# CloudWatch log exporter

A small serverless app to archive CloudWatch logs across accounts to an archive bucket. It
utilizes the CloudWatch Logs export-to-S3 feature.

*Note* - For most folks, this functionality would be better achieved using a kinesis
stream hooked up to kinesis firehose to archive to s3, which would allow for streaming
archiving.


## Features

- Log group filtering by regex
- Incremental support based on previously synced dates
- Incremental support based on last log group write time
- Cross-account support via STS role assumption
- Lambda and CLI support
- Day-based log segmentation (output keys look
  like $prefix/$account_id/$group/$year/$month/$day/$export_task_uuid/$stream/$log)
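
The day-based key segmentation above can be sketched as a small helper (a hypothetical illustration; `export_key` is not part of the tool's API):

```python
from datetime import date

def export_key(prefix, account_id, group, day, task_id, stream, log):
    """Build a day-segmented archive key of the shape:
    $prefix/$account_id/$group/$year/$month/$day/$export_task_uuid/$stream/$log
    """
    return "/".join([
        prefix, account_id, group.strip("/"),
        "%04d/%02d/%02d" % (day.year, day.month, day.day),
        task_id, stream, log])
```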


## Assumptions

- The archive bucket already has the appropriate bucket policy permissions.
  See http://goo.gl/ for details.
- Default periodicity for log group archival into s3 is daily.
- Exporter is run with account credentials that have access to the archive s3 bucket.
- Catch-up archiving is not run in Lambda (do a CLI run first).
- Lambda deployment only archives the last day periodically.
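
As a rough illustration of the first assumption (this policy is inferred from the standard CloudWatch Logs export requirements, not taken from this repo), the archive bucket policy needs to let the regional CloudWatch Logs service principal read the bucket ACL and write objects:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"Service": "logs.us-east-1.amazonaws.com"},
      "Action": "s3:GetBucketAcl",
      "Resource": "arn:aws:s3:::custodian-log-archive"
    },
    {
      "Effect": "Allow",
      "Principal": {"Service": "logs.us-east-1.amazonaws.com"},
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::custodian-log-archive/*",
      "Condition": {
        "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
      }
    }
  ]
}
```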


# CLI usage

```
make install
```

You can run on a single account / log group via the export subcommand:
```
c7n-log-export export --help
```

## Config format

To ease usage when running across multiple accounts, a config file can be specified; for
example:

```
destination:
  bucket: custodian-log-archive
  prefix: logs2

accounts:
  - name: custodian-demo
    role: "arn:aws:iam::111111111111:role/CloudCustodianRole"
    groups:
      - "/aws/lambda/*"
      - "vpc-flow-logs"
```
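
Under the hood, each account/log-group pair maps to a CloudWatch Logs export task. A minimal sketch of that flow (hypothetical helpers assuming the role and bucket names from the config above; the tool's actual implementation may differ):

```python
from datetime import datetime, timedelta

def day_window_millis(day):
    """Epoch-millisecond [start, end) bounds for one day's export."""
    start = datetime(day.year, day.month, day.day)
    epoch = datetime(1970, 1, 1)
    to_ms = lambda d: int((d - epoch).total_seconds() * 1000)
    return to_ms(start), to_ms(start + timedelta(days=1))

def account_session(role_arn):
    # Cross-account access via STS assume-role; boto3 is imported lazily
    # so the pure helper above stays usable without the AWS SDK installed.
    import boto3
    creds = boto3.client("sts").assume_role(
        RoleArn=role_arn, RoleSessionName="c7n-log-exporter")["Credentials"]
    return boto3.Session(
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"])

def export_group(session, group, bucket, prefix, day):
    # One export task per log group per day, written under the day prefix.
    logs = session.client("logs")
    start, end = day_window_millis(day)
    return logs.create_export_task(
        logGroupName=group,
        fromTime=start,
        to=end,
        destination=bucket,
        destinationPrefix="%s/%s" % (prefix, group.strip("/")))
```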

## Multiple accounts via cli

To run on the cli across multiple accounts, edit the config.yml to specify multiple
accounts and log groups.

```
c7n-log-export run --config config.yml
```

# Serverless Usage

Edit config.yml to specify the accounts, archive bucket, and log groups you want to
use.

```
make install
make deploy
```

# TODO

- [ ] switch to structured logging

- [ ] finer grained periods?

- [ ] intra-day runs

- [ ] cloud watch metrics stats on log groups?

- [ ] reason about overlapped dates (i.e. exporting up to the current time means the
  remainder of that day must be picked up later)

Notes:

- Update the current time from the time of the last export; prefix metadata to the bucket?
- Each export task creates a structure under the day; for last write, we annotate the s3 key.


# Known issues

- SAM issue with tracing as a function property
13 changes: 13 additions & 0 deletions tools/c7n_logexporter/c7n_logexporter/__init__.py
@@ -0,0 +1,13 @@
# Copyright 2016 Capital One Services, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.