feat: Add Waf Source
fdmsantos committed Sep 19, 2023
1 parent 7992882 commit 11abc46
Showing 10 changed files with 231 additions and 11 deletions.
41 changes: 36 additions & 5 deletions README.md
@@ -14,6 +14,7 @@ Supports all destinations and all Kinesis Firehose Features.
* [Kinesis Data Stream](#kinesis-data-stream)
* [Kinesis Data Stream Encrypted](#kinesis-data-stream-encrypted)
* [Direct Put](#direct-put)
* [WAF](#waf)
* [Destinations](#destinations)
* [S3](#s3)
* [Redshift](#redshift)
@@ -46,6 +47,8 @@ Supports all destinations and all Kinesis Firehose Features.
* [Resources](#resources)
* [Inputs](#inputs)
* [Outputs](#outputs)
* [Deprecation](#deprecation)
* [Upgrade](#upgrade)
* [License](#license)

## Module versioning rule
@@ -60,6 +63,7 @@ Supports all destinations and all Kinesis Firehose Features.
- Sources
- Kinesis Data Stream
- Direct Put
- WAF
- Destinations
- S3
- Data Format Conversion
@@ -96,14 +100,14 @@ Supports all destinations and all Kinesis Firehose Features.

#### Kinesis Data Stream

**To Enabled it:** `enable_kinesis_source = true`
**To Enable it:** `input_source = "kinesis"`. The `enable_kinesis_source` variable is deprecated and will be removed in the next major release.

```hcl
module "firehose" {
source = "fdmsantos/kinesis-firehose/aws"
version = "x.x.x"
name = "firehose-delivery-stream"
enable_kinesis_source = true
input_source = "kinesis"
kinesis_source_stream_arn = "<kinesis_stream_arn>"
destination = "s3" # or destination = "extended_s3"
s3_bucket_arn = "<bucket_arn>"
@@ -114,12 +118,27 @@ module "firehose" {

If the Kinesis Data Stream is encrypted, you need to pass this information to the module.

**To Enable It:** `kinesis_source_is_encrypted = true`

**KMS Key:** use the `kinesis_source_kms_arn` variable to indicate the KMS key; the module adds the permissions needed to decrypt the Kinesis Data Stream to the Firehose role policy.
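
A minimal sketch combining the variables above for an encrypted Kinesis Data Stream source (ARNs are placeholders):

```hcl
module "firehose" {
  source                      = "fdmsantos/kinesis-firehose/aws"
  version                     = "x.x.x"
  name                        = "firehose-delivery-stream"
  input_source                = "kinesis"
  kinesis_source_stream_arn   = "<kinesis_stream_arn>"
  kinesis_source_is_encrypted = true
  kinesis_source_kms_arn      = "<kms_key_arn>" # KMS key that encrypts the source stream
  destination                 = "s3"
  s3_bucket_arn               = "<bucket_arn>"
}
```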

#### Direct Put

**To Enable it:** `input_source = "direct-put"`. This is the default value.

```hcl
module "firehose" {
  source        = "fdmsantos/kinesis-firehose/aws"
  version       = "x.x.x"
  name          = "firehose-delivery-stream"
  destination   = "s3" # or destination = "extended_s3"
  s3_bucket_arn = "<bucket_arn>"
}
```

#### WAF

**To Enable it:** `input_source = "waf"`. The module prefixes the delivery stream name with `aws-waf-logs-`, as required by AWS WAF logging.

```hcl
module "firehose" {
  source        = "fdmsantos/kinesis-firehose/aws"
  version       = "x.x.x"
  name          = "firehose-delivery-stream"
  input_source  = "waf"
  destination   = "s3" # or destination = "extended_s3"
  s3_bucket_arn = "<bucket_arn>"
}
```
@@ -721,7 +740,8 @@ The destination variable configured in the module is mapped to a valid Firehose destination.
## Examples

- [Direct Put](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/s3/direct-put-to-s3) - Creates an encrypted Kinesis firehose stream with Direct Put as source and S3 as destination.
- [Kinesis Data Stream Source](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/s3/kinesis-to-s3-basic) - Creates a basic Kinesis Firehose stream with Kinesis data stream as source and s3 as destination .
- [Kinesis Data Stream Source](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/s3/kinesis-to-s3-basic) - Creates a basic Kinesis Firehose stream with Kinesis data stream as source and s3 as destination.
- [WAF Source](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/s3/waf-to-s3) - Creates a Kinesis Firehose Stream with AWS WAF as source and S3 as destination.
- [S3 Destination Complete](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/s3/kinesis-to-s3-complete) - Creates a Kinesis Firehose Stream with all features enabled.
- [Redshift](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/redshift/direct-put-to-redshift) - Creates a Kinesis Firehose Stream with redshift as destination.
- [Redshift In VPC](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/redshift/redshift-in-vpc) - Creates a Kinesis Firehose Stream with redshift in VPC as destination.
@@ -891,7 +911,7 @@ No modules.
| <a name="input_enable_data_format_conversion"></a> [enable\_data\_format\_conversion](#input\_enable\_data\_format\_conversion) | Set it to true if you want to enable format conversion. | `bool` | `false` | no |
| <a name="input_enable_destination_log"></a> [enable\_destination\_log](#input\_enable\_destination\_log) | The CloudWatch Logging Options for the delivery stream | `bool` | `true` | no |
| <a name="input_enable_dynamic_partitioning"></a> [enable\_dynamic\_partitioning](#input\_enable\_dynamic\_partitioning) | Enables or disables dynamic partitioning | `bool` | `false` | no |
| <a name="input_enable_kinesis_source"></a> [enable\_kinesis\_source](#input\_enable\_kinesis\_source) | Set it to true to use kinesis data stream as source | `bool` | `false` | no |
| <a name="input_enable_kinesis_source"></a> [enable\_kinesis\_source](#input\_enable\_kinesis\_source) | DEPRECATED: Use `input_source = "kinesis"` instead | `bool` | `false` | no |
| <a name="input_enable_lambda_transform"></a> [enable\_lambda\_transform](#input\_enable\_lambda\_transform) | Set it to true to enable data transformation with lambda | `bool` | `false` | no |
| <a name="input_enable_s3_backup"></a> [enable\_s3\_backup](#input\_enable\_s3\_backup) | The Amazon S3 backup mode | `bool` | `false` | no |
| <a name="input_enable_s3_encryption"></a> [enable\_s3\_encryption](#input\_enable\_s3\_encryption) | Indicates if you want to use encryption in the S3 bucket. | `bool` | `false` | no |
@@ -906,6 +926,7 @@ No modules.
| <a name="input_http_endpoint_request_configuration_content_encoding"></a> [http\_endpoint\_request\_configuration\_content\_encoding](#input\_http\_endpoint\_request\_configuration\_content\_encoding) | Kinesis Data Firehose uses the content encoding to compress the body of a request before sending the request to the destination | `string` | `"GZIP"` | no |
| <a name="input_http_endpoint_retry_duration"></a> [http\_endpoint\_retry\_duration](#input\_http\_endpoint\_retry\_duration) | Total amount of seconds Firehose spends on retries. This duration starts after the initial attempt fails. It does not include the time periods during which Firehose waits for acknowledgment from the specified destination after each attempt | `number` | `300` | no |
| <a name="input_http_endpoint_url"></a> [http\_endpoint\_url](#input\_http\_endpoint\_url) | The HTTP endpoint URL to which Kinesis Firehose sends your data | `string` | `null` | no |
| <a name="input_input_source"></a> [input\_source](#input\_input\_source) | Input source of the Kinesis Firehose delivery stream. Valid values: direct-put, kinesis, waf | `string` | `"direct-put"` | no |
| <a name="input_kinesis_source_is_encrypted"></a> [kinesis\_source\_is\_encrypted](#input\_kinesis\_source\_is\_encrypted) | Indicates if Kinesis data stream source is encrypted | `bool` | `false` | no |
| <a name="input_kinesis_source_kms_arn"></a> [kinesis\_source\_kms\_arn](#input\_kinesis\_source\_kms\_arn) | Kinesis source KMS key ARN; the module grants the Firehose role permission to decrypt the records | `string` | `null` | no |
| <a name="input_kinesis_source_role_arn"></a> [kinesis\_source\_role\_arn](#input\_kinesis\_source\_role\_arn) | The ARN of the role that provides access to the source Kinesis stream | `string` | `null` | no |
@@ -1010,6 +1031,16 @@ No modules.
| <a name="output_s3_cross_account_bucket_policy"></a> [s3\_cross\_account\_bucket\_policy](#output\_s3\_cross\_account\_bucket\_policy) | Bucket Policy to S3 Bucket Destination when the bucket belongs to another account |
<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->

## Deprecation

### Version >= 2.1.0

* The `enable_kinesis_source` variable is deprecated. Use `input_source = "kinesis"` instead (see the sketch below).

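A minimal migration sketch (the commented-out line shows the deprecated argument being replaced):

```hcl
module "firehose" {
  source  = "fdmsantos/kinesis-firehose/aws"
  version = "x.x.x"
  name    = "firehose-delivery-stream"

  # enable_kinesis_source = true   # deprecated: replaced by input_source below
  input_source              = "kinesis"
  kinesis_source_stream_arn = "<kinesis_stream_arn>"

  destination   = "s3"
  s3_bucket_arn = "<bucket_arn>"
}
```
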
## Upgrade

- Version 1.x to 2.x Upgrade Guide [here](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/blob/main/UPGRADE-2.0.md)

## License

Apache 2 Licensed. See [LICENSE](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/LICENSE) for full details.
66 changes: 66 additions & 0 deletions examples/s3/waf-to-s3/README.md
@@ -0,0 +1,66 @@
# Kinesis Firehose: WAF Source To S3

Configuration in this directory creates a Kinesis Firehose stream with AWS WAF as source and an S3 bucket as destination, using a basic configuration.

## Usage

To run this example you need to execute:

```bash
$ terraform init
$ terraform plan
$ terraform apply
```

Note that this example may create resources which cost money. Run `terraform destroy` when you don't need these resources.

In this example the records are produced by AWS WAF itself: the Web ACL created here is configured to send its logs to the Firehose stream (which the module names with the `aws-waf-logs-` prefix required by AWS WAF), and Firehose delivers them to the destination S3 bucket. To see logs flowing, associate the Web ACL with a regional resource such as an Application Load Balancer or an API Gateway stage.
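
A sketch of such an association is below. It assumes an existing Application Load Balancer and that the `cloudposse/waf/aws` module exposes the Web ACL ARN as `module.waf.arn`; it is not part of this example.

```hcl
# Hypothetical association (not created by this example); replace <alb_arn> with a real ALB ARN
resource "aws_wafv2_web_acl_association" "example" {
  resource_arn = "<alb_arn>"
  web_acl_arn  = module.waf.arn # assumes the waf module exposes an `arn` output
}
```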

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
## Requirements

| Name | Version |
|------|---------|
| <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 0.13.1 |
| <a name="requirement_aws"></a> [aws](#requirement\_aws) | ~> 5.0 |
| <a name="requirement_random"></a> [random](#requirement\_random) | ~> 3.0 |

## Providers

| Name | Version |
|------|---------|
| <a name="provider_aws"></a> [aws](#provider\_aws) | ~> 5.0 |
| <a name="provider_random"></a> [random](#provider\_random) | ~> 3.0 |

## Modules

| Name | Source | Version |
|------|--------|---------|
| <a name="module_firehose"></a> [firehose](#module\_firehose) | ../../../ | n/a |
| <a name="module_waf"></a> [waf](#module\_waf) | cloudposse/waf/aws | 1.2.0 |

## Resources

| Name | Type |
|------|------|
| [aws_s3_bucket.s3](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket) | resource |
| [random_pet.this](https://registry.terraform.io/providers/hashicorp/random/latest/docs/resources/pet) | resource |

## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| <a name="input_name_prefix"></a> [name\_prefix](#input\_name\_prefix) | Name prefix to use in resources | `string` | `"waf-to-s3"` | no |

## Outputs

No outputs.
<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
67 changes: 67 additions & 0 deletions examples/s3/waf-to-s3/main.tf
@@ -0,0 +1,67 @@
resource "random_pet" "this" {
length = 2
}

resource "aws_s3_bucket" "s3" {
bucket = "${var.name_prefix}-destination-bucket-${random_pet.this.id}"
force_destroy = true
}

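# Web ACL whose logs are delivered to the Firehose stream defined below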
module "waf" {
source = "cloudposse/waf/aws"
version = "1.2.0"

log_destination_configs = [module.firehose.kinesis_firehose_arn]

visibility_config = {
cloudwatch_metrics_enabled = false
metric_name = "rules-example-metric"
sampled_requests_enabled = false
}

managed_rule_group_statement_rules = [
{
name = "AWS-AWSManagedRulesAdminProtectionRuleSet"
priority = 1

statement = {
name = "AWSManagedRulesAdminProtectionRuleSet"
vendor_name = "AWS"
}

visibility_config = {
cloudwatch_metrics_enabled = true
sampled_requests_enabled = true
metric_name = "AWS-AWSManagedRulesAdminProtectionRuleSet"
}
}
]

context = {
enabled = true
namespace = "test"
tenant = null
environment = null
stage = null
name = null
delimiter = null
attributes = []
tags = {}
additional_tag_map = {}
regex_replace_chars = null
label_order = []
id_length_limit = null
label_key_case = null
label_value_case = null
descriptor_formats = {}
labels_as_tags = ["unset"]
}
}

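# Firehose delivery stream with WAF as source; the module prefixes the name with aws-waf-logs- as AWS WAF requires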
module "firehose" {
source = "../../../"
name = "${var.name_prefix}-delivery-stream"
input_source = "waf"
destination = "s3"
s3_bucket_arn = aws_s3_bucket.s3.arn
}
24 changes: 24 additions & 0 deletions examples/s3/waf-to-s3/outputs.tf
@@ -0,0 +1,24 @@
#output "kinesis_firehose_arn" {
# description = "The ARN of the Kinesis Firehose Stream"
# value = module.firehose.kinesis_firehose_arn
#}
#
#output "kinesis_data_stream_name" {
# description = "The name of the Kinesis Firehose Stream"
# value = module.firehose.kinesis_firehose_name
#}
#
#output "kinesis_firehose_destination_id" {
# description = "The Destination id of the Kinesis Firehose Stream"
# value = module.firehose.kinesis_firehose_destination_id
#}
#
#output "kinesis_firehose_version_id" {
# description = "The Version id of the Kinesis Firehose Stream"
# value = module.firehose.kinesis_firehose_version_id
#}
#
#output "kinesis_firehose_role_arn" {
# description = "The ARN of the IAM role created for Kinesis Firehose Stream"
# value = module.firehose.kinesis_firehose_role_arn
#}
5 changes: 5 additions & 0 deletions examples/s3/waf-to-s3/variables.tf
@@ -0,0 +1,5 @@
variable "name_prefix" {
description = "Name prefix to use in resources"
type = string
default = "waf-to-s3"
}
14 changes: 14 additions & 0 deletions examples/s3/waf-to-s3/versions.tf
@@ -0,0 +1,14 @@
terraform {
required_version = ">= 0.13.1"

required_providers {
aws = {
source = "hashicorp/aws"
version = "~> 5.0"
}
random = {
source = "hashicorp/random"
version = "~> 3.0"
}
}
}
2 changes: 1 addition & 1 deletion iam.tf
@@ -3,7 +3,7 @@ locals {
application_role_name = var.create_application_role ? coalesce(var.application_role_name, "${var.name}-application-role", "*") : null
create_application_role_policy = var.create && var.create_application_role_policy
add_backup_policies = local.enable_s3_backup && var.s3_backup_use_existing_role
add_kinesis_source_policy = var.create && var.create_role && var.enable_kinesis_source && var.kinesis_source_use_existing_role
add_kinesis_source_policy = var.create && var.create_role && local.is_kinesis_source && var.kinesis_source_use_existing_role
add_lambda_policy = var.create && var.create_role && var.enable_lambda_transform
add_s3_kms_policy = var.create && var.create_role && ((local.add_backup_policies && var.s3_backup_enable_encryption) || var.enable_s3_encryption)
add_glue_policy = var.create && var.create_role && var.enable_data_format_conversion && var.data_format_conversion_glue_use_existing_role
5 changes: 4 additions & 1 deletion locals.tf
@@ -3,6 +3,9 @@ locals {
cw_log_group_name = "/aws/kinesisfirehose/${var.name}"
cw_log_delivery_stream_name = "DestinationDelivery"
cw_log_backup_stream_name = "BackupDelivery"
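# Resolve the effective input source, honoring the deprecated enable_kinesis_source flag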
source = var.enable_kinesis_source ? "kinesis" : var.input_source # TODO: Remove this fallback when the enable_kinesis_source variable is deleted (next major version)
is_kinesis_source = local.source == "kinesis" ? true : false
is_waf_source = local.source == "waf" ? true : false
destinations = {
s3 : "extended_s3",
extended_s3 : "extended_s3",
@@ -138,7 +141,7 @@ locals {
s3_backup_mode = local.use_backup_vars_in_s3_configuration ? local.backup_modes[local.destination][var.s3_backup_mode] : null

# Kinesis source Stream
kinesis_source_stream_role = (var.enable_kinesis_source ? (
kinesis_source_stream_role = (local.is_kinesis_source ? (
var.kinesis_source_use_existing_role ? local.firehose_role_arn : var.kinesis_source_role_arn
) : null)

6 changes: 3 additions & 3 deletions main.tf
@@ -9,19 +9,19 @@ data "aws_subnet" "elasticsearch" {

resource "aws_kinesis_firehose_delivery_stream" "this" {
count = var.create ? 1 : 0
name = var.name
name = local.is_waf_source ? "aws-waf-logs-${var.name}" : var.name
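# AWS WAF requires delivery streams used as log destinations to be named aws-waf-logs-*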
destination = local.destination

dynamic "kinesis_source_configuration" {
for_each = var.enable_kinesis_source ? [1] : []
for_each = local.is_kinesis_source ? [1] : []
content {
kinesis_stream_arn = var.kinesis_source_stream_arn
role_arn = local.kinesis_source_stream_role
}
}

dynamic "server_side_encryption" {
for_each = !var.enable_kinesis_source && var.enable_sse ? [1] : []
for_each = !local.is_kinesis_source && var.enable_sse ? [1] : []
content {
enabled = var.enable_sse
key_arn = var.sse_kms_key_arn
12 changes: 11 additions & 1 deletion variables.tf
@@ -9,6 +9,16 @@ variable "name" {
type = string
}

variable "input_source" {
description = "Input source of the Kinesis Firehose delivery stream. Valid values: direct-put, kinesis, waf"
type = string
default = "direct-put"
validation {
error_message = "Please use a valid source!"
condition = contains(["direct-put", "kinesis", "waf"], var.input_source)
}
}

variable "destination" {
description = "This is the destination to where the data is delivered"
type = string
@@ -346,7 +356,7 @@ variable "sse_kms_key_arn" {
}

variable "enable_kinesis_source" {
description = "Set it to true to use kinesis data stream as source"
description = "DEPRECATED: Use `input_source = \"kinesis\"` instead"
type = bool
default = false
}
