docs: Show the max savings of compressing JSON files
l0b0 committed Oct 3, 2024
1 parent ad88437 commit 5f9fa90
4 changes: 3 additions & 1 deletion docs/GeoJSON-compression.md
@@ -18,4 +18,6 @@ Contra compression:
- [AWS CLI issue](https://github.com/aws/aws-cli/issues/6765)
- [boto3 issue](https://github.com/boto/botocore/issues/1255)
- Any files on S3 "[smaller than 128 KB](https://aws.amazon.com/s3/pricing/)" (presumably actually 128 KiB) are billed as if they were 128 KB, so compressing files already smaller than this would yield no price gain
- The extra development time to deal with compressing and decompressing would probably not offset the savings
- The extra development time to deal with compressing and decompressing JSON files larger than 128 KB would not offset the savings:
- We can get the sizes of JSON files by running `aws s3api list-objects-v2 --bucket=nz-elevation --no-sign-request --query="Contents[?ends_with(Key, 'json')].Size"` and `aws s3api list-objects-v2 --bucket=nz-imagery --no-sign-request --query="Contents[?ends_with(Key, 'json')].Size"`
- Summing the sizes of files larger than 128 KB gives a total of only _33 MB_ at the time of writing
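
Given the size arrays those `list-objects-v2` commands return, the 33 MB total can be reproduced with a short script. This is a minimal sketch using a hypothetical sample size list in place of live bucket output, and assuming the 128 KB threshold is decimal (swap in `128 * 1024` for KiB):

```python
import json

# Hypothetical sample of what the `--query="Contents[...].Size"` call prints:
# a JSON array of object sizes in bytes.
sizes_json = "[45000, 200000, 131073, 90000, 500000]"

THRESHOLD = 128 * 1000  # 128 KB; use 128 * 1024 if S3 actually means KiB

sizes = json.loads(sizes_json)
# Only files above the billing threshold could save anything when compressed.
compressible = [size for size in sizes if size > THRESHOLD]
total_mb = sum(compressible) / 1_000_000
print(f"{len(compressible)} files above the threshold, {total_mb:.1f} MB total")
```

Piping the real command output into a script like this (instead of hard-coding `sizes_json`) gives the at-time-of-writing figure quoted above.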
