Solution – Slowness in Getting AWS S3 Bucket Counts and Sizes

Amazon Simple Storage Service (S3) is the bedrock of many cloud infrastructures, offering scalable storage solutions for data of all sizes. Whether it’s storing images, videos, backups, or application data, S3 is ubiquitous. Yet, when attempting to retrieve metrics like the number of objects or the total size of a bucket, users may encounter unexpected delays. Let’s delve into why this happens and how to navigate it.
I was in that same situation, needing to gather metrics for S3 storage: sizes and object counts.
To solve it, I naturally reached for PowerShell and the AWSPowerShell module.
I started by gathering the buckets with the Get-S3Bucket cmdlet.

That was the easy part. I then looped through each bucket with the Get-S3Object cmdlet to get the number of objects and the size of each one, summing the sizes to compute the bucket total.
Initially I thought it was running fine, but after more than 8 hours there was still no sign of a complete report. (Had I kept it running, it might have taken days.)
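The slow approach described above can be sketched roughly as follows. This is a minimal illustration, not my exact script: it assumes default credentials and region, and ignores per-bucket region handling. The pain point is that Get-S3Object must page through every key in the bucket (1,000 at a time under the hood), so runtime grows with object count.

```powershell
# Naive approach (slow): enumerate every object in every bucket and sum sizes.
# Illustrative sketch only; assumes default AWS credentials/region are configured.
Import-Module AWSPowerShell

$report = foreach ($bucket in Get-S3Bucket) {
    # Get-S3Object lists every key in the bucket - this is the expensive call
    $objects = Get-S3Object -BucketName $bucket.BucketName
    [pscustomobject]@{
        Bucket      = $bucket.BucketName
        ObjectCount = ($objects | Measure-Object).Count
        SizeBytes   = ($objects | Measure-Object -Property Size -Sum).Sum
    }
}
$report | Format-Table -AutoSize
```

For buckets with millions of objects, each iteration alone can take hours, which is why the overall run never finished.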
This led me to a different approach: pull the metrics from somewhere they are already being captured.
Researching online, I found that the same data can be obtained from CloudWatch.
I also found a PowerShell function that someone had shared on their blog:

How to find AWS S3 bucket size and number of objects via PowerShell

After modifying this CloudWatch-based function and adding it to my existing script (which I will share in another post),
I was able to extract the report in 1.5 hours, where my previous approach would have taken days.
I hope the experience shared in this blog post helps you save time and effort when designing your own S3 storage report solution.
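For reference, the CloudWatch technique looks roughly like this. S3 publishes daily storage metrics (BucketSizeBytes and NumberOfObjects) to the AWS/S3 namespace, so one Get-CWMetricStatistic call per metric replaces enumerating every object. This is a hedged sketch rather than the exact function from the linked post; the Get-S3BucketMetric helper name is my own, and note these metrics are published about once a day, so values can lag slightly.

```powershell
# CloudWatch-based approach (fast): read S3's pre-computed daily storage metrics
# instead of listing objects. Sketch only; assumes default credentials/region.
Import-Module AWSPowerShell

function Get-S3BucketMetric {
    param(
        [string]$BucketName,
        [string]$MetricName,   # 'BucketSizeBytes' or 'NumberOfObjects'
        [string]$StorageType   # 'StandardStorage' or 'AllStorageTypes'
    )
    $dims = @(
        New-Object Amazon.CloudWatch.Model.Dimension -Property @{ Name = 'BucketName';  Value = $BucketName }
        New-Object Amazon.CloudWatch.Model.Dimension -Property @{ Name = 'StorageType'; Value = $StorageType }
    )
    # Look back two days: the daily datapoint may not exist yet for "today"
    $stats = Get-CWMetricStatistic -Namespace 'AWS/S3' -MetricName $MetricName `
        -Dimension $dims `
        -UtcStartTime (Get-Date).ToUniversalTime().AddDays(-2) `
        -UtcEndTime   (Get-Date).ToUniversalTime() `
        -Period 86400 -Statistic Average
    ($stats.Datapoints | Sort-Object Timestamp | Select-Object -Last 1).Average
}

foreach ($bucket in Get-S3Bucket) {
    [pscustomobject]@{
        Bucket      = $bucket.BucketName
        SizeBytes   = Get-S3BucketMetric $bucket.BucketName 'BucketSizeBytes' 'StandardStorage'
        ObjectCount = Get-S3BucketMetric $bucket.BucketName 'NumberOfObjects' 'AllStorageTypes'
    }
}
```

The key design difference: cost is now two lightweight CloudWatch API calls per bucket regardless of how many objects the bucket holds, which is what turned a multi-day run into a 1.5-hour one.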
 
 
Thanks for reading…
Tech Wizard
 
https://techwizard.cloud
https://syscloudpro.com/
PowerShell Fast Track