How to Use S3 Access Logs
• Learn how to obtain S3 access logs
• Learn how to make S3 access logs readable for analysis
Amazon Simple Storage Service (S3) is one of the most popular and user-friendly services on AWS. S3 provides scalable object storage, and it also supports access auditing through S3 access logs, helping you catch performance and security issues. Beyond basic storage, S3 offers features such as versioning, static website hosting, and access logging for auditing and security visibility. This tutorial will show you how to use S3 access logs via code as part of your broader telemetry pipeline.
AWS S3 Access Log Content
AWS S3 provides log information about access to buckets and their objects. These S3 logs capture fields such as: bucket owner, bucket, time, remote IP, requester, operation, request ID, request URI, key, error code, bytes sent, HTTP status, total time, object size, turnaround time, user agent, referrer, host ID, version ID, cipher suite, signature version, authentication type, TLS version, and host header.
When an object is accessed, you can use this data to identify the origin of the request. You can check whether unauthorized users have accessed any resources or spot an object with an unusually high number of downloads. You can also assess whether file retrieval times meet your app’s expectations. In addition, this security log data helps illustrate how applications interact with S3 resources over time.
We’ll show you how to obtain and format these S3 access logs for analysis in the sections below.
Enabling Logging for Bucket Objects
To enable S3 logs, first create one bucket for your files (objects) and another bucket to store the access logs; both should be in the same AWS region. It’s best practice to avoid storing logs in the same bucket you’re monitoring: if an issue occurs with the main bucket, the logs may also be compromised, making the error harder to diagnose.

After creating the buckets, navigate to the Properties of the bucket that stores your files. On the Properties page, click the Edit button in the Server access logging section. Enable logging, then click Browse S3 to select the log destination bucket.

In the modal, select the appropriate bucket and click Choose path. Back in the form, click Save changes. You’re now saving S3 access logs automatically for object usage tracking.
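If you prefer to script this step, the Ruby SDK exposes put_bucket_logging. The sketch below uses hypothetical bucket names (my-main-bucket and my-log-bucket) and assumes the log bucket already grants the S3 logging service permission to write to it:

require "aws-sdk-s3"

client = Aws::S3::Client.new(region: "us-east-1")

# Hypothetical bucket names -- substitute your own.
# The target bucket must already allow logging.s3.amazonaws.com to write to it.
client.put_bucket_logging(
  bucket: "my-main-bucket",
  bucket_logging_status: {
    logging_enabled: {
      target_bucket: "my-log-bucket",
      target_prefix: "access-logs/"
    }
  }
)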

Testing Logging
To verify the setup, upload or delete files in your monitored bucket; these actions generate S3 logs. You can open, download, and remove files to trigger entries. Then visit the bucket where your logs are stored and wait a few minutes for the data to appear.
Once logs are generated, you’ll see entries related to each interaction, including object retrieval, deletion, and policy versioning. These entries act as security logs you can later parse and analyze.
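As a rough sketch, you can also trigger a few entries from the Ruby SDK (the bucket and key names here are placeholders):

require "aws-sdk-s3"

client = Aws::S3::Client.new(region: "us-east-1")

# Each of these calls should produce its own access log entry.
client.put_object(bucket: "my-main-bucket", key: "test.txt", body: "hello")
client.get_object(bucket: "my-main-bucket", key: "test.txt")
client.delete_object(bucket: "my-main-bucket", key: "test.txt")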

Log Data Samples
Let’s take a look at some sample S3 logs to understand their structure and content. For example, here’s what a delete operation entry looks like (the values below are illustrative placeholders, not real log output):
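EXAMPLE-OWNER-ID my-main-bucket [06/Feb/2024:00:00:38 +0000] 192.0.2.3 arn:aws:iam::123456789012:user/alice 3E57427F33EXAMPLE REST.DELETE.OBJECT test.txt "DELETE /my-main-bucket/test.txt HTTP/1.1" 204 - - - 29 17 "-" "aws-sdk-ruby3/1.0" - EXAMPLE-HOST-ID SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader my-main-bucket.s3.amazonaws.com TLSv1.2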
And here’s a GET operation entry, again with placeholder values:
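EXAMPLE-OWNER-ID my-main-bucket [06/Feb/2024:00:01:57 +0000] 192.0.2.3 arn:aws:iam::123456789012:user/alice A1206F460EXAMPLE REST.GET.OBJECT test.txt "GET /my-main-bucket/test.txt HTTP/1.1" 200 - 2662992 3462992 70 10 "-" "aws-sdk-ruby3/1.0" - EXAMPLE-HOST-ID SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader my-main-bucket.s3.amazonaws.com TLSv1.2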
These logs show rich metadata about actions performed, user agents involved, and AWS request types. Understanding this format is key to building any automated S3 log parser or audit trail.
Download Logs
To process S3 access logs, you'll need to download them locally. The AWS SDK provides methods for retrieving S3 objects programmatically; here, we’ll use the AWS SDK for Ruby.
The aws-sdk-s3 gem for Ruby can be installed with:
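gem install aws-sdk-s3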
Once it's installed, you can run the code below in IRB or as a Ruby script. Don't forget to get your credentials from IAM: you'll need an access key ID and a secret access key. Then create a new folder called logs to store the logs locally.
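Here’s a minimal sketch of the download step. It assumes a log bucket named my-log-bucket in us-east-1 and credentials exported as environment variables; note that list_objects_v2 returns at most 1,000 objects per call, so high-volume buckets would need pagination:

require "aws-sdk-s3"

client = Aws::S3::Client.new(
  region: "us-east-1",
  access_key_id: ENV["AWS_ACCESS_KEY_ID"],
  secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
)

Dir.mkdir("logs") unless Dir.exist?("logs")

# Download every log object into the local logs folder,
# flattening any key prefixes into the file name.
client.list_objects_v2(bucket: "my-log-bucket").contents.each do |object|
  client.get_object(
    bucket: "my-log-bucket",
    key: object.key,
    response_target: File.join("logs", object.key.gsub("/", "_"))
  )
end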
Now all your S3 logs will be downloaded and saved inside your logs folder. Consider implementing log rotation if you're dealing with high log volume, to keep storage usage efficient and reduce clutter.
Exporting S3 Access Logs to JSON
By default, S3 access logs are not in a structured format like JSON. You’ll need to wrap log content in your own structure for readability and downstream processing.
The following code reads each downloaded log, extracts the key, remote IP, and operation from every entry, and stores them in a list that's exported as a JSON file. This is a minimal sketch that relies on a naive whitespace split, which works here because the three fields we want appear before any quoted fields in the log format:
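require "json"

records = []

Dir.glob("logs/*").each do |path|
  File.foreach(path) do |line|
    fields = line.split(" ")
    next if fields.length < 9
    # After a naive whitespace split, the bracketed timestamp occupies two
    # tokens, so the remote IP is token 5, the operation token 8, and the
    # key token 9.
    records << {
      "key"       => fields[8],
      "remote_ip" => fields[4],
      "operation" => fields[7]
    }
  end
end

File.write("s3_access_logs.json", JSON.pretty_generate(records))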
Conclusion
Reading AWS S3 logs manually is possible, but not always efficient. You'll need to fetch, format, and parse S3 access logs before they become useful. AWS offers tools like Amazon Athena to query logs or CloudTrail to complement them. You can also integrate a telemetry pipeline tool like Mezmo for enriched insights and automated analysis.
As a developer or DevOps engineer, understanding how to work with security logs and implement log rotation and analysis strategies is key to securing and scaling your cloud environments.