Amazon S3

Connect to Amazon S3, Cloudflare R2, Railway, GCS, and other S3-compatible storage

You can connect Amazon S3 or any S3-compatible storage service to Livedocs to import files directly into your notebooks. This includes Cloudflare R2, Railway, Google Cloud Storage, DigitalOcean Spaces, Backblaze B2, MinIO, and more.


How to Connect

  1. Go to the Data tab in your workspace.
  2. Click Connect a Database.
  3. In the Database Type dropdown, select S3.
  4. Fill in the connection details and credentials.
  5. Click Connect to S3.

Connection Fields Explained

Connection Details

  • Name – A friendly name for this connection (e.g., Production S3).
  • Endpoint – The S3 endpoint URL (e.g., https://s3.amazonaws.com for AWS, or your custom endpoint for S3-compatible services).
  • Region – The AWS region where your bucket is located (e.g., us-east-1).
  • Bucket – The name of the S3 bucket you want to connect.
  • Prefix (optional) – A path prefix to limit access to a specific folder (e.g., data/exports/).

Options

  • Force Path Style – Enable this for S3-compatible services that require path-style request URLs (https://endpoint/bucket/key) instead of virtual-hosted-style URLs (https://bucket.endpoint/key). Most S3-compatible services (MinIO, DigitalOcean Spaces, etc.) require this option.
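The difference between the two addressing styles can be illustrated with a small sketch (the endpoint, bucket, and key below are placeholders, not real services):

```python
# Illustrates how the two S3 addressing styles form request URLs.
# Endpoint and bucket names are placeholders for illustration only.
from urllib.parse import urlparse

def s3_object_url(endpoint: str, bucket: str, key: str, path_style: bool) -> str:
    """Build the request URL for an object under either addressing style."""
    parsed = urlparse(endpoint)
    if path_style:
        # Path-style: the bucket is the first segment of the URL path.
        return f"{parsed.scheme}://{parsed.netloc}/{bucket}/{key}"
    # Virtual-hosted-style: the bucket becomes a subdomain of the endpoint.
    return f"{parsed.scheme}://{bucket}.{parsed.netloc}/{key}"

print(s3_object_url("https://s3.amazonaws.com", "my-bucket", "data/file.csv", path_style=False))
# https://my-bucket.s3.amazonaws.com/data/file.csv
print(s3_object_url("https://minio.example.com", "my-bucket", "data/file.csv", path_style=True))
# https://minio.example.com/my-bucket/data/file.csv
```

Services that only serve a single hostname (such as a self-hosted MinIO instance) cannot route bucket subdomains, which is why they need path-style URLs.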

Authentication

  • Access Key ID – Your AWS access key ID (e.g., AKIAIOSFODNN7EXAMPLE).
  • Secret Access Key – Your AWS secret access key.

Creating AWS Access Keys

  1. Sign in to the AWS Management Console.
  2. Navigate to IAM (Identity and Access Management).
  3. Click Users in the left sidebar.
  4. Select an existing user or create a new user.
  5. Go to the Security credentials tab.
  6. Click Create access key.
  7. Select Other as the use case and click Next.
  8. Copy both the Access Key ID and Secret Access Key.

The secret access key is only shown once. Store it securely. If you lose it, you must create a new access key.


Required IAM Permissions

The IAM user needs at minimum the following S3 permissions:

  • s3:GetObject – To read files
  • s3:ListBucket – To browse bucket contents

If you want to write data back to S3, also include:

  • s3:PutObject – To upload files

For simplicity, you can attach the AmazonS3ReadOnlyAccess managed policy, or create a custom policy scoped to your specific bucket.
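A custom policy scoped to a single bucket might look like the following sketch (replace YOUR_BUCKET with your bucket name; drop s3:PutObject if you only need read access):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET"
    },
    {
      "Sid": "ReadWriteObjects",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET/*"
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while the object actions apply to the objects under it (the /* suffix) — a common source of AccessDenied errors when the two are mixed up.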


S3-Compatible Services

This connector works with any S3-compatible storage service:

| Service | Endpoint Example | Region | Force Path Style |
| --- | --- | --- | --- |
| AWS S3 | https://s3.amazonaws.com | e.g., us-east-1 | Off |
| Cloudflare R2 | https://<ACCOUNT_ID>.r2.cloudflarestorage.com | auto | Off |
| Railway | https://<BUCKET>.railway.app | auto | Off |
| Google Cloud Storage | https://storage.googleapis.com | e.g., us-east1 | Off |
| DigitalOcean Spaces | https://<REGION>.digitaloceanspaces.com | e.g., nyc3 | On |
| Backblaze B2 | https://s3.<REGION>.backblazeb2.com | e.g., us-west-000 | On |
| MinIO | https://minio.example.com | Your region | On |

Google Cloud Storage Notes

GCS supports S3-compatible access through its XML API interoperability mode. You’ll need to create HMAC keys (not a service account JSON key):

  1. Go to Cloud Storage Settings in Google Cloud Console.
  2. Select the Interoperability tab.
  3. Click Create a key under Access keys for service accounts (or your user account).
  4. Use the generated Access Key and Secret as your credentials.

Additional Options

  • Don’t Have Credentials? – You can invite a teammate who has access to the S3 bucket to connect it for you.


Need Help? If something goes wrong, contact support@livedocs.com and we’ll get it sorted.