You can connect Amazon S3 or any S3-compatible storage service to Livedocs to import files directly into your notebooks. This includes Cloudflare R2, Railway, Google Cloud Storage, DigitalOcean Spaces, Backblaze B2, MinIO, and more.
## How to Connect
- Go to the Data tab in your workspace.
- Click Connect a Database.
- In the Database Type dropdown, select S3.
- Fill in the connection details and credentials.
- Click Connect to S3.
## Connection Fields Explained

### Connection Details
- Name – A friendly name for this connection (e.g., `Production S3`).
- Endpoint – The S3 endpoint URL (e.g., `https://s3.amazonaws.com` for AWS, or your custom endpoint for S3-compatible services).
- Region – The AWS region where your bucket is located (e.g., `us-east-1`).
- Bucket – The name of the S3 bucket you want to connect.
- Prefix (optional) – A path prefix to limit access to a specific folder (e.g., `data/exports/`).
### Options
- Force Path Style – Enable this for S3-compatible services that require path-style URLs (`https://endpoint/bucket`) instead of virtual-hosted-style URLs (`https://bucket.endpoint`). Many S3-compatible services (MinIO, DigitalOcean Spaces, Backblaze B2, etc.) require this option.
### Authentication
- Access Key ID – Your AWS access key ID (e.g., `AKIAIOSFODNN7EXAMPLE`).
- Secret Access Key – Your AWS secret access key.
## Creating AWS Access Keys
- Sign in to the AWS Management Console.
- Navigate to IAM (Identity and Access Management).
- Click Users in the left sidebar.
- Select an existing user or create a new user.
- Go to the Security credentials tab.
- Click Create access key.
- Select Other as the use case and click Next.
- Copy both the Access Key ID and Secret Access Key.
The secret access key is only shown once. Store it securely. If you lose it, you must create a new access key.
## Required IAM Permissions
The IAM user needs at minimum the following S3 permissions:
- `s3:GetObject` – To read files
- `s3:ListBucket` – To browse bucket contents
If you want to write data back to S3, also include:
- `s3:PutObject` – To upload files
For simplicity, you can attach the AmazonS3ReadOnlyAccess managed policy, or create a custom policy scoped to your specific bucket.
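For example, a custom policy scoped to a single bucket (`my-bucket` is a placeholder) could look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while object-level actions like `s3:GetObject` apply to `my-bucket/*`. Omit `s3:PutObject` if you only need read access.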
## S3-Compatible Services
This connector works with any S3-compatible storage service:
| Service | Endpoint Example | Region | Force Path Style |
|---|---|---|---|
| AWS S3 | https://s3.amazonaws.com | e.g., us-east-1 | Off |
| Cloudflare R2 | https://<ACCOUNT_ID>.r2.cloudflarestorage.com | auto | Off |
| Railway | https://<BUCKET>.railway.app | auto | Off |
| Google Cloud Storage | https://storage.googleapis.com | e.g., us-east1 | Off |
| DigitalOcean Spaces | https://<REGION>.digitaloceanspaces.com | e.g., nyc3 | On |
| Backblaze B2 | https://s3.<REGION>.backblazeb2.com | e.g., us-west-000 | On |
| MinIO | https://minio.example.com | Your region | On |
## Google Cloud Storage Notes
GCS supports S3-compatible access via its XML API interoperability. You’ll need to create HMAC keys (not a service account JSON):
- Go to Cloud Storage Settings in Google Cloud Console.
- Select the Interoperability tab.
- Click Create a key under Access keys for service accounts (or your user account).
- Use the generated Access Key and Secret as your credentials.
## Additional Options
- Don’t Have Credentials? – You can invite a teammate who has access to the S3 bucket to connect it for you.
## Resources
- AWS IAM Access Keys
- Cloudflare R2 S3 API
- Railway Storage Buckets
- Google Cloud Storage Interoperability
## Need Help?
If something goes wrong, contact support@livedocs.com and we’ll get it sorted.