Google Cloud Storage Setup

Store RabbitMQ backups in Google Cloud Storage (GCS). Authentication uses a service account key file or GKE Workload Identity.

Step 1: Create a GCS Bucket

# Set your project
gcloud config set project my-project-id

# Create the bucket
gcloud storage buckets create gs://my-rabbitmq-backups \
  --location=us-east1 \
  --uniform-bucket-level-access
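Bucket names must be globally unique and follow GCS naming rules: lowercase letters, digits, hyphens, underscores, and dots; 3–63 characters; starting and ending with a letter or digit. A quick local check (an illustrative Python sketch, not part of rabbitmq-backup) can catch typos before the `gcloud` call fails:

```python
import re

# Simplified GCS bucket-name check (dotted/domain-style names omitted).
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9_-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    """Return True if `name` looks like a legal GCS bucket name."""
    return bool(BUCKET_RE.match(name))

print(is_valid_bucket_name("my-rabbitmq-backups"))  # True
print(is_valid_bucket_name("My_Backups"))           # False: uppercase not allowed
```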

# Optional: set a lifecycle rule to delete backups older than 90 days
cat > lifecycle.json << 'EOF'
{
  "rule": [
    {
      "action": { "type": "Delete" },
      "condition": { "age": 90 }
    },
    {
      "action": { "type": "SetStorageClass", "storageClass": "NEARLINE" },
      "condition": { "age": 30 }
    }
  ]
}
EOF
gcloud storage buckets update gs://my-rabbitmq-backups --lifecycle-file=lifecycle.json
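The policy above moves objects to Nearline at 30 days and deletes them at 90; the delete age must come after the storage-class transition or the transition never takes effect. A small sanity check (a Python sketch, not part of rabbitmq-backup):

```python
import json

# Same two-rule policy as lifecycle.json above.
lifecycle = json.loads("""
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 90}},
    {"action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
     "condition": {"age": 30}}
  ]
}
""")

delete_age = next(r["condition"]["age"] for r in lifecycle["rule"]
                  if r["action"]["type"] == "Delete")
nearline_age = next(r["condition"]["age"] for r in lifecycle["rule"]
                    if r["action"]["type"] == "SetStorageClass")

# Deleting before the class transition would make the transition pointless.
assert nearline_age < delete_age
print(f"Nearline at {nearline_age}d, delete at {delete_age}d")
```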

Step 2: Create a Service Account

# Create the service account
gcloud iam service-accounts create rabbitmq-backup \
  --display-name="RabbitMQ Backup Service Account"

# Grant Storage Object Admin on the bucket
gcloud storage buckets add-iam-policy-binding gs://my-rabbitmq-backups \
  --member="serviceAccount:rabbitmq-backup@my-project-id.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"

# Create and download a key file
gcloud iam service-accounts keys create /path/to/rabbitmq-backup-sa-key.json \
  --iam-account=rabbitmq-backup@my-project-id.iam.gserviceaccount.com
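The downloaded key file is a JSON document whose standard fields include `type`, `project_id`, `client_email`, and `private_key`. A hedged sketch that checks a key file has those fields before you wire it into the backup config (the example contents are a stand-in, not a real key):

```python
import json

REQUIRED_FIELDS = {"type", "project_id", "client_email", "private_key"}

def check_sa_key(raw: str) -> str:
    """Validate service-account key JSON and return its client email."""
    key = json.loads(raw)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']}")
    return key["client_email"]

# Illustrative stand-in for the real key file contents.
example = json.dumps({
    "type": "service_account",
    "project_id": "my-project-id",
    "client_email": "rabbitmq-backup@my-project-id.iam.gserviceaccount.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\\n...",
})
print(check_sa_key(example))
```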

Step 3: Configure rabbitmq-backup

Using a Service Account Key File

Set the GOOGLE_APPLICATION_CREDENTIALS environment variable or specify the path in the config:

backup-gcs.yaml
mode: backup
backup_id: "gcs-backup-001"

source:
  amqp_url: "amqp://backup_user:password@rabbitmq.example.com:5672/%2f"
  management_url: "http://rabbitmq.example.com:15672"
  management_username: backup_user
  management_password: password
  queues:
    include:
      - "*"

storage:
  backend: gcs
  bucket: my-rabbitmq-backups
  service_account_path: /path/to/rabbitmq-backup-sa-key.json
  prefix: prod/

backup:
  compression: zstd
  include_definitions: true

Or use the environment variable:

export GOOGLE_APPLICATION_CREDENTIALS=/path/to/rabbitmq-backup-sa-key.json
rabbitmq-backup backup --config backup-gcs.yaml

When using the environment variable, you can omit service_account_path from the config:

storage:
  backend: gcs
  bucket: my-rabbitmq-backups
  prefix: prod/
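The apparent precedence is: an explicit service_account_path in the config wins, otherwise the tool falls back to GOOGLE_APPLICATION_CREDENTIALS from the environment. A sketch of that lookup (an assumption about rabbitmq-backup's behavior, not its actual code):

```python
import os

def resolve_credentials(storage_cfg: dict, env=os.environ):
    """Pick the key-file path: config value first, then the env variable."""
    path = storage_cfg.get("service_account_path")
    if path:
        return path
    return env.get("GOOGLE_APPLICATION_CREDENTIALS")

cfg = {"backend": "gcs", "bucket": "my-rabbitmq-backups", "prefix": "prod/"}
fake_env = {"GOOGLE_APPLICATION_CREDENTIALS": "/secrets/sa-key.json"}
print(resolve_credentials(cfg, fake_env))  # /secrets/sa-key.json
```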

Using GKE Workload Identity

On GKE with Workload Identity, bind the Kubernetes ServiceAccount to the GCP service account:

# Allow the K8s SA to impersonate the GCP SA
gcloud iam service-accounts add-iam-policy-binding \
  rabbitmq-backup@my-project-id.iam.gserviceaccount.com \
  --role roles/iam.workloadIdentityUser \
  --member "serviceAccount:my-project-id.svc.id.goog[rabbitmq-backup/rabbitmq-backup]"
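The --member value follows a fixed Workload Identity pattern: serviceAccount:PROJECT_ID.svc.id.goog[NAMESPACE/KSA_NAME]. A small helper that assembles it, shown here for the project, namespace, and ServiceAccount names used above:

```python
def workload_identity_member(project_id: str, namespace: str, ksa: str) -> str:
    """Build the IAM member string for a GKE Workload Identity binding."""
    return f"serviceAccount:{project_id}.svc.id.goog[{namespace}/{ksa}]"

print(workload_identity_member("my-project-id", "rabbitmq-backup", "rabbitmq-backup"))
# serviceAccount:my-project-id.svc.id.goog[rabbitmq-backup/rabbitmq-backup]
```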

Annotate the Kubernetes ServiceAccount:

serviceaccount.yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: rabbitmq-backup
  namespace: rabbitmq-backup
  annotations:
    iam.gke.io/gcp-service-account: rabbitmq-backup@my-project-id.iam.gserviceaccount.com

No key file is needed; the GKE metadata server provides credentials automatically. Omit service_account_path from the config.

GCS Configuration Reference

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| backend | Yes | -- | Must be gcs |
| bucket | Yes | -- | GCS bucket name |
| service_account_path | No | GOOGLE_APPLICATION_CREDENTIALS env | Path to service account JSON key |
| prefix | No | None | Key prefix for all objects |
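With prefix: prod/, every uploaded object should land under that prefix. Assuming the tool joins the prefix and object name naively (an assumption about its internals; the object name below is hypothetical), a normalizer that guarantees exactly one slash between the parts avoids accidental prod//... keys:

```python
def object_key(prefix: str, name: str) -> str:
    """Join a configured prefix and an object name with exactly one slash."""
    if not prefix:
        return name
    return prefix.rstrip("/") + "/" + name.lstrip("/")

# "gcs-backup-001/manifest.json" is a hypothetical object name.
print(object_key("prod/", "gcs-backup-001/manifest.json"))
# prod/gcs-backup-001/manifest.json
```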

Verify the Setup

rabbitmq-backup backup --config backup-gcs.yaml
rabbitmq-backup list --path gcs://my-rabbitmq-backups

Check the bucket directly:

gcloud storage ls gs://my-rabbitmq-backups/prod/ --recursive

Docker Usage

When running in Docker, mount the service account key file:

docker run --rm \
  -v $(pwd)/backup-gcs.yaml:/config/backup.yaml:ro \
  -v /path/to/rabbitmq-backup-sa-key.json:/secrets/sa-key.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/sa-key.json \
  rabbitmq-backup:latest \
  backup --config /config/backup.yaml