# DB Backup Pipeline
```yaml
name: db-backup
description: "Dump a PostgreSQL database, compress it, and upload to S3"
steps:
  - id: timestamp
    run: "date +%Y%m%d-%H%M%S"
  - id: credentials
    run: "aws secretsmanager get-secret-value --secret-id prod/db --query SecretString --output text"
    sensitive: true
  - id: dump
    run: "PGPASSWORD=$PIPE_CREDENTIALS pg_dump -h db.internal -U app -d myapp -Fc -f /tmp/backup-$PIPE_TIMESTAMP.dump"
    depends_on: "credentials"
  - id: compress
    run: "gzip /tmp/backup-$PIPE_TIMESTAMP.dump"
    depends_on: "dump"
  - id: upload
    run: "aws s3 cp /tmp/backup-$PIPE_TIMESTAMP.dump.gz s3://my-backups/postgres/$PIPE_TIMESTAMP.dump.gz"
    depends_on: "compress"
    retry: 3
  - id: cleanup
    run: "rm -f /tmp/backup-$PIPE_TIMESTAMP.dump.gz"
    depends_on: "upload"
```
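A minimal sketch of running the pipeline, assuming it is saved as `db-backup.yaml` and the CLI is invoked as `pipe run <file>`; the binary name and subcommand are illustrative, not confirmed by this document:

```sh
# Run the whole pipeline from the first step.
# `pipe` and `run` are hypothetical names; substitute your actual CLI.
pipe run db-backup.yaml
```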
## Concepts demonstrated

- Sensitive data — `credentials` is marked `sensitive: true`, so the database password is never written to the state file
- Output passing — `$PIPE_TIMESTAMP` and `$PIPE_CREDENTIALS` flow to downstream steps (sketched below)
- Dependencies — a strict linear `depends_on` chain ensures the steps run in order
- Retry — the S3 upload is retried up to 3 times on failure (see the shell sketch after this list)
- Resuming — if the upload fails, `--resume` re-fetches the credentials (they are sensitive, so they were never persisted) and retries from the failed step (see the example below)
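A minimal sketch of output passing, assuming (as the pipeline above suggests) that each step's stdout is exposed to downstream steps as `$PIPE_<ID>`, with the step id upper-cased:

```yaml
steps:
  - id: timestamp
    run: "date +%Y%m%d-%H%M%S"          # stdout becomes $PIPE_TIMESTAMP
  - id: label
    run: "echo backup-$PIPE_TIMESTAMP"  # e.g. prints backup-20240131-142500
    depends_on: "timestamp"
```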
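For intuition, the upload step's `retry: 3` behaves roughly like the loop below. This is a sketch of the semantics only, not the tool's implementation; whether the real scheduler delays between attempts is not specified here:

```sh
# One initial attempt plus up to 3 retries, matching
# "retried up to 3 times on failure".
failures=0
until aws s3 cp "/tmp/backup-$PIPE_TIMESTAMP.dump.gz" \
      "s3://my-backups/postgres/$PIPE_TIMESTAMP.dump.gz"; do
  failures=$((failures + 1))
  [ "$failures" -gt 3 ] && exit 1   # give up after the third retry fails
done
```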
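And a sketch of the resume flow, reusing the hypothetical `pipe run` invocation from above:

```sh
pipe run db-backup.yaml   # fails at the upload step (say, an S3 outage)

# On resume, completed non-sensitive outputs (timestamp, dump, compress)
# are read back from the state file; the sensitive credentials output was
# never persisted, so it is re-fetched before retrying the upload step.
pipe run db-backup.yaml --resume
```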