Here’s a complete bash script for performing PostgreSQL database backups with pg_dump. It lets you specify a backup path and uploads the resulting dumps to an S3-compatible service using aws-cli (or a compatible tool such as s3cmd).
You will need the following tools installed:
- pg_dump (PostgreSQL client tool)
- aws-cli (or s3cmd) for S3 interaction
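To confirm both are available (and which versions you have), you can run:

```bash
pg_dump --version
aws --version
```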
How It Works:
- Database List: You can pass a list of databases as arguments when running the script. The script will loop through each database and perform the backup.
- Backup Directory: BACKUP_PATH is the default local directory where backups are stored before uploading.
- Backup Retention: The script deletes backups older than the retention policy (default: 7 days).
- AWS CLI for S3: The script uses aws s3 cp to upload the backups to an S3-compatible service.
- Automatic Cleanup: After uploading to S3, it cleans up any backup files older than the defined retention days.
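As a reference, here is a minimal sketch of pg_multi_backup_s3.sh that follows the behavior described above. The variable names (BACKUP_PATH, S3_BUCKET, S3_ENDPOINT, RETENTION_DAYS) and their defaults are assumptions, not values from your environment; adjust them before use.

```bash
#!/usr/bin/env bash
# pg_multi_backup_s3.sh -- minimal sketch; variable names and defaults are assumptions.
set -euo pipefail

BACKUP_PATH="${BACKUP_PATH:-/var/backups/postgresql}"   # local staging directory
S3_BUCKET="${S3_BUCKET:-s3://my-backup-bucket}"         # placeholder bucket name
S3_ENDPOINT="${S3_ENDPOINT:-}"                          # set only for S3-compatible services
RETENTION_DAYS="${RETENTION_DAYS:-7}"                   # local retention policy in days

if [ "$#" -lt 1 ]; then
  echo "Usage: $0 db1 [db2 ...]" >&2
  exit 1
fi

mkdir -p "$BACKUP_PATH"
TIMESTAMP="$(date +%Y%m%d_%H%M%S)"

for DB in "$@"; do
  DUMP_FILE="$BACKUP_PATH/${DB}_${TIMESTAMP}.sql.gz"
  echo "Backing up $DB to $DUMP_FILE"
  # Assumes the invoking user (e.g. mastodon) can connect to the database without a password prompt.
  pg_dump "$DB" | gzip > "$DUMP_FILE"

  echo "Uploading $DUMP_FILE to $S3_BUCKET"
  if [ -n "$S3_ENDPOINT" ]; then
    aws s3 cp "$DUMP_FILE" "$S3_BUCKET/" --endpoint-url "$S3_ENDPOINT"
  else
    aws s3 cp "$DUMP_FILE" "$S3_BUCKET/"
  fi
done

# Remove local dumps older than the retention period.
find "$BACKUP_PATH" -name '*.sql.gz' -mtime +"$RETENTION_DAYS" -delete
```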
To ensure the script runs as the mastodon user, you need to set up the cron job under the mastodon user account. Here's how you can do that:
1. Switch to the mastodon User
First, switch to the mastodon user by running:
sudo su - mastodon
This switches to the mastodon user account.
2. Edit the Crontab for the mastodon User
Now, edit the crontab for the mastodon user:
crontab -e
3. Add the Cron Job Entry
Add the following line to run the backup script at midnight every day:
0 0 * * * /path/to/pg_multi_backup_s3.sh db1 db2 db3 >> /path/to/logs/pg_backup.log 2>&1
- Replace /path/to/pg_multi_backup_s3.sh with the actual path to your script.
- Replace db1 db2 db3 with the actual database names.
- Replace /path/to/logs/pg_backup.log with the path to the log file (create the directory if it doesn't exist).
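One practical note: cron jobs run with a minimal environment, so if aws or pg_dump are installed outside cron's default PATH on your system, you can declare PATH at the top of the crontab, for example:

```
PATH=/usr/local/bin:/usr/bin:/bin
0 0 * * * /path/to/pg_multi_backup_s3.sh db1 db2 db3 >> /path/to/logs/pg_backup.log 2>&1
```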
4. Save and Exit
Save and exit the crontab editor.
5. Verify the Cron Job
To check that the cron job has been added for the mastodon user, run:
crontab -l
You should see the scheduled job listed. This ensures that the backup script will be run as the mastodon user every day at midnight.
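Before relying on cron, it's worth running the script once by hand as the mastodon user to confirm the dump and the upload both succeed, for example:

```bash
/path/to/pg_multi_backup_s3.sh db1
```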
To securely add AWS S3 credentials and configure the endpoint for the backup script, you have a couple of options: the recommended approach is the AWS credentials file, but you can also set the credentials as environment variables. Here's how to do both.
Option 1: Use AWS Credentials File
- Create or Edit the AWS Credentials File: Create a file at ~/.aws/credentials if it doesn't exist, and add your AWS credentials:
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
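Because this file contains secrets, it should be readable only by the mastodon user. A quick way to lock it down:

```bash
chmod 700 ~/.aws
chmod 600 ~/.aws/credentials
```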
- Create or Edit the AWS Config File: Create a file at ~/.aws/config if it doesn't exist, and add your desired region and endpoint (if using a custom S3-compatible service):
[default]
region = us-east-1
# Change endpoint_url to your provider's endpoint if using an S3-compatible service
endpoint_url = https://s3.us-east-1.amazonaws.com
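Note that endpoint_url in ~/.aws/config is only honored by relatively recent AWS CLI releases; on older versions you would pass --endpoint-url on the command line instead (as the script sketch above does). Either way, you can confirm the credentials and endpoint work by listing your bucket (the bucket name below is a placeholder):

```bash
aws s3 ls s3://your-backup-bucket
```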