You can adapt this guide and use it for another database.
nano ~/.my.cnf
[client]
user=your_username
password=your_password
chmod 600 ~/.my.cnf
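To confirm that the credentials file is being picked up, you can ask the client tools to print the defaults they read from option files (this assumes the mysql client is installed on the host; note that it echoes the password to the terminal):
mysql --print-defaults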
mkdir ~/backups
Replace "containerid" with your container id
docker cp ~/.my.cnf containerid:/root/.my.cnf
docker exec -it containerid chmod 600 /root/.my.cnf
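As a quick check that the credentials work inside the container, you can run a trivial query through the container's mysql client (this assumes an official MySQL image, which ships the client and reads /root/.my.cnf when running as root):
docker exec containerid mysql -e "SELECT 1;"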
curl https://rclone.org/install.sh | sudo bash
Run the command below and follow the interactive steps with your cloud provider credentials:
rclone config
After configuring, you can view and edit the config file at ~/.config/rclone/rclone.conf
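For reference, an entry in that file looks roughly like the sketch below. This is an example S3-style remote; the remote name and keys are placeholders, and the exact fields depend on the provider you selected during rclone config.
[your_rclone_storage_name]
type = s3
provider = AWS
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
region = us-east-1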
You can download the ready-made script files from the scripts folder, but if you want to copy them manually, follow these steps:
nano ~/backups/db-backup.sh
If not using Docker:
#!/bin/bash
FILENAME=$(date +%Y-%m-%dT%H:%M:%S).sql
DATABASE=database_name
BUCKET=bucket_name
BK_PATH=~/backups
RCLONE_STORAGE=your_rclone_storage_name
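# --single-transaction gives a consistent snapshot without locking InnoDB tables; --quick streams rows instead of buffering them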
mysqldump --single-transaction --skip-lock-tables --quick $DATABASE > $BK_PATH/$FILENAME
gzip $BK_PATH/$FILENAME
rclone move $BK_PATH/$FILENAME.gz $RCLONE_STORAGE:$BUCKET/
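If you also want to prune old backups with this non-Docker script, the same retention line used in the Docker version below can be appended here (30d keeps roughly the last month; adjust as needed):
rclone delete --min-age 30d $RCLONE_STORAGE:$BUCKET/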
With Docker:
#!/bin/bash
FILENAME=$(date +%Y-%m-%dT%H:%M:%S).sql
DATABASE=database_name
BUCKET=bucket_name
CONTAINER=container_name
BK_PATH=~/backups
RCLONE_STORAGE=your_rclone_storage_name
# no -it here: cron has no TTY, and a TTY would corrupt the dump output
docker exec $CONTAINER mysqldump --single-transaction --skip-lock-tables --quick $DATABASE > $BK_PATH/$FILENAME
gzip $BK_PATH/$FILENAME
rclone move $BK_PATH/$FILENAME.gz $RCLONE_STORAGE:$BUCKET/
rclone delete --min-age 30d $RCLONE_STORAGE:$BUCKET/
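If you are unsure what that retention line will remove, rclone's --dry-run flag lists the matching files without deleting anything (placeholders match the variables above):
rclone delete --dry-run --min-age 30d your_rclone_storage_name:bucket_name/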
chmod +x ~/backups/db-backup.sh
~/backups/db-backup.sh
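After a test run, you can confirm the compressed dump reached the bucket by listing it (placeholders match the variables in the script):
rclone ls your_rclone_storage_name:bucket_name/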
crontab -e
In this example the job will run every day at midnight. If you want to configure other schedules, visit Crontab Guru for help.
0 0 * * * ~/backups/db-backup.sh >/dev/null 2>&1
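If you prefer to keep a log instead of discarding output, a variant like this appends stdout and stderr to a file (the log path is just an example):
0 0 * * * ~/backups/db-backup.sh >> ~/backups/backup.log 2>&1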