I admit I don't back up my data enough, at least not on my servers. My workstations use Time Machine, but my servers tend to go without backups. Today I put together a simple backup script for my webserver.
The script looks like this.
#!/bin/bash
# Date stamp shared by all backup files, e.g. 27-10-2013
DATE=$(date +'%d-%m-%Y')

# Dump the WordPress and Piwik databases
mysqldump -u wordpress --password=[password] wordpress > /root/backup/wordpress-$DATE.sql
mysqldump -u piwik --password=[password] piwik > /root/backup/piwik-$DATE.sql

# Tar and compress the webserver document root
tar czf /root/backup/htdocs-backup-$DATE.tar.gz /opt/local/share/httpd/htdocs/

# Copy everything to the NAS, then delete the local copies;
# the && ensures nothing is removed if the transfer fails
scp /root/backup/* [user]@[destination]:/volume1/homes/[username]/backup && rm -rf /root/backup/*
I've blanked out the usernames and passwords, but in short: it dumps the WordPress and Piwik databases, tars and compresses the webserver root, and then uploads everything to my Synology NAS via scp. The last step deletes the files from the webserver.
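To run the backup automatically, a crontab entry on the webserver along these lines would do it. The time and the script path here are assumptions for illustration, not part of my actual setup:

```shell
# Hypothetical crontab entry: run the backup script every night at 03:30
30 3 * * * /root/bin/backup.sh
```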
To make sure the backups don't eat too much space on the NAS, I keep at most 14 days' worth. This command runs via crontab on the NAS and deletes every backup older than 14 days.
find /volume1/homes/[username]/backup -type f -mtime +14 -exec rm {} \;
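Before trusting a deletion rule like that, it can help to preview what `find` would match. A small sketch, using a throwaway directory and made-up file names (GNU `touch -d` is assumed for back-dating):

```shell
# Create a scratch directory with one old and one fresh file
backup_dir=$(mktemp -d)
touch -d '20 days ago' "$backup_dir/wordpress-old.sql"  # older than 14 days
touch "$backup_dir/wordpress-new.sql"                   # brand new
# -print instead of -exec rm: list what WOULD be deleted
find "$backup_dir" -type f -mtime +14 -print
```

Only the 20-day-old file is printed; swapping `-print` back to `-exec rm {} \;` performs the actual cleanup.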