A cronjob on a server rsyncs all files from different locations to a single folder. Then that folder gets versioned and backed up again, so that we can go back in time. Optionally, everything can be encrypted and sent to cloud storage for additional safety.
So how does it work? First of all, some configuration. Let's create backup_config.sh:
```shell
#!/bin/bash

BACKUPDISK=/home/backupdisk

# the backup folder
BACKUPDIR=$BACKUPDISK/backup

# folders by date and time
ARCHIVEDIR=$BACKUPDISK/archive

RSYNC_LOCK_FILE=/tmp/rsync.lock

# this script will populate the archive
INCREMENTAL=/home/s2/bin/backup/incremental.sh

# how much the disk can be full before deleting old archives
MAX_PERCENT_USED=85
```
Then we need to actually back up some stuff to $BACKUPDIR. rsync_black.sh is the script that backs up my Windows desktop PC (its name is black). The script connects via ssh to the remote PC and backs up its files into $BACKUPDIR.
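A minimal sketch of what rsync_black.sh can look like; the remote user, the cygwin-style source path, and the flock-based locking are illustrative assumptions, not the original script:

```shell
#!/bin/bash
# Sketch of rsync_black.sh -- an assumed implementation, not the original.
. /home/s2/bin/backup/backup_config.sh

# use $RSYNC_LOCK_FILE so two backup runs never overlap
# (flock is an assumed choice; any locking scheme works)
exec 9>"$RSYNC_LOCK_FILE"
flock -n 9 || exit 0

# pull the desktop's files over ssh into its own subfolder of $BACKUPDIR;
# rsync-wrapper.sh stands in for plain rsync, as explained next
/home/s2/bin/backup/rsync-wrapper.sh -az --delete \
    -e ssh user@black:/cygdrive/c/Users/ "$BACKUPDIR/black/"
```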
The script uses rsync-wrapper.sh because sometimes files on the running desktop vanish while being backed up, and we want to ignore that error. So we don't call rsync directly; we wrap it with rsync-wrapper.sh:
```shell
#!/bin/bash

REAL_RSYNC=/usr/bin/rsync
IGNOREEXIT=24
IGNOREOUT='^(file has vanished: |rsync warning: some files vanished before they could be transferred)'

# If someone installs this as "rsync", make sure we don't affect a server run.
for arg in "${@}"; do
    if [[ "$arg" == --server ]]; then
        exec $REAL_RSYNC "${@}"
        exit $? # Not reached
    fi
done

set -o pipefail

# This filters stderr without merging it with stdout:
{ $REAL_RSYNC "${@}" 2>&1 1>&3 3>&- | grep -E -v "$IGNOREOUT"; ret=${PIPESTATUS[0]}; } 3>&1 1>&2

if [[ $ret == $IGNOREEXIT ]]; then
    ret=0
fi

exit $ret
```
Like rsync_black.sh above, we can create more scripts to back up other computers. For example, rsync_31337.it.sh backs up folders on a remote server.
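Again a hedged sketch; the remote user and the folder list are illustrative assumptions:

```shell
#!/bin/bash
# Sketch of rsync_31337.it.sh -- an assumed implementation, not the original.
. /home/s2/bin/backup/backup_config.sh

exec 9>"$RSYNC_LOCK_FILE"
flock -n 9 || exit 0

# each remote folder lands in its own subtree under $BACKUPDIR/31337.it
for dir in etc var/www home; do
    mkdir -p "$BACKUPDIR/31337.it/$dir"
    /home/s2/bin/backup/rsync-wrapper.sh -az --delete \
        -e ssh "user@31337.it:/$dir/" "$BACKUPDIR/31337.it/$dir/"
done
```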
Now we have all the folders from all our remote computers in $BACKUPDIR. Next we need to version them; $INCREMENTAL, i.e. incremental.sh, is responsible for that.
In $ARCHIVEDIR we create many folders named by date and time, each containing our backup at a given point in time, so we can recover everything in case of data loss.
When our backup disk reaches $MAX_PERCENT_USED, we need to clean up by deleting the oldest archives. To do that, we use remove_old.sh.
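A minimal sketch of remove_old.sh, assuming snapshot folders whose names sort chronologically (the real script may differ; the temp-dir defaults exist only so the sketch runs standalone):

```shell
#!/bin/bash
# Sketch of remove_old.sh -- an assumed implementation, not the original.
# While the backup disk is fuller than $MAX_PERCENT_USED, delete the oldest
# dated snapshot under $ARCHIVEDIR.
BACKUPDISK=${BACKUPDISK:-$(mktemp -d)}
ARCHIVEDIR=${ARCHIVEDIR:-$BACKUPDISK/archive}
MAX_PERCENT_USED=${MAX_PERCENT_USED:-85}

disk_usage() {
    # percentage of the disk in use, as a bare number like "42"
    df -P "$BACKUPDISK" | awk 'NR==2 { sub(/%/, "", $5); print $5 }'
}

while [ "$(disk_usage)" -gt "$MAX_PERCENT_USED" ]; do
    # snapshot names sort chronologically, so the first one is the oldest
    OLDEST=$(ls -1d "$ARCHIVEDIR"/*/ 2>/dev/null | sort | head -n 1)
    [ -z "$OLDEST" ] && break   # nothing left to delete, give up
    rm -rf "$OLDEST"
done
```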
So now, if we run rsync_31337.it.sh, all the files from the computer 31337.it get copied to $BACKUPDIR, and incremental.sh will then create a new folder in the archive, representing our files at this specific point in time.
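As mentioned at the start, cron drives the whole thing. An illustrative crontab (the times and script paths are assumptions) could tie the pieces together like this:

```
# pull remote machines, then snapshot, then prune -- example schedule only
30 3 * * * /home/s2/bin/backup/rsync_black.sh
40 3 * * * /home/s2/bin/backup/rsync_31337.it.sh
30 4 * * * /home/s2/bin/backup/incremental.sh
30 5 * * * /home/s2/bin/backup/remove_old.sh
```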