
Off-site backup

A collection of tips on how to tweak your Bubba.
stasheck
Posts: 126
Joined: 15 Jan 2014, 13:13

Off-site backup

Post by stasheck »

Since I got so much help from this forum (hat tip to Gordon, Sakaki, MouettE and Ubi :-) ), I thought I'd share my off-site backup "system" for backing up a remote Windows server.

It's going to be a long post :-)

1. What and why.

I'm doing IT for a small bookkeeping/financial company. They are running (among other things) MS SQL Server, on which 90% of their software keeps their customers' databases. Needless to say, if they lose this data, it's going to be sad. Most of it is "restorable" from hardcopies, but the amount of work...

So of course I'm backing this up. Two backups, actually:
a) the entire server, sans the last partition, gets backed up to a QNAP installed locally in the office, using standard MS Backup to an iSCSI share (a normal network share does not allow versioning). This has saved my bottom twice already.
b) all of the databases are backed up to the last partition using the internal MS SQL backup function, for versioning and quick restore.

This works well enough, but there are still 2 failure modes it does not cover:
a) something happens to the office - burglary, fire, flood, whatever
b) the staff is not reliable when it comes to IT security, so it's possible to get infected by Cryptolocker or something similar

Solution: off-site backup. Additional requirement: the backup server has to be the "active" side, i.e. the one initiating the connection, because:
a) the office network has some public static IPs, while the off-site location would be running off a DSL/cable line with a dynamic IP
b) if it's the backup server that connects, then Cryptolocker can't reach it and encrypt anything on it.

So, how to do this?

2. Prepare "source" server.

Unfortunately, the work was done quite a while ago, and I can't remember what guides I used. In general:

a) install Cygwin on the server, with the OpenSSH server
b) generate keys for the user you want to run the backup as (I think it has to be a domain-joined user, but I won't bet anything on it)
c) test that you can log in with keys
d) modify the SSH config to allow only a single user, and only key-based auth; then restart the service

Code: Select all

PasswordAuthentication no
AllowUsers <your_user>
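Steps (b) and (c) can be sketched roughly like this; "backupuser" and "source-server" are placeholders for your own names, and ed25519 is just what I'd pick today (an older Cygwin sshd may need -t rsa instead):

```shell
# Generate a passwordless key pair for the backup job:
KEYDIR="$(mktemp -d)"
ssh-keygen -q -t ed25519 -N "" -f "$KEYDIR/backup_key"

# Install the public key on the source server (commented out here,
# since it needs the real host):
# ssh-copy-id -i "$KEYDIR/backup_key.pub" backupuser@source-server

# Verify that key-based login works BEFORE disabling passwords:
# ssh -i "$KEYDIR/backup_key" -o PasswordAuthentication=no backupuser@source-server true
```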
e) set up port forwarding on your router
f) check that you can log in from outside with keys, but not with a password
g) (optional) install a script that removes the oldest SQL backups, but leaves the ones made on a Sunday (see below)

Code: Select all

#!/bin/sh

# from https://stackoverflow.com/questions/20034415/bash-delete-all-files-older-than-1-month-but-leave-files-from-mondays

export q="'"
# hack from https://unix.stackexchange.com/questions/187651/how-to-echo-single-quote-when-using-single-quote-to-wrap-special-characters-in

find "/cygdrive/<path_to_SQL_backup_dir>" -type f -mtime +90 -exec sh -c 'test $(LC_TIME=C date +%a -r "$1") = Sun || echo rm "${q}$1${q}"' -- {} \; | sh
Yes, it's kinda ugly, but it works well for my purposes.
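In case the weekday test in that find command looks cryptic: `LC_TIME=C date +%a -r FILE` prints the abbreviated weekday of FILE's modification time, and the script skips the rm when it prints "Sun". A quick local sanity check of just that part (GNU date and touch assumed):

```shell
# Give a throwaway file a known Sunday mtime and inspect it.
f="$(mktemp)"
touch -t 201602070000 "$f"          # 2016-02-07 was a Sunday
day="$(LC_TIME=C date +%a -r "$f")"
echo "$day"                         # Sun
rm -f "$f"
```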

3. Prepare backup server.

a) install the script that runs the backup job

Code: Select all

#!/usr/bin/bash

# Inspired by: https://netfuture.ch/2013/08/simple-versioned-timemachine-like-backup-using-rsync/
# Usage: timemachine.sh <label>

BACKUP_DIR="/home/company/backup"
LOG_FILE="$BACKUP_DIR/timemachine_job.log"
LOG_FILE_RM="$BACKUP_DIR/rm.log"
MINIMAL_FREE_SPACE=150000000    # df reports 1K blocks, so this is ~150 GB

if [ "$#" -ne 1 ]; then
    /usr/bin/echo "$0: Expected 1 argument, received $#: $@"
    exit 1
fi

/usr/bin/echo `/usr/bin/date`" | Starting job."  >> $LOG_FILE

/usr/bin/echo `/usr/bin/date`" | Trying to remove old backups, except for those made on Sunday."  >> $LOG_FILE
# remove old backups, except for those made on Sunday - runs on remote machine
/usr/bin/ssh -p 22 user@server ./remove_old_dumps.sh
/usr/bin/echo `/usr/bin/date`" | Old SQL dumps removed from source machine" >> $LOG_FILE

# remove oldest local backups, until there's at least 150 GB of free space for new backup
# from https://unix.stackexchange.com/questions/28939/how-to-delete-the-oldest-directory-in-a-given-directory
DISK_FREE=`/usr/bin/df $BACKUP_DIR | /usr/bin/tail -1 | /usr/bin/awk '{print $4}'`

# find latest backup, we never want to remove this one
IFS= read -r -d $'\0' line < <(/usr/bin/find $BACKUP_DIR -maxdepth 1 -type d -name '2*' -printf '%T@ %p\0' | /usr/bin/sort -r -z -n)
DIR_TO_KEEP="${line#* }"

while [[ $DISK_FREE -lt $MINIMAL_FREE_SPACE ]];
do
    /usr/bin/echo `/usr/bin/date`" | Freeing up local diskspace, current free space: $DISK_FREE KB." >> $LOG_FILE

    # from https://unix.stackexchange.com/questions/28939/how-to-delete-the-oldest-directory-in-a-given-directory
    # limit search only to directories starting with "2"
    IFS= read -r -d $'\0' line < <(/usr/bin/find $BACKUP_DIR -maxdepth 1 -type d -name '2*' -printf '%T@ %p\0' 2>/dev/null | /usr/bin/sort -z -n)
    DIR_TO_REMOVE="${line#* }"

    if [[ "$DIR_TO_REMOVE" == "$DIR_TO_KEEP" ]];
    then
        /usr/bin/echo `/usr/bin/date`" | Not removing latest backup directory $DIR_TO_REMOVE, will attempt to make backup with potentially insufficient disk space." >> $LOG_FILE
        break
    fi

    /usr/bin/echo `/usr/bin/date`" | Removing directory $DIR_TO_REMOVE." >> $LOG_FILE

    # call me paranoid, but I'd *really* like to avoid removing anything outside of $BACKUP_DIR
    if [[ "$DIR_TO_REMOVE" =~ "$BACKUP_DIR" ]];
    then
        /usr/bin/rm -rfv "$DIR_TO_REMOVE" >$LOG_FILE_RM 2>&1
        if [[ $? != "0" ]]; then
            /usr/bin/echo `/usr/bin/date`" | Removing directory $DIR_TO_REMOVE encountered issues, check $LOG_FILE_RM" >> $LOG_FILE
            /usr/bin/echo `/usr/bin/date`" | Bailing out." >> $LOG_FILE
            exit 2
        fi
    else
        /usr/bin/echo `/usr/bin/date`" | Directory to remove - $DIR_TO_REMOVE - does not include backup dir, something is wrong." >> $LOG_FILE
        /usr/bin/echo `/usr/bin/date`" | Bailing out." >> $LOG_FILE
        exit 3
    fi

    DISK_FREE=`/usr/bin/df $BACKUP_DIR | /usr/bin/tail -1 | /usr/bin/awk '{print $4}'`
done

/usr/bin/echo `/usr/bin/date`" | Free space prior to backup start: $DISK_FREE KB." >> $LOG_FILE


if [ -d "$BACKUP_DIR/__prev/" ]; then
    /usr/bin/rsync -z -a --no-perms --no-owner --no-group --delete --link-dest="$BACKUP_DIR/__prev" --exclude-from=/home/company/rsync-exclude.txt -r -t -e "ssh -p 22" user@server:/cygdrive/f/ "$BACKUP_DIR/$1"
else
    /usr/bin/rsync -z -a --no-perms --no-owner --no-group --exclude-from=/home/company/rsync-exclude.txt -r -t -e "/usr/bin/ssh -p 22" user@server:/cygdrive/f/ "$BACKUP_DIR/$1"
fi
/usr/bin/echo `/usr/bin/date`" | Rsync complete." >> $LOG_FILE

/usr/bin/rm -f "$BACKUP_DIR/__prev"
/usr/bin/ln -s "$BACKUP_DIR/$1" "$BACKUP_DIR/__prev"

/usr/bin/echo `/usr/bin/date`" | Directories re-linked, I'm done." >> $LOG_FILE
There are quite a number of built-in failsafes, each one commented in the code. It works reliably for me, but of course there's some room for improvement.
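A side note on the --link-dest trick the script relies on: files unchanged since the previous snapshot are not copied again but hard-linked into it, so every snapshot directory looks complete while unchanged data is stored only once. A minimal local illustration of the same mechanism with plain ln (GNU stat assumed):

```shell
# Create a "previous" and a "current" snapshot; hard-link the unchanged file,
# which is what rsync --link-dest does for files that did not change.
d="$(mktemp -d)"
mkdir "$d/prev" "$d/curr"
echo "payload" > "$d/prev/file"
ln "$d/prev/file" "$d/curr/file"

# Both names point at the same inode, i.e. the same data on disk:
i1="$(stat -c %i "$d/prev/file")"
i2="$(stat -c %i "$d/curr/file")"
echo "$i1 $i2"
rm -rf "$d"
```

This is also why `rm -rf` on an old snapshot is safe: it only deletes the directory entries, and the data stays around as long as a newer snapshot still links to it.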

b) (optional) install an rsync-exclude.txt file

Code: Select all

$RECYCLE.BIN/
System Volume Information/
c) create a crontab job (crontab -e)

Code: Select all

0 4 * * * /home/company/timemachine.sh `/usr/bin/date +\%Y-\%m-\%d` >/home/company/backup/timemachine.log 2>&1
Also: I spent 2 weeks before I realized I wasn't escaping "%" in crontab. Do not repeat my mistake :-)
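For the record: crontab(5) treats an unescaped "%" in the command field as a newline, and everything after the first one is fed to the command on stdin, so `date +%Y-%m-%d` silently turns into `date +`. Only crontab needs the backslashes; in an ordinary shell the two spellings are equivalent:

```shell
# \% and % are the same thing to the shell; the escaping only matters
# inside a crontab entry.
date +%Y-%m-%d
date +\%Y-\%m-\%d
```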

4. Lean back and relax, observing the backup run every day :-)
Hope it helps someone :-) All comments are welcome.
Last edited by stasheck on 11 Feb 2016, 08:01, edited 2 times in total.
Gordon
Posts: 1461
Joined: 10 Aug 2011, 03:18

Re: Off-site backup

Post by Gordon »

Nice

Might want to mention that the backup script as presented is set to back up F:\ (/cygdrive/f) and its subdirectories on the Windows server. It might be better to have the machine name/IP and the source folder as parameters to the script as well.
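A hypothetical sketch of that parameterization (the argument names and defaults here are invented, matching the hard-coded values in the script above, so the existing single-label call keeps working):

```shell
# Simulate being called as: timemachine.sh 2016-02-11
set -- "2016-02-11"

# Label stays required; host and source path fall back to the current
# hard-coded values when not given.
LABEL="${1:?usage: timemachine.sh <label> [user@host] [src_path]}"
REMOTE="${2:-user@server}"
SRC="${3:-/cygdrive/f/}"

echo "$REMOTE:$SRC -> $LABEL"   # user@server:/cygdrive/f/ -> 2016-02-11
# ...and later in the script:
# rsync ... -e "ssh -p 22" "$REMOTE:$SRC" "$BACKUP_DIR/$LABEL"
```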

I actually do it the other way round: I let the Windows machine decide when to start the backup. It means I do not have to run an SSH server on the Windows machine, but your method will work better when backing up multiple machines - serving them one at a time rather than having them clog up the resources of your B3 by all running at the same time. I also like that "timemachine" option in there; I currently have my (Windows) script create the hard links prior to running rsync. Thanks for bringing this to my attention.

I don't know though if using compression is actually the sensible thing to do (although I have that in my command as well). Depending on the line speed - which will be no issue at all in any LAN environment - you're likely to see this run a lot faster without compression.
stasheck
Posts: 126
Joined: 15 Jan 2014, 13:13

Re: Off-site backup

Post by stasheck »

I measured both with and without compression; not much difference, actually. For me, the link seems to be the limiting factor.

Yes, the script could use a bit more parametrization, I might add some when I need it :-)

I think I also explained why I do it this way around. As usual, I had a requirement that basically no one else on the Internet had :-) It seems to me that the entire US of A runs off public static IPs... :-) Not so much in Poland.

I actually have another machine to be set up like this, but in their case BOTH the company and the backup machine run behind NAT on dynamic IPs... I guess I'll have to "glue" them together using my own public IP and 2 SSH tunnels.
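That double-NAT "glue" could look roughly like this; hostnames, user names and ports below are invented, it needs one machine with a public IP, and the -J jump option needs OpenSSH 7.3 or newer. Just a sketch, not something I've run:

```
# On the Windows/Cygwin box: keep a reverse tunnel open that exposes its
# sshd on the public machine's loopback, port 2222:
ssh -N -R 2222:localhost:22 tunnel@public-host

# On the backup server: reach the Windows sshd by jumping through the
# public machine and into that reverse tunnel:
ssh -J tunnel@public-host -p 2222 backupuser@localhost
```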
Gordon
Posts: 1461
Joined: 10 Aug 2011, 03:18

Re: Off-site backup

Post by Gordon »

stasheck wrote:As usual, I had a requirement that basically noone in the rest of the Internet had :-) Seems to me that the entire US of A runs off public static IPs... :-) Not so much in Poland.
That seems exaggerated, because then there would be no demand for dyndns services such as Bubba easyfind. Looking at the mess that Microsoft makes of their Windows platform, it does appear though that many people run their workstations with a public IP assigned to them. In fact, here in the Netherlands that was also the case with people who got their internet from a cable provider. Not any more though - they all supply masquerading (wifi) routers that allow multiple computers to be attached.

Liking the idea of turning control over to the B3, I did stumble on some warnings though about running a Cygwin-based SSH server on Windows. It seems to mess with Windows Update or some such. Did you encounter such problems?
stasheck
Posts: 126
Joined: 15 Jan 2014, 13:13

Re: Off-site backup

Post by stasheck »

Frankly, no. I had my own doubts about running Cygwin, but I've been testing it for the last half a year on my production workstation and it doesn't seem to cause any issues. Care to share any links?
Gordon
Posts: 1461
Joined: 10 Aug 2011, 03:18

Re: Off-site backup

Post by Gordon »

Feeling a bit stupid now about not saving the links, but I went straight for another SSH server for Windows: FreeSSHd. Bonus: it integrates with Windows security. This one won't allow me to initiate rsync remotely though. It may still work if I use ssh to tell the Windows box to initiate rsync, but I have not made any such attempt yet.
stasheck
Posts: 126
Joined: 15 Jan 2014, 13:13

Re: Off-site backup

Post by stasheck »

Why won't it allow rsync? I don't think that rsync on remote machine is a must?
Gordon
Posts: 1461
Joined: 10 Aug 2011, 03:18

Re: Off-site backup

Post by Gordon »

I don't know. Some cryptic message about a protocol mismatch, but if I trigger rsync from the Windows machine it runs through without issues. I also don't really know how rsync integrates with ssh, but it is clear that the FreeSSHd server breaks it.