
Time Machine backup your Bubba users

A collection of tips on how to tweak your Bubba.
Nöbbe
Posts: 13
Joined: 19 Feb 2007, 06:22
Location: Lund, Sweden


Post by Nöbbe »

I have made a script that makes the BubbaServer (it may work for your Bubba Two as well, but I don't have one) do backups in a Time Machine-like way. I used this page as a guide for the setup:

http://blog.interlinked.org/tutorials/r ... chine.html

This script will create incremental backups with hard links to earlier backups (unless it is the first run, when everything is copied). Each time you start a backup, a folder named with the current date and time is created, the backup is performed into this folder, and finally a soft link "current" is created that points at the latest backup. This has the advantage that you can remove any time folder you want: as long as a file is still linked from a different time folder it will not be deleted, and if nothing has changed a backup only creates a folder full of links (in my case, my 60 GB of data only consumes about 12 MB of links per backup).
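
The heavy lifting is done by rsync's --link-dest option. A minimal sketch of the idea (the paths here are just made-up examples):

Code: Select all

# Second snapshot: files unchanged since the first one are created
# as hard links instead of new copies
rsync -a --link-dest=/home/backup/someuser/2007-03-01T00:00:00 \
  /home/someuser/private /home/backup/someuser/2007-03-08T00:00:00
# "ls -l" shows a link count above 1 for files shared between snapshots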

The script will back up everything in the "private" folder of any user that has a file named "backup.txt" in their home folder. That file can contain exclude patterns for the backup. I have still not figured out everything about excludes and rsync, but it seems as if I must use the "private" folder as the root, meaning that if you want to exclude the folder "downloads" for the user "kalle" you do NOT add "/home/kalle/private/downloads/" to the file, but rather "/downloads/". It would be nice if someone could confirm this.
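
For example, a backup.txt might then look like this (again, assuming my observation about the root is right):

Code: Select all

# lines starting with # are ignored by rsync
# exclude the downloads folder at the top of "private"
/downloads/
# exclude any folder named "cache", wherever it is
cache/
# exclude temporary files
*.tmp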

In my case I back up everything to an external USB hard drive, on which I have created an ext3 partition mounted on /home/backup. Every user will have their backups in their own /home/backup/"user"/ folder.
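
If the drive is not prepared yet, something like the following should do it. I am assuming the drive shows up as /dev/sda1 here; check the output of dmesg after plugging it in, since the device name may differ on your box:

Code: Select all

# WARNING: mkfs erases the partition, so only do this once!
mkfs.ext3 /dev/sda1
mkdir -p /home/backup
mount -t ext3 /dev/sda1 /home/backup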

By adding the script to crontab you can make snapshot copies of your data as often as you like.

And another nice thing: rsync will preserve file permissions, owner and group, as long as you back up to a non-FAT/NTFS partition.

This is how you do it:

1. Add the normal Debian repository
In the file /etc/apt/sources.list, add the following line (as root, using for example nano):

Code: Select all

deb http://archive.debian.org/debian/ sarge contrib main non-free
2. Install rsync

Code: Select all

sudo apt-get update
sudo apt-get install rsync

3. This is the script that does the work. I am not very used to writing bash scripts, so if someone could give me guidelines regarding uppercase and lowercase in variable names, that would be very much appreciated. I have put it in the /root/bin/ folder, using the filename backup.sh, and then given it run permission (as root):

Code: Select all

chmod +x backup.sh

The script:

Code: Select all

#!/bin/sh

# Timestamp used as the folder name for this backup
date=`date "+%Y-%m-%dT%H:%M:%S"`

# File that must exist in a user's home folder if we should do a backup
backupfile="backup.txt"

# Each user should keep a backup.txt in their home root if they want their
# private folder to be backed up.

# 1. Create a list of users.
# 2. If that user has a backup.txt, back that user up.

# Note: the backup mount /home/backup shows up in this list too, but it is
# skipped below since it has no backup.txt of its own.
users="`ls /home/`"

for user in $users; do

  userhome="/home/$user/private"
  dobackupfile="/home/$user/$backupfile"

  if [ -f "$dobackupfile" ]; then
    if [ -d "$userhome" ]; then
      echo "User: $user. Backup this user."
      backuphome="/home/backup/$user"
      currentbackup="$backuphome/current"
      datebackuphome="$backuphome/$date"
      echo "$backuphome, $currentbackup, $userhome"

      # If there is no current backup yet, do a full copy.
      if [ ! -d "$currentbackup" ]; then
        echo "Creating new backup. Creating $backuphome"
        mkdir -p "$backuphome"
        rsync -aP --exclude-from="$dobackupfile" "$userhome" "$datebackuphome"
      else
        # Incremental backup: files unchanged since the previous backup
        # become hard links into it instead of new copies.
        rsync -aP --link-dest="$currentbackup" --exclude-from="$dobackupfile" "$userhome" "$datebackuphome"
      fi

      # Update the "current" link to point at the new backup
      # (-f so the very first run does not complain about a missing link).
      rm -f "$currentbackup"
      ln -s "$datebackuphome" "$currentbackup"

    else
      echo "No folder to backup, missing $userhome"
    fi
  else
    echo "User: $user."
  fi
done
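
After a couple of runs the backup drive should look roughly like this for each user (the dates are of course just examples), and restoring a file is simply a matter of copying it back:

Code: Select all

/home/backup/kalle/2007-03-01T00:00:00/private/...
/home/backup/kalle/2007-03-08T00:00:00/private/...
/home/backup/kalle/current -> /home/backup/kalle/2007-03-08T00:00:00

# restore a single file (somefile.txt is just an example) from the latest backup
cp -a /home/backup/kalle/current/private/somefile.txt /home/kalle/private/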
4. Add the script to crontab

Code: Select all

sudo crontab -e

Add the line:

Code: Select all

0 0 * * 0 /root/bin/backup.sh >> /root/bin/log.txt 2>&1
This will run the script once a week (Sundays at 00:00) and route all output, including errors, to the log.txt file.
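
To check that the entry was saved and to keep an eye on the log:

Code: Select all

sudo crontab -l
sudo tail /root/bin/log.txt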

5. Make sure all users know about the backup.txt, or simply add it for the users you want!
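
For example, to turn on backups (with no excludes at all) for the user "kalle", an empty file is enough; rsync simply reads no patterns from it:

Code: Select all

touch /home/kalle/backup.txt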

Please note that since I have a BubbaServer and back up to a USB hard drive, it is not very fast the first time. USB 2.0 full speed is 12 Mbit/s, which is 1.5 MB/s raw, and with some overhead it ends up at speeds around 700 kB/s. So the first time I ran the script it took about 36 hours for my 110 GB of data. After that I excluded my movies, which I wish I had done earlier.

But an incremental backup with no changes takes less than 10 seconds, and that is for about 30 000 files.

Improvements and corrections are most welcome!

Good Luck!