
Backup taking VERY long

Got problems with your B2 or B3? Share and get helped!
fmvaz
Posts: 13
Joined: 27 Nov 2009, 05:42

Backup taking VERY long

Post by fmvaz »

Hi,

I am backing up my 152 GB of data on the Bubba to an external USB HD, using the backup function in the file manager. I started on the 17th of December and 10 days later it is still going strong: more than 9000 files of 5 MB each, amounting to 46 GB so far.
Is it normal that this backup process takes so incredibly long?
Is there a way to speed this up?

This is not a very attractive backup solution... I wonder how long the incremental backups will take...

/F
RandomUsername
Posts: 904
Joined: 09 Oct 2009, 18:49

Re: Backup taking VERY long

Post by RandomUsername »

The built-in backup facility uses Duplicity as its back end and it is indeed very, very slow, even when backing up over USB.
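
Not that it helps with the speed, but for the curious, a manual Duplicity run looks roughly like this. The paths are made up for illustration, and I haven't checked exactly which options the File Manager passes; the 5 MB volume size would match the .difftar.gz files it produces:

    # Sketch of a manual full backup with Duplicity (paths are illustrative).
    # --volsize sets the size of each backup volume in MB.
    duplicity full --no-encryption --volsize 5 \
        /home file:///mnt/usb/bubba-backup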

There's not much you can do about it except wait it out or use something else (rsync for example, but you'd have to do that from the command line).
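
The bare-bones command-line version would be something like this (the mount point is a guess; check where your USB disk actually lives):

    # Straight copy of /home to the external disk
    # (mount point is hypothetical -- check yours with 'mount')
    rsync -a /home/ /mnt/usb/backup/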

The Excito guys are aware that the backup option in the File Manager isn't adequate and are exploring alternatives. See here: http://forum.excito.net/viewtopic.php?f=9&t=2035
fmvaz
Posts: 13
Joined: 27 Nov 2009, 05:42

Re: Backup taking VERY long

Post by fmvaz »

Thanks for pointing that out, RandomUsername. I saw the thread but didn't read it carefully enough to catch that information. I'm going to sit and wait for the backup to finish, but given the bugs in Duplicity and the suboptimal Bubba configuration, I don't have much faith that it will be a reliable backup. I'll just do ordinary copy backups until the Bubba solution is available.
Golodh
Posts: 6
Joined: 17 Jul 2007, 10:16
Location: London

Processor seems to be the bottleneck ...

Post by Golodh »

Sorry if I'm saying something obvious, but I just stumbled onto the "slow backup" problem too.

I have 350 GB to back up and I tried doing that via the Bubba administrator menu. I'm using a 1 TB WD disk in an external enclosure, connected to my Bubba via an eSATA cable.

After about 13 hours it has written about 16 GB to disk. I found that the volume where Bubba had mounted the external disk contains lots and lots of 5 MB ".difftar.gz" files, of which about 2-4 were written per minute; say 15 MB/min of compressed files written to the external disk. Now, those 350 GB consist of approximately 50% zip files (which won't compress much further) and 50% executables (which ought to compress), ASCII files (which ought to compress fine), pictures, etc. (which probably won't compress much).

Which means I can estimate the size of a complete backup at about 0.5 * 350 GB (for the incompressible stuff) + 0.5 * (350 GB / 10) (generously assuming a 10:1 compression ratio on all the rest) = 175 GB + 17.5 GB = 192.5 GB. Assuming a constant rate of 15 MB/min, that gives 192.5 GB / 15 MB/min (with 1024 MB per GB) ≈ 13141 minutes, or about 9.1 days. In other words: it takes too long.
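
Spelled out, as a quick sanity check you can paste into a shell:

    # Estimated backup size: 0.5*350 + 0.5*(350/10) = 192.5 GB
    # Time at 15 MB/min, converted from minutes to days:
    echo "scale=1; 192.5 * 1024 / 15 / 60 / 24" | bc    # -> 9.1 days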

Next I spent a little time trying to identify the bottleneck. The eSATA connection should carry at least 1.5 Gbit/s (the slowest SATA rate, roughly 150 MB/s, or about 9 GB/min), so that's probably not the problem.

Then I ran top from the console on Bubba and found a CPU utilization of about 97%. To me it looks as if the backup process is CPU-bound. The people at Excito probably knew that already, but I thought I'd mention it just in case. It might be very hard to find a standard algorithm (like gzip) that will let the ARM chip compress hundreds of gigabytes of data at acceptable speed. I know nothing of the Duplicity software that's used, but if it relies on tar and gzip it looks like a non-starter, no matter what bugs are fixed. Perhaps it's possible to use a "quick" gzip setting that doesn't compress much, or perhaps it's sufficient just to calculate checksums and not do any compression at all. At worst, Bubba's processor might need to be upgraded (at the cost of higher power consumption), or someone would have to write a really zippy ARM-optimized compression routine.
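
If anyone wants to check this on their own box, this is essentially what I looked at (standard tools, run over SSH on the Bubba):

    # Snapshot of the busiest processes, batch mode, one iteration:
    top -b -n 1 | head -n 15
    # Or sorted by CPU usage directly:
    ps aux --sort=-%cpu | head -n 5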


In the meantime I have to do a full backup (I just found out I probably have one of the affected WD Green HDDs, and I never bothered with backups before... OK, OK, I know, I'm guilty of carelessness and negligence). As an interim solution I'll do an "ordinary copy" backup (as fmvaz calls it): I'm simply going to "back up" my Bubba's disk to the (ext3-formatted) external disk using "cp". At first glance Bubba's CPU utilization sits around 80% during the copy operation, so the bottleneck is no longer the CPU.
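
Concretely, something along these lines; the target path is just where my external disk happens to be mounted, so adjust it for your setup:

    # Archive-mode copy: -a preserves permissions, ownership,
    # timestamps and symlinks.
    cp -a /home /mnt/external/backup-2009-12/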


Cheers,

Golodh
RandomUsername
Posts: 904
Joined: 09 Oct 2009, 18:49

Re: Backup taking VERY long

Post by RandomUsername »

The "difftar" files are produced by Duplicity.

You're right that the CPU is probably the bottleneck here. I'd suggest you use rsync (with the -a option) rather than cp to do a straight copy. It resumes interrupted backups much more efficiently (it only copies what has changed), and it preserves permissions, ownership and special files that a plain cp can miss.

rsync doesn't compress the data it writes to disk, so it's quite fast. You can use the -z option to compress data in transit (useful over an Internet connection, for example), but I wouldn't bother if all you're doing is transferring it over eSATA.
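
A minimal sketch of both variants; the paths and hostname are placeholders:

    # Local copy to the eSATA disk; -a (archive) preserves permissions,
    # ownership, timestamps and symlinks. The trailing slash on the
    # source copies the *contents* of /home rather than /home itself.
    rsync -a /home/ /mnt/esata/backup/

    # Over a network link, add -z to compress data in transit:
    rsync -az /home/ user@remotehost:/backup/bubba/
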
pa
Posts: 308
Joined: 06 Dec 2006, 04:38
Location: Sweden
Contact:

Re: Backup taking VERY long

Post by pa »

Hi,

We agree that the backup solution for Bubba is not optimal for backing up tons of relatively static data. A "simple" rsync job would be a better solution for that purpose, and it is already on our "requested features" list; it's just that time, unfortunately, is not as endless as our wish list...
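
To give an idea of what such a job could look like (purely illustrative; the schedule and paths are not what a future release would ship):

    # Example /etc/cron.d/backup entry: nightly rsync at 03:00.
    # --delete makes the copy mirror the source, so deletions
    # propagate to the backup as well.
    0 3 * * * root rsync -a --delete /home/ /mnt/external/backup/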

The main goal of the backup was to secure active documents and folders that are used and changed quite often, with the possibility of keeping a history of the backups, not just a snapshot.

Regards
PA Nilsson, Excito.