# Backup Backup Backup

## tuppe666

I misunderstood catalyst. I got all excited, thinking it was a simple way to back up my hard drive programs, my ADSL SpeedTouch modem (which is fragile), my... other bits and bobs that vanished last time, when my lost+found contained all my programs. And then, pants, I don't think it does what I really want  :Sad:  So I wondered: what is the best way of backing up the programs on my hard drive, so I can get myself back up and running in Gentoo as fast as possible? I am surprised there isn't a standard program!

Love and Kisses

Moi

----------

## spamspam

Tar

----------

## NeddySeagoon

tuppe666,

tar to create the archive and bzip2 to compress it if you need the space.
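A minimal sketch of that approach (paths and filenames here are just placeholders standing in for whatever you want to back up):

```shell
# Create a demo tree to stand in for the data being backed up
mkdir -p /tmp/bk-demo/data
echo "some config" > /tmp/bk-demo/data/settings.conf

# -p preserves permissions; -C changes directory before archiving
tar cpf /tmp/bk-demo/backup.tar -C /tmp/bk-demo data

# Compress the archive if you need the space
bzip2 -f /tmp/bk-demo/backup.tar   # produces backup.tar.bz2
```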

----------

## spamspam

It might also be a really good idea to copy the tarball to offline media such as a tape, Zip Disk, or CDR/DVDR.

----------

## billkr

Is there a way to make tar break up a large archive into chunks of a specified size?  Specifically, can I use tar to break up my photo collection into 700 MB portions to burn onto CD?  I have taken a look at man tar; there is a --tape-length option, but I'm not sure if this is what I'm looking for.

Thanks for any help.

----------

## FrithjofEngel

 *billkr wrote:*   

> Is there a way to make tar break up a large archive into chunks of a specified size?  Specifically, can I use tar to break up my photo collection into 700 MB portions to burn onto cd?  I have taken a look at man tar, there is a --tape-length option, but I'm not sure if this is what I'm looking for.
> 
> 

 

It is  :Smile: 
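For reference, this is GNU tar's multi-volume mode: -M turns it on and -L sets the volume size in units of 1024 bytes, so 700 MB is roughly -L 716800. A small-scale sketch (tiny sizes so it actually produces two volumes; when enough -f options are given up front, tar switches volumes without prompting):

```shell
# Make ~30 KiB of data so two tiny 20 KiB volumes are needed
mkdir -p /tmp/mv-demo && cd /tmp/mv-demo
dd if=/dev/zero of=photos.bin bs=1024 count=30 2>/dev/null

# -M: multi-volume archive, -L 20: 20 KiB per volume
tar -c -M -L 20 -f part1.tar -f part2.tar photos.bin

# Restore by naming the volumes in the same order
mkdir -p restore && cd restore
tar -x -M -f ../part1.tar -f ../part2.tar
```

Note that multi-volume archives cannot be compressed, so you would bzip2 each volume afterwards if needed.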

----------

## bch

Another option is the split command. 

http://www.gnu.org/software/coreutils/manual/html_chapter/coreutils_5.html#SEC20

With this, you can split the tar file into arbitrarily sized pieces, burn these to CDR or DVDR, then use cat to join them back together.
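A small-scale sketch of the split-and-rejoin round trip (sizes shrunk for illustration; for 700 MB CDs you would use something like `split -b 700m`):

```shell
mkdir -p /tmp/split-demo && cd /tmp/split-demo
dd if=/dev/urandom of=photos.tar bs=1024 count=64 2>/dev/null  # stand-in archive

# Cut into 20 KiB pieces named photos.tar.part_aa, _ab, ...
split -b 20k photos.tar photos.tar.part_

# Later: concatenate the pieces back in order and verify
cat photos.tar.part_* > rejoined.tar
cmp photos.tar rejoined.tar
```

The suffixes sort alphabetically, so the shell glob feeds them to cat in the right order.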

brent

----------

## tam1138

 *bch wrote:*   

> Another option is the split command. 
> 
> http://www.gnu.org/software/coreutils/manual/html_chapter/coreutils_5.html#SEC20
> 
> With this, you can split the tar file into arbitrary size then burn 
> ...

 

The bummer with this method is that you need to reassemble all of the parts in order to retrieve a single file (even if that file is contained completely within a single part).

----------

## spacejock

Tar/bz works, but isn't there a more efficient way?

E.g. Use zip to create an archive of /etc

Use zip -u weekly to add changed files 

use zip ??? to remove files no longer present on the source.

My FULL-etc.tar.gz files are only 8 MB or so, but my FULL-usr.tar.gz is over 2 GB.  Re-creating that file every week when 99% of it is the same seems like a big waste of time. 

I also have an rsync command which duplicates important folders onto a second drive nightly, weekly, and monthly, but I use a USB disk to carry the entire system backup, with daily increments from 3 servers, everywhere I go - about 5 GB per server. It's VFAT formatted, and I don't want to make it ext3, so I can't just rsync files onto it and preserve attributes.  Also, I plug this disk in and grab the changed tar.bz2 files; it's not going to sit there for 2 hours waiting to back everything up.

I'm sure there's a way of listing files in the zip, checking if they exist (test -f?) and if not, removing them from the zip. Then add changed files back into the zip to finish off.  Daily increments can just start with a new zip.  I'll have to start investigating...

Cheers

Simon

----------

## n3mo

```
mkdir /backup

cd /

tar cpvf /backup/filename.tar --directory / --exclude=proc --exclude=mnt --exclude=backup --exclude=dev --exclude='*/lost+found' .
```

you can put that in a script (except for the mkdir command) and run it weekly. You could also improve the script using the --tape-length option or incremental backups.

----------

## nobspangle

you can't use zip as it won't store important stuff like links and file permissions.

Tar is a very versatile backup program, and will allow you to do incremental backups and split volume backups.

If you're short on space you can also make it bzip2 the archive by simply adding the j option.
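For example, the same create and list commands with the j flag added (filenames are placeholders):

```shell
mkdir -p /tmp/bz-demo && cd /tmp/bz-demo
echo "hello" > note.txt

# j routes the archive through bzip2 on the fly
tar cjpf backup.tar.bz2 note.txt

# The same flag is used to list or extract
tar tjf backup.tar.bz2
```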

----------

## puke

Mondo Rescue allows you to build ISOs to image your linux box.

----------

## spacejock

Thanks for the reply.

From man zip:

```
-y     Store symbolic links as such in the zip archive,
       instead of compressing and storing the file
       referred to by the link (UNIX only).
```

However, you're right about the file permissions.

I already use tar for incremental backups, and I use the z option (for gz) because it seems to be a lot faster than bz2 compression, even if the files are a bit bigger.

What I was getting at is that my system goes through a massive routine, re-creating several 1 GB+ files, when most of the content is the same.  I do nightly incrementals; I was just looking for a way to cut out so much activity on my server.

In fact, I wrote a script which delves into the first-level directory (e.g. var) and then backs up each subfolder tree to a separate file - this splits some of the very large files up and makes copying the files over the network a bit easier.  I guess another option would be to schedule the weekly backup of /var on Monday, /usr on Tuesday, etc.
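GNU tar's --listed-incremental mode is one way to cut that re-creation down: a snapshot file records what was already backed up, so subsequent runs archive only what changed. A sketch with illustrative paths:

```shell
mkdir -p /tmp/inc-demo/var && cd /tmp/inc-demo
echo "old" > var/existing.log

# Level-0 (full) backup; snapshot.snar records file states
tar czf full.tar.gz --listed-incremental=snapshot.snar var

# Something changes between runs...
echo "new" > var/added.log

# This run stores only the new file (plus directory metadata)
tar czf incr.tar.gz --listed-incremental=snapshot.snar var
```

Restoring means extracting the full archive first, then each incremental in order.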

Cheers

Simon

----------

## Janne Pikkarainen

You might also want to take a look at rsync. It transfers only the changed files and with some shell magic it's also possible to do incremental backups with it.

http://rsync.samba.org/examples.html

----------

## spacejock

"I also have an rsync command which duplicates important folders onto a second drive nightly, weekly, monthly,"

Actually, that should have said 'script', not 'command'.  But I'm using rsync, and it's good.  I wrote something similar for Windows recently - nowhere near as many features, but it allows me to sync important folders onto my backup disks based on file modification time.

Cheers

Simon

----------

## eelleemmeenntt

Hey, I have two bash scripts that work differently for backup. One uses rsync and the other rdiff-backup. Feel free to use them. I hope they help......

http://publish.uwo.ca/~nelkadri/

----------

## Kraymer

Hi there!

I was looking for something similar and didn't want to stick with the tar-bzip2 solution. Searching portage with '-s backup', I found rdiff-backup, which looks quite interesting, although your data won't be compressed automatically.

Sebastian

----------

## blscreen

I use a rdiff-backup server in my small home network and it has saved me several times   :Smile: 

rdiff-backup mirrors your backup data uncompressed. But if a file changes, rdiff-backup saves the new version together with the compressed differences between the versions, so you can restore files from the past.

The idea of not compressing the main mirror is that in case of a complete loss you can use the backup data as a readonly fileserver in seconds.

The drawback is that you need more space for the initial backup, but after that the backup tree will grow slowly. I have a 12 GB backup partition for user data, and I can keep increments going back several months.

----------

## rzZzn

I use webmin

----------

## mazirian

I second blscreen's recommendation of rdiff-backup; that's a great package.

cpio is a useful tool as well.  From what I understand, it may have some advantages over tar if an archive gets corrupted.  That may be BS, I don't know.

I just started using it to back up my home directory with the following:

```
find /home/bowman | cpio -o -Hnewc | bzip2 > perciplex-home-bowman-$(date +%F).cpio.bz2
```

----------

