# Remote backup via SSH?

## rev138

Hi,

I have an idea, and I'm wondering if it's possible. If not, I welcome alternate suggestions.

I would like to back up data on one of my servers to another. The simplest approach I can think of is to write a script which creates a tarball of the files and then copies them to the remote server via SCP. However, this is problematic because the originating server needs enough disk space both for its own data and for the backup tarball (at least temporarily).

It would be great if I could "stream" the tar output across SSH (or SCP) so that the tarball is written to the remote server directly. I know you can tunnel network services through SSH, but what about things like this?

Thoughts?

----------

## rev138

```
$ tar -cjvpf - /path/to/files | ssh user@destination "(cat - > /destination/path/backup.tbz2)"
```

Works  :Smile: 

----------

## mudrii

I think scp would be easier to set up.
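For example (untested; the paths are placeholders), a simple two-step version:

```
# create the tarball locally, copy it over, then remove the local copy
tar -cjpf /tmp/backup.tbz2 /path/to/files
scp /tmp/backup.tbz2 user@destination:/destination/path/
rm /tmp/backup.tbz2
```

though this still needs temporary local disk space for the tarball.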

Regards

----------

## rev138

 *mudrii wrote:*   

> I think scp would be easier to set up.
> 
> Regards

 

Feel free to offer a sample command line then  :Smile: 

----------

## rev138

I have another question:

I'm also trying to stream a mysql dump over ssh. The following code works, but I'd like to compress the data dump before it's tunnelled through ssh, for obvious reasons.

```
 $ mysqldump -u user --password=password database | ssh user@server "(cat - | gzip > database.sql.gz)"
```

I'm open to using alternatives to gzip, if they get the job done.

Suggestions?

Thanks.

----------

## JeliJami

maybe you can mount the destination directory over ssh, and tar/copy as if it were local

tools that can be used:

sshfs-fuse

shfs

lufs (if you're not running amd64)

non-tested code:

```
mkdir server

sshfs user@server:/home/user/backupdir server/

tar -cjvpf server/mybackup.tar.bz2 /path/to/files

```

----------

## GNUtoo

you can also tunnel any connection you want with ssh...

but i never tried it

----------

## Hu

 *rev138 wrote:*   

> I have another question:
> 
> I'm also trying to stream a mysql dump over ssh. The following code works, but I'd like to compress the data dump before it's tunnelled through ssh, for obvious reasons.
> 
> ```
> ...

 

Try:

```
 $ mysqldump -u user --password=password database | gzip | ssh user@server "cat >database.sql.gz"
```

Note that in your proposed command line, you have a useless use of cat.  Since gzip reads from stdin anyway, asking cat to copy stdin to stdout is a waste of a process and, well, useless.  :Smile: 

----------

## rev138

Thanks Hu, that works like a charm  :Smile: 

And thanks for the tip on "cat"  :Wink: 

----------

## Mad Merlin

SSH can apply compression transparently to any stream, see the ssh -C option.
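For instance (untested sketch, using the same placeholder paths as above), this compresses the stream in transit while the stored file stays uncompressed:

```
# -C compresses the ssh stream on the wire; the remote file is plain tar
tar -cvpf - /path/to/files | ssh -C user@destination "cat > /destination/path/backup.tar"
```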

----------

## GNUtoo

 *Mad Merlin wrote:*   

> SSH can apply compression transparently to any stream, see the ssh -C option.

 

i thought that it was automatic when adding the 

```
Compression yes
```

 parameter

----------

## Mad Merlin

 *GNUtoo wrote:*   

>  *Mad Merlin wrote:*   SSH can apply compression transparently to any stream, see the ssh -C option. 
> 
> i thought that it was automatic when adding the 
> 
> ```
> ...

 

Those parameters are one and the same, except that one goes in the config file and one goes on the command line.

----------

## nom de plume

Anyone got some fancypants ideas on how to do this with incremental backups?

----------

## bunder

backuppc uses tar, and can use ssh, and also does incremental backups.  maybe their scripts would be useful.   :Wink: 

cheers

----------

## apryan

I'm using rdiff-backup, which uses ssh and scp to transfer files. It works like a charm.

-anthony

----------

## GNUtoo

there is also another, more complex option:

tar -> untar

rsync on the untarred files

retar...
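A rough sketch of that scheme (untested; the host and paths are placeholders):

```
# one-time: seed the remote side by extracting the existing tarball
ssh user@server "mkdir -p ~/staging && tar -xjf ~/backup.tbz2 -C ~/staging"

# each run: sync only the changed files, then rebuild the tarball remotely
rsync -az /path/to/files/ user@server:~/staging/
ssh user@server "tar -cjf ~/backup.tbz2 -C ~/staging ."
```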

----------

## nobspangle

why not just use rsync

Rsync streams over ssh by default, and you only transfer changes in the data, so data transfer is kept to a minimum; the stream can also be compressed (the -z option). You can also get it to keep backups of the changed files.

I had a great script I set up for a server, it backed up every night and kept 4 weeks worth of increments.
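A minimal nightly-rotation sketch along those lines (untested; it uses rsync's --link-dest so unchanged files become hard links into the previous snapshot instead of full copies; host and paths are placeholders):

```
# snapshot named by date; unchanged files are hard-linked from "latest"
TODAY=$(date +%F)
rsync -az --delete --link-dest=../latest \
    /path/to/files/ user@server:/backups/$TODAY/
# repoint the "latest" symlink at the new snapshot
ssh user@server "ln -sfn /backups/$TODAY /backups/latest"
```

Pruning snapshots older than four weeks is then just a matter of deleting the old dated directories on the server.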

----------

## rev138

I'd like the data on the backup server to be compressed, to save disk space. Is this possible with rsync?

----------

## Casshan

What you are probably looking for is http://backuppc.sourceforge.net/

This will compress the data when it's stored on the server, and lets you access old versions to "go back in time". I use it to back up 30+ different servers over WAN links and it works fine. You can use rsync and/or tar to do the backups. There is some setup (not much, though) and it's in portage

----------

## numeritos

How do you give login info when making a backup via ssh?

----------

## Hu

 *numeritos wrote:*   

> How do you give login info when making a backup via ssh?

 

You can specify the username on the command line.  If the backup is performed without a human present to supply the password, then ssh must be able to authenticate to the server on its own.  This is typically accomplished using a public key/private key setup.  See the man page for ssh-keygen and the section AUTHENTICATION in the ssh manpage.

Note that if ssh can authenticate without your help, then anyone who gains access to that private key can also authenticate without your help.  For this reason, you should restrict what the key can do.  See the sshd manpage about options which can be placed in the authorized_keys file.  Additionally, consider whether you need other protections, such as placing a password on the key, and having a human load the key into an ssh agent before the backups begin.  See the manpage for ssh-agent for details on how to use an agent to achieve a compromise between leaving the key unprotected and making it unusable in batch jobs.
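For example (illustrative only; the forced command, options, and key data are placeholders), a restricted entry in the server's ~/.ssh/authorized_keys might look like:

```
# key may only run this one command; no shell, no forwarding
command="cat > /backups/database.sql.gz",no-pty,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA... backup-key
```

With that in place, even if the private key leaks, the attacker can only overwrite that one backup file.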

----------

## JeliJami

for a HOWTO: http://en.gentoo-wiki.org/HOWTO_SSH_without_a_password

----------

## JeliJami

just stumbled on this one: http://gentoo-wiki.org/TIP_Fast_Copy

----------

