# [SOLVED] Reliable syncing with FTP site

## Havin_it

Hi,

I'm currently managing a website built with Joomla!, and it now needs a security update applied (the first time I've faced this since uploading the site). I'm wondering what the best way to deploy the update will be, as there are a few complications.

If you know Joomla! you'll know it comprises a very large set of files. Initially uploading the whole app via FTP took almost an hour, partly due to the server's high latency.

To complicate things further, I've made some customisations to the codebase, so to apply an upgrade I need to untar it into a separate folder on my own machine and use KDiff3 to compare each file manually before updating it. I'd then use unison to transfer the changes to my test server (on my LAN) without overwriting any structural changes (new folders, uploaded images, etc.) made from within the app itself.

That approach is a non-starter with the live server, though, because the latency is so bad. Unison doesn't support FTP directly, so I tried using curlftpfs to mount the FTP directory locally and run a unison sync between the two, but that soon led to a kernel panic :(

Now, I don't mind if the process takes a long time, but making my computer fall over is to be avoided! Can anyone venture any thoughts on a way I could achieve this in a reliable and sane fashion?

Thanks in advance.

_Last edited by Havin_it on Fri Apr 15, 2011 12:24 pm; edited 1 time in total_

----------

## avx

After comparing/making changes, how about zipping up the files, uploading the zip, and using some php-unzip script (Google finds a lot of those)?
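For illustration, that workflow might look something like this (a rough sketch with invented paths; tar is used here for the packaging step, but a zip plus a PHP unzip script on the server works the same way):

```shell
# Rough sketch of the package-then-extract approach; all paths are invented.
set -e
mkdir -p demo/site/images
echo 'hello' > demo/site/index.php
echo 'img'   > demo/site/images/logo.png

# Pack the whole tree into a single file: one upload instead of
# thousands of individual FTP transfers.
tar -czf demo/site.tgz -C demo site

# You'd upload demo/site.tgz and extract it on the server
# (via SSH, or a small PHP script); simulated locally here:
mkdir -p demo/server
tar -xzf demo/site.tgz -C demo/server
```

The single big upload also sidesteps the per-file connection overhead that makes high-latency FTP so slow.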

----------

## Havin_it

Isn't that just equivalent to recursively copying the site folder, though? The problem here is that there are likely to be bi-directional changes (new files on the server too).

I can report that I've done the local diff on the upgrade, and that at least is all one-way, no conflicts with files I've edited myself.
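The bi-directional part can at least be detected cheaply: list both trees and compare. A minimal local sketch with invented paths (in practice the remote listing would come from FTP):

```shell
set -e
mkdir -p cmp/local cmp/remote
touch cmp/local/only-local.txt   cmp/local/shared.txt
touch cmp/remote/only-remote.txt cmp/remote/shared.txt

# Sorted relative paths of each tree.
(cd cmp/local  && find . -type f | sort) > local.list
(cd cmp/remote && find . -type f | sort) > remote.list

# Lines unique to local.list: files that exist only locally (to upload).
comm -23 local.list remote.list > push.list
# Lines unique to remote.list: files that exist only remotely (to fetch first).
comm -13 local.list remote.list > pull.list
```

Anything in `pull.list` would need pulling down before a blind re-upload could be safe.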

----------

## avx

Well, that might depend on your server, imho. For example, my client regularly gets disconnected if I try to upload a large number (roughly >1000) of files via FTP, but it works when I use scp, or package the files and extract them on the server.

----------

## Havin_it

Well, I guess I can sync "down" via a cPanel-provided tarball of the site, do the sync locally, then re-upload, also using cPanel.

However, one thing concerns me about that approach: when I download the tarball and look inside, the files are owned by the remote system's usernames, and I'm not sure uploading it back via cPanel would work if those have changed. Is there any way to preserve them during the local sync operation?
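For what it's worth, tar records permissions (and owner names) inside the archive itself, so they survive a local round trip as long as the entries aren't recreated. A small local sketch with invented paths:

```shell
set -e
mkdir -p perms/site
echo 'db password here' > perms/site/configuration.php
chmod 640 perms/site/configuration.php

# Modes and owner names are stored in the archive at creation time.
tar -czf perms/site.tgz -C perms site

# -p restores the stored modes on extract (restoring ownership needs
# root, but the owner names stay recorded inside the tarball).
mkdir -p perms/restored
tar -xzpf perms/site.tgz -C perms/restored
```

If you do have to repack locally, GNU tar's `--owner` and `--group` options can stamp the remote usernames back onto the entries; otherwise your local username gets recorded instead.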

----------

## Havin_it

Well, I found a solution, and it was already dangling under my nose :oops: I use the Firefox extension FireFTP, and poking around in its Tools menu I found it has a recursive directory-sync function :D

It's quite basic: it just looks for files missing on either side, or files of different sizes (it can compare timestamps too, but I haven't tried that yet and suspect it would be a performance hit), but as long as I know what I'm looking for, it does the job.
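The size-only caveat is worth spelling out: an edit that leaves a file the same length is invisible to that kind of comparison. A quick local demonstration with made-up files:

```shell
set -e
mkdir -p sz/a sz/b
echo 'version one' > sz/a/file.txt   # 12 bytes
echo 'version two' > sz/b/file.txt   # also 12 bytes: same size, different content
echo 'longer content here' > sz/a/changed.txt
echo 'short' > sz/b/changed.txt

# Size-only comparison, roughly what a basic directory-sync does.
for f in file.txt changed.txt; do
  if [ "$(wc -c < sz/a/$f)" -ne "$(wc -c < sz/b/$f)" ]; then
    echo "DIFFERS: $f"
  fi
done > differs.list
```

Only `changed.txt` gets flagged; the same-size edit to `file.txt` slips through, which is why timestamp (or checksum) comparison matters when an edit doesn't change a file's length.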

I should really get to know my own software better before trying to reinvent the wheel, I guess!
