# Saving Bandwidth (using squid? ssh? both?)

## BrummBrumm

Hello.

I am using a Vodafone UMTS connection with my notebook, which is currently limited to 5 GB per month. As exceeding this limit would be very expensive, I'm trying to avoid it.

As I have a server connected to the internet, I thought I could use it to save some bandwidth with a compressing SSH tunnel (using the -D option ssh offers).

So far, everything works. My idea now is to run an instance of Squid locally to do some caching and let Squid use the compressed tunnel whenever it is available (maybe by creating that tunnel via public-key authentication in /etc/conf.d/net with postup()).
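For reference, a minimal sketch of what such a postup() hook might look like -- the interface name, key path and server name are placeholders, and this assumes Gentoo's netifrc runs postup() after the interface comes up:

```shell
# /etc/conf.d/net (sketch -- "ppp0", the key path and "myserver" are placeholders)
postup() {
    if [ "${IFACE}" = "ppp0" ]; then
        # -f: go to background after authentication, -N: no remote command,
        # -C: enable compression, -D 1080: open a local SOCKS proxy on port 1080,
        # -i: passwordless key for unattended public-key authentication
        ssh -f -N -C -D 1080 -i /root/.ssh/tunnel_key user@myserver
    fi
    return 0
}
```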

I know that the compression in an SSH tunnel has only a very small impact on already compressed data (such as YouTube videos or JPG files), and that Firefox has some built-in cache, so here are my questions:

Is it a good idea in general? Is it possible?

Would this concept save bandwidth (considering that I am the only one using Squid, that Firefox already caches things, and that the cache could also get filled while I use connections other than the metered one)?

If it is a good idea:

I played around a bit with Squid and its options for building a proxy hierarchy (in which, IMHO, the SOCKS proxy provided by ssh should be the parent), but couldn't get it working.

Using only the SSH tunnel works, and using only Squid works, but after playing around for a while with never_direct, cache_peer, nonhierarchical_direct and prefer_direct, I could not combine them.

If I force Squid to use the SOCKS proxy, I get error messages on every page I try to load with Firefox. If I do not force this (never_direct), Squid just connects directly to the webservers (as reported by netstat), but there are some connections on the SSH tunnel, and Squid reports whether the tunnel is open or closed:

( Detected DEAD Parent: 127.0.0.1 / Detected REVIVED Parent: 127.0.0.1 / Configuring Parent 127.0.0.1/8080/0 )
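One possible explanation, offered as a guess: Squid's cache_peer speaks HTTP (or ICP) to its parent, not SOCKS, so pointing it at the SOCKS port that `ssh -D` opens will fail even though the TCP port is reachable -- which would also fit the DEAD/REVIVED messages. A setup that avoids SOCKS entirely would be to run an HTTP proxy on the server and forward its port with `ssh -L` instead; the hostname and ports below are placeholders:

```shell
# Sketch: forward a server-side HTTP proxy (assumed to listen on port 8080
# on "myserver") to local port 8080 over a compressed tunnel.
ssh -f -N -C -L 8080:localhost:8080 user@myserver
```

The local Squid could then use the forwarded port as its parent:

```
# local squid.conf fragment (hypothetical)
cache_peer 127.0.0.1 parent 8080 0 no-query default
never_direct allow all    # send all requests through the parent
```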

Any suggestions? 

Thanks

----------

## eccerr0r

I find that if I never download with BitTorrent or download large files, I'd have a hard time exceeding 2 GB or so per month, even on my static-IP DSL. Unfortunately, those downloads are one-time-only, so caches do not help.

Due to the so-called "Amdahl's Law", compressing the already small compressible portion of the bandwidth doesn't really help much in the grand scheme of things. Even worse, usually the biggest portion of HTTP traffic is downloading pictures -- JPEGs -- and those don't compress well either.
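To put rough numbers on that (the 20% text share and the 5:1 compression ratio are made-up illustration values, not measurements):

```python
# Illustration with made-up numbers: if only 20% of the traffic is
# compressible text and it compresses 5:1, the overall saving is small.
compressible = 0.20   # fraction of traffic that is text/HTML (assumption)
ratio = 5.0           # compression ratio achieved on that fraction (assumption)

saved = compressible * (1 - 1 / ratio)   # fraction of total bytes avoided
total = 1 - saved                        # fraction still transferred
print(f"overall traffic reduced to {total:.0%} of original")
```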

One possible suggestion is to use Opera and have it compress images for you. I actually run a Squid proxy/cache and I'm not sure what the hit rate really is -- the browser already does some caching (increase the local cache size to as big as you can, of course), and Squid was really meant for multiple people with separate local caches sharing a connection.

Another thing I do for HTTP traffic: when I connect remotely, I sometimes ssh back to my machine and forward the Squid proxy to the remote machine over a compressing link. I don't really do this for compression -- as noted above, it's not that helpful -- but mainly for encryption; still, the compression does save some bandwidth.
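As a sketch of that setup (the hostname is a placeholder, and this assumes Squid listens on its default port 3128 at home):

```shell
# From the remote machine: forward home Squid to a local port over an
# encrypted, compressed link, then point the remote browser at localhost:3128.
ssh -f -N -C -L 3128:localhost:3128 me@home.example
```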

I'd look into what you're downloading. If it's mostly pictures/bittorrent/portage distfiles, there's not really much that can be done -- you're stuck with downloading those, and the small text/.html pages that compress well aren't going to help much overall.

The only exception is if you get charged by the byte. Then perhaps reducing it might be helpful, but overall it's still not going to save much.

(Thank goodness for unlimited/unmetered connections... though technically I should look into something for my other pseudo-metered UMTS/HSDPA connection, even if I don't think I'll ever hit 5 GB/month...)

----------

## frostschutz

Most HTTP connections are already gzip-compressed nowadays, so in such cases (and for images and the like) the compressed SSH tunnel may actually cause more overhead than it saves. So maybe you should run a Squid on the server side that handles the traffic to the HTTP hosts and recompresses it with maximum compression (or leaves that to SSH) on the way from your server to your notebook.

Squid is definitely a good idea locally as well, as is upping the browser disk cache. I'm not sure whether there are any browser plugins available for Firefox (or whether Squid can be configured) to cache content longer. For me, doing stuff like reading web comics regularly kicks small fry like forum icons and smileys out of the cache, which is very annoying...
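Squid can at least be nudged in that direction with refresh_pattern rules -- a hypothetical squid.conf fragment, with values picked purely for illustration:

```
# Keep small static objects (icons, smileys, images) for up to a week
# without revalidating; syntax: refresh_pattern [-i] regex min percent max
refresh_pattern -i \.(gif|png|jpg|ico)$ 1440 50% 10080
# and give the cache enough room (1 GB here, an example size)
cache_dir ufs /var/cache/squid 1024 16 256
```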

Disabling JavaScript by default (and only enabling it for whitelisted sites) could save you a lot of unnecessary AJAX requests. You should also obviously use Adblock with a restrictive ruleset so as not to waste bandwidth on ads. As an extreme measure, you could disable image loading completely (with a whitelist for specific sites). Many sites are way too graphics-intensive nowadays.

An entirely different solution would be to check whether there is a carrier that offers cheap unlimited internet, maybe with reduced bandwidth. I'm not sure about the mobile internet situation where you live, but in Germany such offers are available starting at around $10/month. You could then transfer low-priority stuff over the cheap line and use the expensive one only for high-priority traffic...

----------

## BrummBrumm

Hi.

I already use Adblock Plus and NoScript, and I am not into filesharing at all. I have never reached the 5 GB limit either -- just about 1.5 GB last month, for example.

I have tried the compressing tunnel for a few days now, and it reports (-vv) compression ratios of about 0.1 on outgoing and 0.5 to 0.9 on incoming traffic when just surfing with Firefox. That does not include updating the portage tree or fetching updates.

All in all, I think I'll drop this 'project'. There really seems to be no reasonable benefit... Thanks for the responses!

But just out of curiosity: how do I get this 'proxy chain' working?

----------

