# (mini HOWTO) urlblacklist.com blacklists with Squid proxy

## maiku

As we all know, Dansguardian and Squidguard are both really awesome solutions that use the blacklists from urlblacklist.com to block websites by category. Sonicwalls and other such devices have this feature, so why can't we? We do now!

The problem with Dansguardian and Squidguard, for me, was that they tend to take over and sit in front of Squid's more advanced ACLs. That was a problem because I wanted more control and wanted to use other external ACL scripts at the same time. So my solution was to write yet more external ACL scripts.

The two scripts are merge_blacklist, which downloads the latest blacklist file from urlblacklist.com and loads it into a MySQL database, and url_lookup, which runs as a Squid external ACL helper and looks each URL up in that database.

### Prerequisites

By this point I'm assuming you have MySQL set up and working, and Squid installed and ready to go.

### The scripts

The script to download from urlblacklist.com: http://www.mikealeonetti.com/files/merge_blacklist

The Squid external ACL script: http://www.mikealeonetti.com/files/url_lookup

#### merge_blacklist

Set the MySQL database info as you see fit, or use my defaults and create the table as I do below:

 *Quote:*   

> my $mysql_host = 'localhost';
> 
> my $mysql_user = 'squidaccess';
> 
> my $mysql_pass = 'squidaccess';
> ...

 

Also note: when testing, use the line with "smalltestlist" in the URL (and comment out the bigblacklist line). Using the bigblacklist for testing is against urlblacklist.com etiquette.

 *Quote:*   

> #my $blacklist_url = "http://urlblacklist.com/cgi-bin/commercialdownload.pl?type=download&file=bigblacklist";
> 
> my $blacklist_url = "http://urlblacklist.com/cgi-bin/commercialdownload.pl?type=download&file=smalltestlist";

 

#### url_lookup

Set the MySQL connection info the same way as above:

 *Quote:*   

> my $mysql_host = 'localhost';
> 
> my $mysql_user = 'squidaccess';
> 
> my $mysql_pass = 'squidaccess';
> ...
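To sketch what url_lookup does on Squid's behalf: an external ACL helper reads one request (here just the %URI) per line on stdin and answers OK (match, i.e. block) or ERR (no match) on stdout, one line per request. Here's a minimal Python illustration of that loop — the real script is Perl and queries the MySQL table, so the in-memory `blocked_domains` set and the function names here are mine, not the script's:

```python
import sys
from urllib.parse import urlparse

def host_from_uri(uri):
    """Pull the lowercased hostname out of a request URI."""
    parsed = urlparse(uri if "://" in uri else "http://" + uri)
    return (parsed.hostname or "").lower()

def is_blocked(uri, blocked_domains):
    """True if the URI's host, or any parent domain of it, is blacklisted."""
    parts = host_from_uri(uri).split(".")
    # Check sub.chat.example, then chat.example, ... but never the bare TLD.
    return any(".".join(parts[i:]) in blocked_domains
               for i in range(max(len(parts) - 1, 1)))

def acl_loop(blocked_domains):
    # Squid's side of the conversation: one URI per line in, OK/ERR per line out.
    for line in sys.stdin:
        uri = line.strip()
        sys.stdout.write("OK\n" if uri and is_blocked(uri, blocked_domains) else "ERR\n")
        sys.stdout.flush()
```

Squid keeps the helper process running and feeds it lines as requests arrive, which is why the loop flushes after every answer.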

 

I'd recommend putting url_lookup in /usr/local/bin and keeping merge_blacklist in a home directory, since merge_blacklist downloads the blacklist to the current working directory. It can also be run from cron. chmod both scripts to 755.
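Because merge_blacklist downloads into the CWD, a cron job should cd into its directory first. A sample crontab line (the path and schedule are just an example; adjust to wherever you keep the script):

```
# Refresh the blacklist nightly at 3:15
15 3 * * * cd /home/youruser && ./merge_blacklist
```

Install it with `crontab -e` as the user who owns that directory.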

### Create the MySQL database

 *Quote:*   

> # mysql
> 
> mysql> CREATE DATABASE squidaccess;
> 
> mysql> GRANT ALL PRIVILEGES ON squidaccess.* TO squidaccess@localhost IDENTIFIED BY 'squidaccess';
> ...
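The "..." above elides the table creation; use the exact CREATE TABLE that merge_blacklist expects. Purely to show the shape of the data, a table along these lines (the table and column names here are my illustration, not necessarily what the script uses) maps each domain to a category:

```sql
-- Illustration only: take the real CREATE TABLE from merge_blacklist.
CREATE TABLE blacklist (
    domain   VARCHAR(255) NOT NULL,   -- e.g. 'chat.example'
    category VARCHAR(64)  NOT NULL,   -- e.g. 'chat', 'porn'
    PRIMARY KEY (domain, category)
);
```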

 

### Squid

Now make these changes in /etc/squid/squid.conf: *Quote:*   

> # Note that where it says chat,porn replace with a comma separated list WITHOUT SPACES
> 
> # of all of the categories you wish to block.
> 
> external_acl_type urlblacklist_lookup ttl=5 %URI /usr/local/bin/url_lookup chat,porn
> ...
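The "..." elides the rest of the wiring; an external_acl_type helper only takes effect once an acl names it and an http_access rule uses that acl. The lines that pair with the external_acl_type above look like this (the acl name "blacklisted" is my own choice, and the deny must come before your allow):

```
acl blacklisted external urlblacklist_lookup
http_access deny localnet blacklisted
http_access allow localnet
```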

 

Note: localnet should either be defined to match your local network or be replaced with whatever src acl you want. For example, my localnet is defined as: *Quote:*   

> acl localnet src 192.168.1.0/24

And it must be defined in your squid.conf above the point where you reference it. 

### To finalize

Run the merge_blacklist script. If there are no errors, restart Squid and it should be working just fine.

### The full HOWTO

Updates for this HOWTO will be maintained at: http://www.mikealeonetti.com/wiki/index.php/Using_urlblacklist.com_with_Squid_without_Dansguardian_or_Squidguard

### Feedback

Please give as much feedback as you can; bug testing is VERY appreciated. I would like to make this a viable alternative not only to Dansguardian and Squidguard, but also to other solutions like the fancy Sonicwalls and similar routers.

Thanks!
