  1. #1
    Just Joined! (Join Date: Dec 2008; Posts: 21)

    squidguard webmin


    I am running squid 3 and authenticating Active Directory users transparently. This is working well. I am using sarg via webmin to view my squid reports. The last thing I need to do is to get squidguard working.

    So far, every time I make a rule, whether it's a source rule or a destination rule, all access becomes blocked and my clients receive 'Proxy not accepting requests'.

    If I delete the rule and restart squid, I still cannot access any sites at all. To get around this I remove the line

    redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

    and restart squid; then I can browse again.

    Any suggestions?

  2. #2
    Linux Engineer jledhead (Join Date: Oct 2004; Location: North Carolina; Posts: 1,077)
    Are any errors showing up in the logs, either squid's or squidGuard's?

    Post the contents of your squidGuard config file. In the meantime, you can also run squidGuard by hand to see what it does with a single request; see the sketch below.
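
    This is a rough sketch from memory, so double-check the details: squidGuard reads one request per line on stdin in squid's redirector format ("URL ip/fqdn user method"), and -d keeps it in the foreground with debug output.

    Code:
    # Untested sketch: test squidGuard directly instead of going through squid.
    echo "http://www.google.com 172.19.100.1/- - GET" | \
        /usr/bin/squidGuard -c /etc/squid/squidGuard.conf -d

    An empty output line means the request passed untouched; a rewritten URL means it was blocked and redirected.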

  3. #3
    Just Joined! (Join Date: Dec 2008; Posts: 21)
    In squidGuard, I have made a source group and added an IP range to it.

    I have downloaded a blacklist and it's at /var/lib/squidguard/db/blacklists

    My path is /var/lib/squidguard/db

    I have made an acl called students

    Nothing is being blocked, but from the logs it looks like squidGuard is not reading the files. Below are my squidguard.conf file and the logs.

    squidguard.conf

    #
    # Configuration File for SquidGuard
    #
    # Created with the SquidGuard Configuration Webmin Module
    # Copyright (C) 2001 by Tim Niemueller <tim@niemueller.de>
    # niemueller.de - webmin modules - SquidGuard Configuration
    #
    # File created on 15/Dec/2008 10:34
    #

    dbhome /var/lib/squidguard/db
    logdir /var/log/squid

    source students {
        ip 172.19.100.1-172.19.100.254
    }

    destination bl_porn {
    }

    acl {
        students {
        }

        default {
            pass all
        }
    }


    and the logs from squidGuard.log:


    2008-12-15 11:13:24 [8216] destblock bl_porn missing active content, set inactive
    2008-12-15 11:13:24 [8216] squidGuard 1.2.0 started (1229339604.056)
    2008-12-15 11:13:24 [8216] squidGuard ready for requests (1229339604.056)
    2008-12-15 11:13:24 [8218] destblock bl_porn missing active content, set inactive
    2008-12-15 11:13:24 [8217] destblock bl_porn missing active content, set inactive
    2008-12-15 11:13:24 [8218] squidGuard 1.2.0 started (1229339604.057)
    2008-12-15 11:13:24 [8217] squidGuard 1.2.0 started (1229339604.057)
    2008-12-15 11:13:24 [8218] squidGuard ready for requests (1229339604.05
    2008-12-15 11:13:24 [8217] squidGuard ready for requests (1229339604.05
    2008-12-15 11:13:24 [8219] destblock bl_porn missing active content, set inactive
    2008-12-15 11:13:24 [8219] squidGuard 1.2.0 started (1229339604.060)
    2008-12-15 11:13:24 [8219] squidGuard ready for requests (1229339604.060)
    2008-12-15 11:13:24 [8220] destblock bl_porn missing active content, set inactive
    2008-12-15 11:13:24 [8220] squidGuard 1.2.0 started (1229339604.061)
    2008-12-15 11:13:24 [8220] squidGuard ready for requests (1229339604.062)

  4. #4
    Just Joined! (Join Date: Dec 2008; Posts: 21)
    I have changed my database directory to read /var/lib/squidguard/db/blacklists

    This seems to clean up squidGuard.log a bit; it now reads:


    2008-12-15 12:24:38 [6479] init domainlist /var/lib/squidguard/db/blacklists/porn/domains
    2008-12-15 12:24:39 [6481] init domainlist /var/lib/squidguard/db/blacklists/porn/domains
    2008-12-15 12:24:39 [6480] init domainlist /var/lib/squidguard/db/blacklists/porn/domains
    2008-12-15 12:24:39 [6482] init domainlist /var/lib/squidguard/db/blacklists/porn/domains
    2008-12-15 12:24:39 [6483] init domainlist /var/lib/squidguard/db/blacklists/porn/domains
    2008-12-15 12:27:47 [6480] init urllist /var/lib/squidguard/db/blacklists/porn/urls
    2008-12-15 12:27:48 [6480] squidGuard 1.2.0 started (1229343879.022)
    2008-12-15 12:27:48 [6480] squidGuard ready for requests (1229344068.856)
    2008-12-15 12:27:53 [6481] init urllist /var/lib/squidguard/db/blacklists/porn/urls
    2008-12-15 12:27:54 [6481] squidGuard 1.2.0 started (1229343879.00
    2008-12-15 12:27:54 [6481] squidGuard ready for requests (1229344074.012)
    2008-12-15 12:27:59 [6479] init urllist /var/lib/squidguard/db/blacklists/porn/urls
    2008-12-15 12:28:00 [6479] squidGuard 1.2.0 started (1229343878.987)
    2008-12-15 12:28:00 [6479] squidGuard ready for requests (1229344080.590)
    2008-12-15 12:28:01 [6483] init urllist /var/lib/squidguard/db/blacklists/porn/urls
    2008-12-15 12:28:01 [6483] squidGuard 1.2.0 started (1229343879.123)
    2008-12-15 12:28:01 [6483] squidGuard ready for requests (1229344081.819)
    2008-12-15 12:28:01 [6482] init urllist /var/lib/squidguard/db/blacklists/porn/urls
    2008-12-15 12:28:02 [6482] squidGuard 1.2.0 started (1229343879.042)
    2008-12-15 12:28:02 [6482] squidGuard ready for requests (1229344082.30


    Have a look at my squidguard.conf file, because right now I can still access anything:

    #
    # Configuration File for SquidGuard
    #
    # Created with the SquidGuard Configuration Webmin Module
    # Copyright (C) 2001 by Tim Niemueller <tim@niemueller.de>
    # niemueller.de - webmin modules - SquidGuard Configuration
    #
    # File created on 15/Dec/2008 12:12
    #

    dbhome /var/lib/squidguard/db/blacklists
    logdir /var/log/squid

    source students {
        ip 172.19.100.1-172.19.100.254
    }

    destination bl_porn {
        log porn
        domainlist porn/domains
        urllist porn/urls
    }

    acl {
        students {
        }

        default {
            pass all
        }
    }

    I have added news.bbc.co.uk to porn/urls and porn/domains, but it is still not being blocked.

  5. #5
    Linux Engineer jledhead (Join Date: Oct 2004; Location: North Carolina; Posts: 1,077)
    Quote Originally Posted by insurin:

    default {
        pass all
    }
    }
    "pass all" means pass everything, so nothing is being blocked. It looks like your blacklists all loaded fine; otherwise squidGuard would drop to emergency mode (you would see that in the logs).
    Code:
    acl {
        default {
            pass !gambling !warez all
            redirect 302:http://www.google.com
        }
    }
    The ! means "not", i.e. do not pass. So that line reads: pass anything that is not gambling and not warez, but everything else, and redirect whatever matched one of the "not"s to Google.

    Let me know if that helps.

    And IMO, since these are downloaded blacklists, redownloading them will overwrite any changes you made. So create a separate blacklist called yourblocks and a separate whitelist called whitelist (whatever names you want), follow the same folder and file structure as what's in the other db folders, then add your blocks to the yourblocks db and things you want to always pass to your whitelist, and configure squidGuard accordingly; something like the sketch below.
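
    Off the top of my head it would look something like this. Untested, and the names yourblocks/whitelist are just examples; it assumes your dbhome of /var/lib/squidguard/db/blacklists and the bl_porn destination from your config:

    Code:
    # hypothetical layout under dbhome, same structure as the downloaded lists:
    #   yourblocks/domains, yourblocks/urls
    #   whitelist/domains,  whitelist/urls

    destination yourblocks {
        domainlist yourblocks/domains
        urllist yourblocks/urls
    }

    destination whitelist {
        domainlist whitelist/domains
        urllist whitelist/urls
    }

    acl {
        default {
            # pass the whitelist first, block your own list and the
            # downloaded porn list, pass everything else
            pass whitelist !yourblocks !bl_porn all
            redirect 302:http://www.google.com
        }
    }

    Remember to run the new domains/urls files through squidGuard -C as well, so they get compiled like the others.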

  6. #6
    Just Joined! (Join Date: Dec 2008; Posts: 21)
    Thanks mate, that has worked a treat. I had to run squidGuard -C on the domains/urls files, which converted them all to a .db extension. I also changed ownership back to proxy; the exact commands I ran are below.

    Now the bad sites are being stopped. While I have some expert advice on hand, I would like to pick your brains if I may.

    Why have we done this under default rather than making a separate acl entry, and why would we have multiple acl entries?

    Since I converted the domains/urls files to the .db extension, I can no longer just edit them and bang in a new URL. Does this mean I have to keep editing the plain-text domains/urls files and then converting them again?
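
    This is roughly what I ran, from memory, with my dbhome of /var/lib/squidguard/db/blacklists:

    Code:
    # compile the plain-text lists into .db files; paths are relative to dbhome
    squidGuard -c /etc/squid/squidGuard.conf -C porn/domains
    squidGuard -c /etc/squid/squidGuard.conf -C porn/urls

    # squid runs the redirector as the proxy user here, so hand ownership back
    chown -R proxy:proxy /var/lib/squidguard/db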

  7. #7
    Linux Engineer jledhead (Join Date: Oct 2004; Location: North Carolina; Posts: 1,077)
    Quote Originally Posted by insurin:
    Thanks mate, that has worked a treat. I had to run squidGuard -C on the domains/urls files, which converted them all to a .db extension. I also changed ownership back to proxy.

    Now the bad sites are being stopped. While I have some expert advice on hand, I would like to pick your brains if I may.

    Why have we done this under default rather than making a separate acl entry, and why would we have multiple acl entries?

    Since I converted the domains/urls files to the .db extension, I can no longer just edit them and bang in a new URL. Does this mean I have to keep editing the plain-text domains/urls files and then converting them again?
    You have to load all the blocklists, and then each entry inside the acl block is its own set of pass/block rules. I don't have a server to test against, but I think "default" could be named anything. In your config you defined students as a source, so instead of default you could have a students entry and block or pass the way I showed above, then keep another entry called default that is even more restrictive, to catch anyone not in your defined IP range. Something like the sketch below.
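
    Again untested, just to show the shape; it reuses your students source and bl_porn destination:

    Code:
    acl {
        # rules for the 172.19.100.x range you defined as "students"
        students {
            pass !bl_porn all
            redirect 302:http://www.google.com
        }

        # anyone not matched above gets nothing at all
        default {
            pass none
            redirect 302:http://www.google.com
        }
    }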

    And yes, you need to edit the text file and then convert it again. I know you have had issues using webmin to manage squid, but I used to use the squidGuard webmin module to edit the blocklists (or whitelists) and it would take care of that step for you. Done by hand, the cycle looks roughly like the sketch below.
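
    Untested, and it assumes the paths from your config and that squid runs as the proxy user:

    Code:
    # 1. add the new entry to the plain-text list
    echo "www.example.com" >> /var/lib/squidguard/db/blacklists/porn/domains

    # 2. recompile every list referenced in squidGuard.conf into .db files
    squidGuard -c /etc/squid/squidGuard.conf -C all

    # 3. make sure the redirector can still read them, then reload squid
    chown -R proxy:proxy /var/lib/squidguard/db
    squid -k reconfigure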

    Any other questions, just ask away. Since squidGuard is working I am guessing squid is working too, so maybe mark your other post as solved.

  8. #8
    Just Joined! (Join Date: Dec 2008; Posts: 21)
    Just with regard to the logging, I have created an extra log file called porn, i.e.

    destination bl_porn {
        log porn
        domainlist porn/domains
        urllist porn/urls
    }

    This shows up in /var/log/squid/porn.log.

    If I leave the log option blank, would that mean it logs to the global setting of /var/log/squid/access.log?

    When viewing the report via the sarg module, I am not seeing sites that were blocked or denied; they just show up as sites that have been accessed. For example, if I set the redirect to google.co.uk, the log does not show me the attempt to access the bad site, but rather the legitimate site google.co.uk, so I cannot find out what bad site they tried to access.

  9. #9
    Just Joined! (Join Date: Dec 2008; Posts: 21)
    Going back to this format:

    destination bl_porn {
        log porn
        domainlist porn/domains
        urllist porn/urls
    }

    which creates a file called porn.log

    When checking this log I get:

    2008-12-15 14:54:12 [10942] Request(default/porn/-) http://news.bbc.co.uk/ 172.19.100.37/- dummy.username GET

    So the data is there, but when using sarg for the nice pretty view, it shows 'GET' as the username and 'dummy.username' as the accessed site. My reading of the fields in that line is below, in case someone can spot where sarg's parsing goes wrong.
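
    This is just me decoding the line by eye, so treat it as a guess:

    Code:
    # 2008-12-15 14:54:12       timestamp
    # [10942]                   squidGuard process id
    # Request(default/porn/-)   matched acl / destination group / rewrite rule
    # http://news.bbc.co.uk/     the requested URL
    # 172.19.100.37/-           client IP / fqdn
    # dummy.username            ident (the username)
    # GET                       HTTP method

    So it looks like sarg is reading the squidGuard log with the fields shifted by one.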
