  1. #1
    Just Joined!
    Join Date
    Sep 2009
    Posts
    4

    Performance of squid+squidguard


    I am planning to configure a caching proxy server for a network of 4000 computers.

    According to my research, Squid is the most popular open source proxy server available, and SquidGuard can be used with Squid for URL filtering/content filtering.

    I have never used Squid or SquidGuard before.

    Can anyone tell me whether a proxy server running Squid + SquidGuard can support 4000 users?

    I really appreciate your help.

    Kind regards,

    Krox88

  2. #2
    Just Joined! Tarthen's Avatar
    Join Date
    Sep 2009
    Location
    Australia
    Posts
    40
    Squid is used by ISPs, so it's theoretically possible - but I would not assign one box to do it all. I would use a cluster, spread across ~5 or so boxes/VMs in different locations around the network, so that latencies are kept nice and low. You can set up Squid with "parent" and "sibling" nodes: a client goes to its closest cache, which serves the page if it has it and otherwise asks its peer nodes, so you get maximum efficiency. Squid is very scalable, in my opinion.
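
    As a rough sketch, that hierarchy is just a couple of cache_peer lines in squid.conf (the hostnames here are made up):

        # ask the sibling cache (via ICP on port 3130) whether it already
        # has an object; fetch everything else through the parent
        cache_peer proxy-branch2.example.org sibling 3128 3130 proxy-only
        cache_peer proxy-main.example.org    parent  3128 3130 default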

    What kind of network is it? Multiple sites? One large building? Google :P ? 4,000 is a lot to me - I've only had to manage up to 70 PCs with my Squid.

  3. #3
    Just Joined!
    Join Date
    Sep 2009
    Posts
    4
    Hi Tarthen,

    Thanks for your reply.
    This proxy is for a hospital. The computers are spread across multiple sites,
    and all of them connect to the internet through the IT department at the main hospital.

    The efficiency of the present proxy is good - it uses Squid as well. The problem is poor performance in URL and content filtering; mainly, it can't filter the content within a web page.

    That's why I thought of using SquidGuard to improve the filtering. It should be open source software, because I want to do a bit of modification.

    I chose SquidGuard because it can be combined with blacklists, whitelists, and expression lists, and it provides LDAP user authentication. But I am not sure it can handle a load of 4000 computers, and I couldn't find any information on the internet about this.
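
    For reference, the kind of squidGuard.conf I have in mind looks roughly like this (the paths, list names, and LDAP URL are placeholders, and the ldapusersearch line assumes squidGuard was built with LDAP support):

        dbhome /var/lib/squidguard/db
        logdir /var/log/squidguard

        dest blacklist {
            domainlist     blacklist/domains
            urllist        blacklist/urls
            expressionlist blacklist/expressions
        }

        src staff {
            # look the user up in LDAP instead of a flat userlist
            ldapusersearch ldap://ldap.example.org/ou=people,dc=example,dc=org?uid?sub?(uid=%s)
        }

        acl {
            staff {
                pass !blacklist all
            }
            default {
                pass !blacklist all
                redirect http://proxy.example.org/blocked.html
            }
        }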

    If anyone has done this before, it would be great to get some advice and ideas for solving this problem.

  4. #4
    Just Joined! Tarthen's Avatar
    Join Date
    Sep 2009
    Location
    Australia
    Posts
    40
    Seeing as it's commercial and something that needs to stay up, look into DansGuardian. It costs money for businesses, but it's well worth it, I've heard.

  5. #5
    Just Joined!
    Join Date
    Sep 2009
    Posts
    4
    Yeah, I've thought about that too. It is possible to combine both SquidGuard and DansGuardian - I've come across an article about that.

  6. #6
    Just Joined! Tarthen's Avatar
    Join Date
    Sep 2009
    Location
    Australia
    Posts
    40
    But the more layers of filtering you add, the slower it gets.

    At work I made it accept three ports in: 80, which is normal (I just use the built-in filters and read squid.log every so often); 8080, which has no filtering at all; and 3128, which has a filter that goes straight to a "you're banned" page. This works rather nicely, as you can give the higher-ups and your own department no filtering but the benefits of caching, and truly ban the users who abuse it. I made the redirections with a startup .vbs pushed out by Group Policy - it just checks whether the user is a member of a group called "G_ProxyAllow" or "G_ProxyDeny", and sets IE's proxy settings accordingly.
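
    The port split can be expressed in squid.conf along these lines (the ACL names and ban-page URL are mine, and a real config would have tighter http_access rules than this sketch):

        http_port 80
        http_port 8080
        http_port 3128

        acl to_banned myport 3128
        # requests arriving on 3128 are refused and shown a ban page
        deny_info http://intranet.example.org/banned.html to_banned
        http_access deny to_banned
        # filtering on port 80 vs. none on 8080 would hang off similar
        # "myport" ACLs feeding a redirector or url_rewrite rules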

    If you have Ciscos, you can do some cool stuff with them, like load balancing between boxes (I think) and automatic HTTP redirection for the PCs behind that router. WCCP, I think it's called.
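
    On the Squid side, WCCPv2 support (if compiled in) is configured with the wccp2_* directives - something like this, with a made-up router address:

        http_port 3128 transparent
        wccp2_router 192.0.2.1
        wccp2_forwarding_method 1   # 1 = GRE encapsulation
        wccp2_return_method 1
        wccp2_service standard 0    # standard service group 0 = HTTP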

    And if you're doing a "transparent" (no client config) setup, use the fakeauth redirector. Works a treat - you can log usernames without having to check a password. It just OKs everything that comes into it, so people logging on to local boxes/Linux boxes still show up by name in the logs.
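
    The same always-say-yes idea can be sketched as a trivial Squid basic auth helper in bash (the script path in the comment is made up; Squid's stock fakeauth helper speaks NTLM, I believe):

        #!/bin/bash
        # Accept every "username password" line Squid hands us, so the
        # username ends up in access.log without ever being verified.
        # Hook it up with something like:
        #   auth_param basic program /usr/local/bin/fake_basic_auth.sh
        while read -r user pass; do
            echo "OK"
        done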

  7. #7
    Just Joined!
    Join Date
    Sep 2009
    Posts
    4
    Do you have any idea how to filter the content within a web page? Basically, it means blocking web pages that contain bad words; this option is not available with SquidGuard.
    Thanks for the earlier tips about load balancing.
    I am planning to authenticate users using an LDAP server.
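
    For the LDAP part I'm looking at Squid's bundled squid_ldap_auth helper, roughly like this (the base DN and server name are placeholders):

        auth_param basic program /usr/lib/squid/squid_ldap_auth -b "dc=example,dc=org" -f "uid=%s" ldap.example.org
        auth_param basic children 10
        acl ldap_users proxy_auth REQUIRED
        http_access allow ldap_users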

  8. #8
    Just Joined! Tarthen's Avatar
    Join Date
    Sep 2009
    Location
    Australia
    Posts
    40
    Hmm. I would still go with fakeauth (it still asks for authentication, but just gives it the go-ahead without checking) rather than LDAP, because this would make it faster. No, I don't know about content filtering - however, DansGuardian does it. If you're looking to go completely free, though, you could self-write a bash script that fetches the page with cURL (through the proxy, of course, so that it gets cached and "preloaded"), checks it against a predefined list of words/phrases, and then, if it's okay, sends the all-clear signal; otherwise it redirects to a denied page. Not sure whether this would make it a lagfest or not.
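
    A rough sketch of that script as a Squid redirector (wired in via url_rewrite_program; the word-list path and denied-page URL are made up, and yes, it adds a full extra fetch per request):

        #!/bin/bash
        # Squid hands us one line per request: "URL client ident method ..."
        # Echo back a blank line to leave the URL alone, or a new URL
        # to send the browser somewhere else.
        DENIED="http://proxy.example.org/denied.html"
        WORDLIST="/etc/squid/badwords.txt"
        while read -r url rest; do
            # fetch through the proxy itself, so the object is cached either way
            if curl -s --proxy http://localhost:3128 "$url" | grep -iqf "$WORDLIST"; then
                echo "$DENIED"
            else
                echo ""
            fi
        done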

    But the thing with fakeauth is that it's easy to spoof being another user if you know what you're doing. If you use clients like IE that shouldn't be a problem, as you can lock down the proxy settings; however, if it's a Linux/Unix shop, Firefox lets users log in to the proxy as another user.
