  1. #1
     callmebadger (Just Joined!, Aug 2011, UK NE, 7 posts)

     Question: blocking all but 1 or 2 websites


    Hi all,

    I am hoping to block access to all websites except, say, Moshi Monsters and Bin Weevils, with the option of adding more in the future if needed.

    I am an absolute beginner, so it makes sense to post this here.

    I have looked through other posts, but as I am so green I don't really know what to search for in the first place.

    So far I have come across terminal commands, Gnome Nanny, DansGuardian and various combinations with Squid.
    Can you think of any more? The aim is to make the computer safe for my kids; it will only be used for games, Moshi Monsters and Bin Weevils.

    What I would really like is help narrowing down my options, and then help on how to set it up.
    Thank you.

  2. #2
     callmebadger (Just Joined!, Aug 2011, UK NE, 7 posts)

     Re: please advise, will it work or will it do damage?


    Just taking a look at Squid at the moment.
    I was planning on combining info from two sources. Does anybody know if this would work?
    What I write below is not guaranteed to work, so please don't use it unless someone says it is good, as I am a noob and just grasping at straws. lol

    The first source is from Ubuntu Geek; I was planning to use parts of it together with some of the info on LinuxQuestions. Below is how it appears there, followed by how I was planning to use this information for my kids' PC.
    Please advise.

    .ubuntugeek.com/how-to-set...in-ubuntu.html
    Install Squid

    Install squid and squid-common

    sudo aptitude install squid squid-common

    Edit the squid config file.

    sudo vi /etc/squid/squid.conf

    Set the allowed hosts.

    acl internal_network src 192.168.0.0/24 (Where 192.168.0.0/24 is your IP range.)
    http_access allow internal_network

    Set the correct permissions.

    sudo chown -R proxy:proxy /var/log/squid/
    sudo chown proxy:proxy /etc/squid/squid.conf

    You will need to restart Squid for the changes to take effect.

    sudo /etc/init.d/squid restart

    Now open up your browser and set your proxy to point to your new squid server on port 3128
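
    Before changing any browser settings, a quick way to check that the proxy is answering from the command line (replace 192.168.0.1 with your Squid server's address; this is just a sketch of a test, not part of the original guide) is:

    # fetch just the headers of a page via the new proxy
    curl -x http://192.168.0.1:3128 -I http://www.google.com/

    A successful response should also show up as a line in /var/log/squid/access.log; "connection refused" means Squid is not listening yet.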

    Authentication

    If you wish to use authentication with your proxy you will need to install apache2 utilities

    sudo aptitude install squid squid-common apache2-utils

    To add your first user you will need to specify -c

    sudo htpasswd -c /etc/squid.passwd first_user

    Thereafter you add new users with

    sudo htpasswd /etc/squid.passwd another_user

    Edit the squid config file

    sudo vi /etc/squid/squid.conf

    Set the authentication parameters and the acl.

    auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid.passwd
    auth_param basic children 5
    auth_param basic realm NFYE Squid proxy-caching web server
    auth_param basic credentialsttl 3 hours
    auth_param basic casesensitive off

    acl users proxy_auth REQUIRED

    acl sectionx proxy_auth REQUIRED

    http_access allow users
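
    Once Squid has been restarted, a rough way to check that the proxy login works from the command line (reusing the proxy address from the earlier test; note that if an earlier http_access allow rule such as internal_network matches first, Squid will never ask for credentials) is:

    # request a page through the proxy, supplying proxy credentials created with htpasswd
    curl -x http://192.168.0.1:3128 --proxy-user first_user:PASSWORD -I http://www.google.com/

    # the same request without --proxy-user should be challenged with "407 Proxy Authentication Required"
    curl -x http://192.168.0.1:3128 -I http://www.google.com/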

    So this is what your squid.conf should look like.

    acl all src 0.0.0.0/0.0.0.0
    acl internal_network src 192.168.0.0/24
    acl users proxy_auth REQUIRED
    acl manager proto cache_object
    acl localhost src 127.0.0.1/255.255.255.255
    acl to_localhost dst 127.0.0.0/8
    acl SSL_ports port 443 563 # https, snews
    acl SSL_ports port 873 # rsync
    acl Safe_ports port 80 # http
    acl Safe_ports port 21 # ftp
    acl Safe_ports port 443 563 # https, snews
    acl Safe_ports port 70 # gopher
    acl Safe_ports port 210 # wais
    acl Safe_ports port 1025-65535 # unregistered ports
    acl Safe_ports port 280 # http-mgmt
    acl Safe_ports port 488 # gss-http
    acl Safe_ports port 591 # filemaker
    acl Safe_ports port 777 # multiling http
    acl Safe_ports port 631 # cups
    acl Safe_ports port 873 # rsync
    acl Safe_ports port 901 # SWAT
    acl sectionx proxy_auth REQUIRED
    acl purge method PURGE
    acl CONNECT method CONNECT

    http_access allow manager localhost
    http_access allow users
    http_access allow internal_network
    http_access deny manager
    http_access allow purge localhost
    http_access deny purge
    http_access deny !Safe_ports
    http_access deny CONNECT !SSL_ports
    http_access allow localhost
    http_access deny all
    http_reply_access allow all
    icp_access allow all
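
    Before pointing any browsers at it, it is worth letting Squid check the file for typos and then watching the log while testing; a short sketch of the usual routine (same paths as used elsewhere in this guide):

    # report any syntax errors in squid.conf without starting the proxy
    sudo squid -k parse

    # apply the changes
    sudo /etc/init.d/squid restart

    # watch requests arriving through the proxy
    sudo tail -f /var/log/squid/access.log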

    Redirect all HTTP traffic.

    If you would like to redirect all HTTP traffic through the proxy, without needing to set up a proxy manually in all your applications, you will need to add some iptables rules:

    iptables -t nat -A PREROUTING -i eth1 -p tcp -m tcp --dport 80 -j DNAT --to-destination 192.168.0.1:3128
    iptables -t nat -A PREROUTING -i eth0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 3128

    Where eth1 and eth0 are the LAN and WAN devices respectively, and 192.168.0.1 is the IP address of your LAN device.
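
    One thing the tutorial does not spell out: when traffic is intercepted with iptables like this, Squid itself usually has to be told that the port is receiving redirected traffic, otherwise requests will fail. On the Squid versions Ubuntu shipped around this time that meant a line in /etc/squid/squid.conf along these lines (the exact keyword depends on your Squid version, so treat this as something to verify):

    # Squid 2.x syntax; Squid 3.1 and later use "intercept" instead of "transparent"
    http_port 3128 transparent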

    If you wish to monitor the performance of your proxy, you can look at some log parsers (sarg, calamaris, etc.).





    .linuxquestions.org/questions/linux-networking-3/squid-to-block-all-sites-except-1-or-2-sites-573655/

    Whoever asked for content filtering? You appear to just be advertising your own project, not answering the question at hand.

    winxandlinx, you can do exactly what you want very simply with Squid. There is no need whatsoever to use other tools in conjunction with it. You just want a dstdomain whitelist: allow everything in that whitelist, then follow up with a blanket deny, since http_access rules are checked top to bottom and the first match wins.


    acl whitelist dstdomain .google.com .mycompany.com
    acl all src 0.0.0.0/0.0.0.0

    http_access allow whitelist
    http_access deny all




    Here is what I was planning to do.
    Install Squid

    Install squid and squid-common

    sudo aptitude install squid squid-common

    Edit the squid config file.

    sudo vi /etc/squid/squid.conf

    acl whitelist dstdomain .moshimonsters.com .binweevils.com
    acl all src 0.0.0.0/0.0.0.0

    http_access allow whitelist
    http_access deny all


    Will this work?
    Does it need more adding?

    Thank you.

    PS: will this make changes to the admin account? I still want to be able to use it to go on the net.
    PPS: will this stop me from downloading more games through the Ubuntu Software Centre?
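
    For what it's worth, here is a rough sketch of where whitelist lines like these would sit inside the sample squid.conf quoted above; the placement matters because Squid stops at the first http_access rule that matches (this is only a sketch of the ordering, not a tested config):

    # near the other acl lines
    acl whitelist dstdomain .moshimonsters.com .binweevils.com

    # ... the default manager / Safe_ports / CONNECT rules stay as they are ...

    # the whitelist rule must come before the final catch-all deny
    http_access allow whitelist
    http_access deny all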

  3. #3
     callmebadger (Just Joined!, Aug 2011, UK NE, 7 posts)

    Just tried it :(

    I got this at this point:
    acl whitelist dstdomain .moshimonsters.com .binweevils.com .linuxforums.org ubuntuforums.org .ubuntuforums.org
    No command 'acl' found, did you mean:
    Command 'alc' from package 'amule-utils-gui' (universe)
    Command 'alc' from package 'amule-adunanza-utils-gui' (universe)
    Command 'cal' from package 'bsdmainutils' (main)
    Command 'ace' from package 'libace-perl' (universe)
    Command 'ack' from package 'ack' (universe)
    Command 'acm' from package 'acm' (universe)
    Command 'cl' from package 'cl-launch' (universe)
    Command 'mcl' from package 'mcl' (universe)
    Command 'ac' from package 'acct' (main)
    Command 'al' from package 'mono-devel' (main)
    Command 'gcl' from package 'gcl' (universe)
    Command 'bcl' from package 'bash-completion-lib' (universe)
    Command 'acl2' from package 'acl2' (universe)
    Command 'axl' from package 'afnix' (universe)
    acl: command not found
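
    That "No command 'acl' found" message comes from the shell rather than from Squid, which suggests the acl line was typed at the command prompt. The acl and http_access lines are configuration directives, so they only mean something inside /etc/squid/squid.conf; a rough sketch of the intended sequence, reusing the commands from the tutorial quoted above:

    # open the config file as root and add the acl / http_access lines there
    sudo vi /etc/squid/squid.conf

    # optionally let Squid check the file for syntax errors
    sudo squid -k parse

    # restart Squid so it rereads the configuration
    sudo /etc/init.d/squid restart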

  4. #4
     elija, Penguin of trust (Jul 2004, 3,635 posts)
    Never tried this (don't have kids), but a possibly simpler way would be to use OpenDNS (OpenDNS | DNS-Based Web Security).
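
    For a rough idea of what that involves: OpenDNS filters at the DNS level, so the machine is pointed at OpenDNS's resolvers and the allowed/blocked categories are then managed from their web dashboard. A minimal sketch of the client side (the addresses are OpenDNS's published resolvers; on a desktop Ubuntu install Network Manager may overwrite this file, so the servers may need to be set in the connection settings instead):

    # /etc/resolv.conf
    nameserver 208.67.222.222
    nameserver 208.67.220.220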
    "I used to be with it, then they changed what it was.
    Now what was it isn't it, and what is it is weird and scary to me.
    It'll happen to you too."

    Grandpa Simpson



    The Fifth Continent

  5. #5
     elija, Penguin of trust (Jul 2004, 3,635 posts)
    Quote Originally Posted by callmebadger:
    "Can't understand, with all the advertising and tracking on this website, why it is so hard to post about links and websites."

    It stops at least 99% of spam.
    "I used to be with it, then they changed what it was.
    Now what was it isn't it, and what is it is weird and scary to me.
    It'll happen to you too."

    Grandpa Simpson



    The Fifth Continent

  6. #6
     arespi, Linux Newbie (May 2011, Monterrey, Mexico, 152 posts)

    Another option with squid

    A simple option:

    a) Install Squid
    b) Install Webmin

    Once Webmin is up and running, log in, navigate to the Servers section and select Squid; there you will see an Access Control section. Add a new proxy restriction allowing the sites you want and blocking all the rest.

    This is fine for a few sites, but once your list starts to grow I would also install squidGuard and manage it with Webmin as well. It offers larger access control lists by category, IP range, etc.
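
    For a flavour of what a squidGuard whitelist looks like under the hood (whether managed by hand or through Webmin), here is a rough sketch; the file locations are typical defaults and may differ on your install:

    # /etc/squid/squidGuard.conf
    dbhome /var/lib/squidguard/db
    logdir /var/log/squid

    dest kids-ok {
        # plain text file with one allowed domain per line
        domainlist kids-ok/domains
    }

    acl {
        default {
            # allow the whitelist, send everything else to a block page
            pass kids-ok none
            redirect http://localhost/blocked.html
        }
    }

    # and in /etc/squid/squid.conf, hand URLs to squidGuard for checking
    url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf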

    Good luck

  7. #7
     oz, forum.guy (May 2004, arch linux, 18,733 posts)
    Hello and welcome to the forums, callmebadger!

    Just a quick note here to let you know that all of your 15-post spam has been deleted. If needed, you can find some temporary workarounds for the 15-post requirement for posting URLs here:

    http://www.linuxforums.org/forum/lin...-user-faq.html

    Thanks
    oz
