  1. #1

    How can I prevent offline-browsing web spiders from downloading my website content?


    I am wondering how to prevent offline-browsing web spiders (e.g. Teleport) from downloading my website content, for security purposes.

    I added the lines below to my .htaccess file, but when I tested with Teleport, it could still download my website content.

    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro
    RewriteRule .* - [F,L]
    Is there anything wrong with my .htaccess settings?

    Please check and advise.

    Thanks and regards,


  2. #2


    You cannot prevent offline browsing. Period. It is extremely easy for an offline-browsing program to spoof a web browser's user-agent string and download whatever it pleases. Furthermore, browsers such as mozilla-firefox let you save content to your hard disk after it has been downloaded for viewing anyway. Oh, and how does saving the information offline compromise security? The visitor has already seen it, after all.
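    To see why, here is a minimal Python sketch (not from the thread; the handler and helper names are made up for illustration) that reproduces a user-agent filter like the .htaccess rule above on the server side, then walks straight past it by sending a browser-like User-Agent header:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class UAFilterHandler(BaseHTTPRequestHandler):
    """Rejects requests whose User-Agent starts with 'Teleport Pro',
    mimicking the RewriteCond/RewriteRule pair from the question."""

    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if ua.startswith("Teleport Pro"):
            self.send_response(403)      # same effect as the [F] flag
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"page content")

    def log_message(self, *args):        # keep the demo quiet
        pass

# Port 0 lets the OS pick a free port; run the server in the background.
server = HTTPServer(("127.0.0.1", 0), UAFilterHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def fetch(ua):
    """Request the page with an arbitrary user-agent string; return the HTTP status."""
    req = urllib.request.Request(f"http://127.0.0.1:{port}/",
                                 headers={"User-Agent": ua})
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as e:
        return e.code

print(fetch("Teleport Pro/1.72"))                         # 403: honest UA is blocked
print(fetch("Mozilla/5.0 (X11; Linux x86_64) Firefox"))   # 200: spoofed UA sails through
```

    Any offline browser that lets you edit its user-agent string (Teleport Pro does) gets the same free pass, which is why user-agent filtering is not a security measure.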

  3. #3
    Zelmo, Linux Engineer, joined Jan 2006, Riverton, UT, USA
    If I understand you right, what you might be looking for is a properly set up robots.txt file (see this link for more info about robots).
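    For example, a minimal robots.txt placed in the site's document root that asks Teleport Pro to stay out (use "User-agent: *" to address every crawler); note that compliance is entirely voluntary:

```
# robots.txt -- advisory only; honored solely by well-behaved robots
User-agent: Teleport Pro
Disallow: /
```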
    Stand up and be counted as a Linux user!

  4. #4
    xD That will only prevent robots that conform to the standard! :P
