  1. #1
    Just Joined! | Join Date: Dec 2011 | Posts: 4

    shell script that checks with wget every 2 seconds if a site is online


    hello

    Does anyone know how I can make a shell script that checks every 2 seconds if a site is online with wget?

  2. #2
    Trusted Penguin Irithori | Join Date: May 2009 | Location: Munich | Posts: 3,390
    May I ask what you intend to do?

    For monitoring:
    1) 2 seconds is a very low interval, especially for a shell-based solution.
    You might run into scaling issues.
    Imho, a 5 min interval is suitable here.
    2) Define "online".
    Is a 404 online? Probably not from a user's point of view.
    But for a (dumb) wget monitor: yes, it is online. The webserver answered. (See the quick sketch below.)
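    A quick sketch of what I mean, with a throwaway URL as a placeholder (not anything from this thread):
    Code:
    #!/usr/bin/env bash
    # Print the HTTP status the server returned for a request.
    # A 404 still means the webserver answered (so it is "online" in the
    # dumb-monitor sense), even though the page itself is missing.
    # The URL below is just a placeholder.
    wget --spider --server-response "http://www.example.org/missing-page" 2>&1 \
        | awk '/^  HTTP\// {print "server answered with status:", $2}'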

    For takeover:
    Some HA/load-balancing solutions keep track of the status of their pool members.
    Non-reacting members will then no longer get requests.

  3. #3
    Just Joined! | Join Date: Dec 2011 | Posts: 4
    1) 5 min is also good, it was just an example
    2) a 404 is not online (when the wget returns nothing, it is offline)

  4. #4
    Trusted Penguin Irithori | Join Date: May 2009 | Location: Munich | Posts: 3,390
    Ok, monitoring it is.

    Then I would suggest not reinventing the wheel and instead choosing one of the well-established solutions:
    - Nagios
    - Icinga
    - Xymon
    - Munin

    There are others as well, for about any scale and purpose.
    Personally, I like Xymon.

  5. #5
    Just Joined! | Join Date: Dec 2011 | Posts: 4
    thx for your reply

    I'm not really searching for a whole application but only a small bash/shell script that does a simple check, just for testing.

    I tried it but my script doesn't work:

    # !/usr/local/bin/bash

    if wget -qO- SITE
    then
    echo "down"

    else
    echo "up"
    exit
    fi

  6. #6
    Just Joined! | Join Date: Nov 2011 | Location: New Zealand | Posts: 79
    You could try this,

    Code:
    #!/bin/bash
    
    if wget -q http://linuxforums.org
    then
    echo "up"
    
    else
    echo "down"
    exit
    fi
    Test it with a non-existent domain too, such as drwwwww.org

  7. #7
    Trusted Penguin Irithori | Join Date: May 2009 | Location: Munich | Posts: 3,390
    Nitpicker:
    - Indentation improves readability.
    - This wget would download and save an index.html, then index.html.1, index.html.2, and so on.
    So you would need something like -O- <URL> >/dev/null


    But as I said earlier:
    This is NOT a test of "website is up/down".
    This is just a test of "webserver is running".

    To check the website, one could use URLs that also need e.g. databases or an internal search engine.
    Then check the headers for HTTP return codes AND correct content.
    Only then can one be fairly sure that the site is operational.

    Additional hint:
    It makes sense to run this check from at least a different machine, and best from an outside network.
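    A rough sketch of such a check, with a placeholder URL and expected string (not anything from this thread):
    Code:
    #!/usr/bin/env bash
    # Only report the site as operational when wget succeeds (i.e. no
    # DNS/connect failure and no HTTP error such as 404) AND the body
    # contains a string we expect to be there.
    URL="http://www.example.org/"
    EXPECTED="Example Domain"

    # -q: quiet, -O-: write the page to stdout instead of saving index.html files
    if body=$(wget -qO- "$URL") && printf '%s\n' "$body" | grep -q "$EXPECTED"
    then
        echo "site operational"
    else
        echo "site NOT operational"
    fi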

  8. #8
    Trusted Penguin Irithori | Join Date: May 2009 | Location: Munich | Posts: 3,390
    Additional nitpick:
    bash is not necessarily in /bin.

    This shebang is better:
    Code:
    #!/usr/bin/env bash

  9. #9
    Just Joined! | Join Date: Nov 2011 | Location: New Zealand | Posts: 79
    Quote Originally Posted by Irithori
    Nitpicker:
    - Indentation improves readability.
    - This wget would download and save an index.html, then index.html.1, index.html.2, and so on.
    So you would need something like -O- <URL> >/dev/null
    It's a few lines of code and no one's gonna get too confused, eh. Any particular style of indentation?

    Yeah, redirect it to the garbage bin - I missed that out.


    Quote Originally Posted by Irithori
    But as I said earlier:
    This is NOT a test of "website is up/down".
    This is just a test of "webserver is running".
    The poster seems to be after a simple "does it respond" test, so let's keep it simple...

    You would have to test an entire website's content to make sure it was OK - even static pages could be compromised. If the poster wants simple, give him what he wants and let him explore further at a later date if he needs to.

    Quote Originally Posted by Irithori
    Additional hint:
    It makes sense to run this check from at least a different machine, and best from an outside network.
    How about every machine on the net?

    Seriously, yes, that's a good call.

  10. #10
    Just Joined! | Join Date: Nov 2011 | Location: New Zealand | Posts: 79
    Quote Originally Posted by Irithori
    Additional nitpick:
    bash is not necessarily in /bin.

    This shebang is better:
    Code:
    #!/usr/bin/env bash
    You could do away with the shebang entirely, as it will still work, but that's not best practice I guess.
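    Putting the pieces from this thread together, the periodic check the original question asked for could look roughly like this (URL and interval are just placeholders):
    Code:
    #!/usr/bin/env bash
    # Repeatedly check a site with wget at a fixed interval and print
    # up/down with a timestamp. Stop it with Ctrl-C.
    URL="http://linuxforums.org"
    INTERVAL=2   # seconds; a larger interval is kinder to the server

    while true
    do
        if wget -qO- "$URL" >/dev/null
        then
            echo "$(date '+%F %T') up"
        else
            echo "$(date '+%F %T') down"
        fi
        sleep "$INTERVAL"
    done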
