Page 2 of 2 · Results 11 to 19 of 19
  #11 · Just Joined! · Join Date: Sep 2012 · Posts: 14

    squish.pl file


    here is the squish.pl file and rrdsquish file.
    Attached Files

  #12 · Trusted Penguin · Join Date: May 2011 · Posts: 4,353
    Quote Originally Posted by sureshk:
    Code:
    Can't locate rrdsquish.pm in @INC (@INC contains: /usr/local/lib64/perl5 /usr/local/share/perl5 /usr/lib64/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib64/perl5 /usr/share/perl5 .) at /usr/local/squish/squish.pl line 25.
    BEGIN failed--compilation aborted at /usr/local/squish/squish.pl line 25.
    okay, that error is saying that squish.pl is looking for rrdsquish.pm, but can't find it in any of the @INC directories. note that @INC includes "." (the current directory), so in your cron script, chdir into the dir where both those files are located before running squish.pl.

    here's an example:

    Code:
    #!/bin/bash
    
    # define path to the squish perl script here
    squishScript=/path/to/squish.pl
    
    # run from the script's directory so it can find rrdsquish.pm
    cd "$(dirname "$squishScript")" || exit 1
    squishProg=$(basename "$squishScript")
    
    # squish each access.log found in the usual squid log locations
    for log in $(find /var/squid/logs /var/log/squid /usr/local/squid/var/logs -name "access.log" 2>/dev/null); do
      printf '\nLOG: %s\n' "$log"
      ./"$squishProg" < "$log"
    done
    make these changes to a copy of your cron script and run that copy from the command line first, to make sure it works. then try it as a cron job.

  #13 · Just Joined! · Join Date: Sep 2012 · Posts: 14
    With the above changes it now runs fine. Thank you very much.
    The output of the program log is:
    Code:
    LOG: /var/log/squid/access.log
    ./squish.pl: 1024 lines squished
    ./squish.pl: 539 excluded
    But the usage details are not being updated in the browser for users who should get the exceeded message (when run through the cron job).
    The user database is being updated at /var/lib/squish/userdb.stor,
    but no exceeded message is being displayed to the users who should get one.
    If I stop the cron job and run it manually,
    then the browser is updated and the exceeded message appears.
    Is there anything I have to change in squish.cgi, or is there some other problem?
    Attached Files

  #14 · Trusted Penguin · Join Date: May 2011 · Posts: 4,353
    Quote Originally Posted by sureshk:
    But the usage details are not being updated in the browser for users who should get the exceeded message (when run through the cron job).
    The user database is being updated at /var/lib/squish/userdb.stor,
    but no exceeded message is being displayed to the users who should get one.
    If I stop the cron job and run it manually,
    then the browser is updated and the exceeded message appears.
    Is there anything I have to change in squish.cgi, or is there some other problem?
    i am not sure i understand how the CGI page interacts with your cron job. are you saying that if you stop the cron job and run that same script from the command line, something happens on the CGI page? if not, what command are you running that does make the CGI page work?

    Do you have to be logged in as a certain user on the webpage? Is the squish.pl program supposed to modify something that it does when run from the terminal, but not when run from cron?

    I don't really know what squish is in the first place...

  #15 · Just Joined! · Join Date: Sep 2012 · Posts: 14
    The CGI page opens, but it's not updating the users' usage data when I run the simple program you gave via the cron job.
    If the same program is run from the command line with root privileges, it updates the users' usage data.
    I run the same program both times (cron job and manually) with root privileges.
    There is no authentication for the webpage.
    Okay, I'll check whether I have to modify the squish.pl program.
    Is there any difference between running the program from a cron job and running it manually from the command line?
    Thank you.

  #16 · Just Joined! · Join Date: Sep 2012 · Posts: 14
    I edited the squish.pl file where the cron-job time is configured and gave it the squishing.sh path; since then it's working fine.
    Everything is updating properly.

    I'll check the running status and will update you.
    Thank you.
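For reference, a typical cron entry for a wrapper script like this might look like the following (the path, schedule, and log file here are assumptions for illustration, not taken from this thread):

```shell
# hypothetical /etc/crontab-style entry: run the squish wrapper every 30
# minutes as root and append its output to a log file
# */30 * * * * root /usr/local/squish/squishing.sh >> /var/log/squishing.log 2>&1
```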

  #17 · Trusted Penguin · Join Date: May 2011 · Posts: 4,353
    Quote Originally Posted by sureshk:
    Is there any difference between running the program from a cron job and running it manually from the command line?
    I was going to say that the PATH environment variable will be different when run as a cron job, but you've already gotten around that hurdle...
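Cron jobs run with a much smaller environment than an interactive shell, which is a common reason a script works at the terminal but not from cron. A quick sketch of the difference (the PATH value below is a typical cron default, not necessarily what this particular system's cron sets):

```shell
# Interactive shells inherit a full PATH; cron typically sets a minimal one.
echo "interactive PATH: $PATH"

# Simulate a cron-like stripped-down environment with env -i:
env -i PATH=/usr/bin:/bin /bin/sh -c 'echo "cron-like PATH: $PATH"'
```

If a command works interactively but not under the minimal PATH, using absolute paths in the cron script is the usual fix.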

  #18 · Just Joined! · Join Date: Sep 2012 · Posts: 14
    Squid now runs for an hour or two, then dies like it did before.
    This is the log from /var/log/squid/cache.log:
    Code:
    2012/10/03 16:49:52|   16252928 Entries Validated so far.
    2012/10/03 16:49:52|   16515072 Entries Validated so far.
    2012/10/03 16:49:52|   16777216 Entries Validated so far.
    2012/10/03 16:49:52|   17039360 Entries Validated so far.
    2012/10/03 16:49:52|   Completed Validation Procedure
    2012/10/03 16:49:52|   Validated 18121787 Entries
    2012/10/03 16:49:52|   store_swap_size = 200165956
    2012/10/03 16:49:56| storeLateRelease: released 0 objects
    2012/10/03 16:49:56| client_side_request.cc(1040) clientRedirectDone: redirecting body_pipe 0x7fa9d722a298*1 from request 0x7fa9d7229c00 to 0x7fa9d72a05c0
    2012/10/03 16:50:01| WARNING: swapfile header inconsistent with available data
    FATAL: Received Segment Violation...dying.
    2012/10/03 16:50:01| storeDirWriteCleanLogs: Starting...
    2012/10/03 16:50:01| WARNING: Closing open FD  121
    -------------------------------------------------------------------------------------------
    2012/10/03 16:50:33| Swap maxSize 512000000 + 2097152 KB, estimated 39545934 objects
    2012/10/03 16:50:33| Target number of buckets: 1977296
    2012/10/03 16:50:33| Using 2097152 Store buckets
    2012/10/03 16:50:33| Max Mem  size: 2097152 KB
    2012/10/03 16:50:33| Max Swap size: 512000000 KB
    2012/10/03 16:50:33| Version 1 of swap file without LFS support detected...
    2012/10/03 16:50:33| Rebuilding storage in /var/log/squid/cache (CLEAN)
    2012/10/03 16:50:33| Version 1 of swap file without LFS support detected...
    2012/10/03 16:50:33| Rebuilding storage in /var/log/squid/cache1 (CLEAN)
    2012/10/03 16:50:33| Using Least Load store dir selection
    2012/10/03 16:50:33| Set Current Directory to /var/spool/squid
    2012/10/03 16:50:33| Loaded Icons.
    2012/10/03 16:50:33| Accepting  HTTP connections at [::]:3128, FD 121.
    2012/10/03 16:50:33| HTCP Disabled.
    2012/10/03 16:50:33| Squid modules loaded: 0
    2012/10/03 16:50:33| Adaptation support is off.
    2012/10/03 16:50:33| Ready to serve requests.
    2012/10/03 16:50:33| Store rebuilding is 0.09% complete
    2012/10/03 16:50:54| Reconfiguring Squid Cache (version 3.1.4)...
    2012/10/03 16:50:54| FD 121 Closing HTTP connection
    2012/10/03 16:50:54| assertion failed: disk.cc:377: "fd >= 0"
    --------------------------------------------------------------------
    Two types of errors appear in cache.log, separated above by a dashed line.

    I googled these errors but didn't find any solution.

  #19 · Trusted Penguin · Join Date: May 2011 · Posts: 4,353
    perhaps you can try building the latest squid from source?

    and/or updating your kernel to the latest stable release?
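Before deciding on either upgrade, it may help to record the versions currently in play (the log above shows Squid 3.1.4; whether squid is on the PATH here is an assumption):

```shell
# Record the running kernel release and, if squid is on the PATH, its version
uname -r
command -v squid >/dev/null 2>&1 && squid -v | head -n1 || echo "squid not found on PATH"
```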
