Page 2 of 2
Results 11 to 18 of 18
  1. #11
    Linux Guru kkubasik's Avatar
    Join Date
    Mar 2004
    Location
    Lat: 39:03:51N Lon: 77:14:37W
    Posts
    2,396

How does this pertain to the Firefox pipelining feature? I think it works by opening multiple connections to the host and thus utilizing the client's bandwidth to the fullest. (This is a guess; I could be really wrong about that.)
    Avoid the Gates of Hell. Use Linux
    A Penny for your Thoughts

    Formerly Known as qub333

  2. #12
    Linux Guru
    Join Date
    Apr 2003
    Location
    London, UK
    Posts
    3,284
    Quote Originally Posted by qub333
How does this pertain to the Firefox pipelining feature? I think it works by opening multiple connections to the host and thus utilizing the client's bandwidth to the fullest. (This is a guess; I could be really wrong about that.)
That wouldn't be a problem at all. One request would be for a PHP page and the rest would be for images, which the server can handle no problem.

What causes the problem is hundreds of requests for PHP pages using database connections in the space of a few seconds. Only scripts do this sort of thing.

    Jason

  3. #13
    Linux Guru kkubasik's Avatar
    Join Date
    Mar 2004
    Location
    Lat: 39:03:51N Lon: 77:14:37W
    Posts
    2,396
Thought so, just wanted to make sure. I've dabbled with those types of scripts before; I used one to download the Gentoo Handbook, but I've found them to be a pain to make and operate. And it wasn't until I was baffled by incredibly slow response times on some of my servers that I understood what a pain they are. I guess if you really wanted the whole thing, you could be courteous and set the script for one page at a time with a 5-second cooldown in between, or something. It would take a long time, but at least that way others can still use the forum.
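A sketch of the courteous approach described above (one page at a time, with a cooldown between requests) might look like this in Python. The URLs and the `fetch_politely` helper are hypothetical; a real script should also honour robots.txt.

```python
import time
import urllib.request

def fetch_politely(urls, delay=5.0, fetch=None):
    """Fetch pages one at a time, sleeping `delay` seconds between
    requests so other visitors can still use the site in between."""
    if fetch is None:
        fetch = lambda u: urllib.request.urlopen(u).read()
    pages = []
    for i, url in enumerate(urls):
        if i:  # no cooldown needed before the very first request
            time.sleep(delay)
        pages.append(fetch(url))
    return pages

# Hypothetical usage -- example.org stands in for the real handbook URLs:
# fetch_politely(["http://example.org/handbook/ch%d.html" % n
#                 for n in range(1, 4)])
```

GNU wget offers roughly the same behaviour built in via its `--wait=5` (and `--random-wait`) options alongside `--recursive`.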
    Avoid the Gates of Hell. Use Linux
    A Penny for your Thoughts

    Formerly Known as qub333

  4. #14
    Linux Enthusiast scientica's Avatar
    Join Date
    Sep 2003
    Location
    South- or "Mid-" Sweden
    Posts
    742
Jason, how about adding a line like:
"sleep(1);" or "usleep(250);" before/after some database ops (or better, at the top of all pages, so it would take n seconds before the client receives any more links to traverse)? Not much of a slowdown for _users_ to notice, but I believe it'll give the server (much) more time.
-- It would slow down scripts the most, as they'd have to wait at least the sleep time specified before the next DB query or the next page of links to assault.

(Tell me if you don't understand what I mean and I'll try to give an example.)
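The idea of a fixed per-request delay can be sketched as a rough Python analogue of putting sleep(1); at the top of every PHP page; `render_page` and the interval are hypothetical stand-ins, not the forum's actual code.

```python
import time

def throttled(min_interval):
    """Decorator that enforces at least `min_interval` seconds between
    calls: an interactive user barely notices it, but a script making
    back-to-back requests is forced to slow down."""
    def deco(fn):
        last = [float("-inf")]  # time of the previous call
        def wrapper(*args, **kwargs):
            wait = min_interval - (time.monotonic() - last[0])
            if wait > 0:
                time.sleep(wait)
            last[0] = time.monotonic()
            return fn(*args, **kwargs)
        return wrapper
    return deco

@throttled(1.0)
def render_page(name):
    # stand-in for the real page handler
    return "<html>%s</html>" % name
```

Note that PHP's usleep() takes microseconds, so usleep(250) pauses for only a quarter of a millisecond; a quarter-second pause would be usleep(250000).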
Regards, scientica (registered user #335819 - http://counter.li.org )
    --
    A master is nothing more than a student who knows something of which he can teach to other students.

  5. #15
    Linux Guru sdousley's Avatar
    Join Date
    Feb 2004
    Posts
    1,790
Jason, I don't know if it's connected with this problem at all, but I had a little problem getting onto the site this morning at about 10:10am UK time, on Sunday 12 Sept. Then when I finally got on, it said there were no new posts since my last visit, when I know there were more posts.
    "I am not an alcoholic, alcoholics go to meetings"
    Registered Linux user = #372327

  6. #16
    Linux Engineer
    Join Date
    Jul 2003
    Location
    Stockholm, Sweden
    Posts
    1,296
Does this ban apply to a recursive "wget"? Not that I was planning on doing it, but I'm on 56k and can't see it being too much of a problem for the server if I did decide to do it.

  7. #17
    Linux Guru
    Join Date
    Apr 2003
    Location
    London, UK
    Posts
    3,284
    Quote Originally Posted by variant
Does this ban apply to a recursive "wget"? Not that I was planning on doing it, but I'm on 56k and can't see it being too much of a problem for the server if I did decide to do it.
Any automated hit bots, file getters, page grabbers, or custom scripts are all a problem, even if they "take it slowly". I'll explain why.

    Site speed
While it is very easy to say it shouldn't be a problem if one person has a script running, from a performance standpoint it does become a real issue when 100 people do it.

    Loss of revenue
We have both paid advertisers and banner exchanges for the 468x60 banners that you see in the top right of your screen (new design). Every time a page is requested, we deliver one banner impression. Our advertisers currently pay per 1,000 impressions.

A big problem arises when scripts request multiple pages, because the number of impressions goes up while the number of clicks in comparison stays low or doesn't move, which means the advertiser gets bad value for money.

This can lead to two things happening: 1) the advertiser doesn't renew with us because we didn't deliver value for money, and 2) we have to lower our prices because we deliver fewer clicks per 1,000 impressions.

Yes, I've put in a good set of measures to stop us billing for robotic impressions and to protect our advertisers and supporters, but scripts can still occasionally create issues.
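The dilution described above can be sketched numerically; the figures here are made up for illustration, not the site's actual rates or traffic.

```python
# Hypothetical numbers showing why robotic page loads hurt advertisers:
# scripts inflate impressions but never click, so the advertiser's
# effective cost per click rises.
cpm = 2.00                  # pays $2 per 1,000 impressions (assumed)
human_impressions = 100_000
clicks = 500                # only humans click banners

def cost_per_click(impressions, clicks, cpm):
    return (impressions / 1000) * cpm / clicks

before = cost_per_click(human_impressions, clicks, cpm)
after = cost_per_click(human_impressions + 50_000, clicks, cpm)
print("cost per click: $%.2f -> $%.2f" % (before, after))  # $0.40 -> $0.60
```

The same ad spend buys fewer clicks once robotic impressions are mixed in, which is exactly the "bad value for money" problem described above.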


    Aggravation and a waste of my time
I have to trawl through HTTP logs to find people running these scripts, then I have to write letters to their ISPs and sort out blocking them from accessing my server again. This has taken hours of my time already, and this is time I could have better spent developing new features for the site and promoting it. Once again, it hurts the site in the long run.



So far I have banned three people on the basis of running automated scripts, and will continue to ban, block IPs, and email ISPs until the message sinks in.

To summarise: any type of automated or recursive script/program/whatever you want to call it is not permitted.

    Jason

  8. #18
    Linux Guru
    Join Date
    Apr 2003
    Location
    London, UK
    Posts
    3,284
    Quote Originally Posted by scientica
Jason, how about adding a line like:
"sleep(1);" or "usleep(250);" before/after some database ops (or better, at the top of all pages, so it would take n seconds before the client receives any more links to traverse)? Not much of a slowdown for _users_ to notice, but I believe it'll give the server (much) more time.
-- It would slow down scripts the most, as they'd have to wait at least the sleep time specified before the next DB query or the next page of links to assault.
While I see what you are saying, I think the proper way to fix the problem and improve performance generally is to optimise SQL queries and tune the database and site code, which I am almost constantly doing.

