  1. #1
    Linux Newbie
    Join Date
    Jul 2004
    Posts
    143

    limit the resources


    Hi everybody,

    How can I limit the memory for a particular service.
    For ex: I want to restrict the memory for apache service to 50 MB.

    please help me.
    thanks & regards,
    ygoendra

  2. #2
    Just Joined!
    Join Date
    Jun 2005
    Location
    Canada, Halifax
    Posts
    86
    Resource limits are imposed by the kernel and are configured via "limits.conf" and PAM. See http://www.debian.org/doc/manuals/se.../index.en.html — this is a Debian GNU/Linux-specific document, but the variations from distro to distro are manageable. Give it a try, then fork-bomb yourself to see it work!

    Pay close attention to section 4.10. Basically it's a two-step process: 1. configure PAM to use limits, and 2. configure the limits themselves.
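    To see what such a limit looks like in practice, here's a minimal sketch using the shell's ulimit built-in, which talks to the same kernel rlimits that pam_limits enforces from limits.conf. The 51200 KB value matches the 50 MB the original poster asked about; the "apache" limits.conf line in the comment is an illustrative example, not a tested config:

    ```shell
    # Impose a 50 MB (51200 KB) soft cap on virtual memory inside a subshell,
    # then print the effective limit. A limits.conf line such as
    #   apache  hard  as  51200
    # would impose the same kind of cap at login via pam_limits.
    ( ulimit -S -v 51200; ulimit -S -v )   # prints: 51200
    ```

    Any process started from that subshell inherits the cap, so an Apache instance launched under such a limit could not grow past roughly 50 MB of address space.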

  3. #3
    Just Joined! Silent's Avatar
    Join Date
    Dec 2005
    Location
    Pakistan
    Posts
    2

    RE: How to restrict Apache

    Hello,


    Usually Perl and/or CGI scripts can go wild and eat up ALL system resources. This dangerous behaviour can be controlled: Apache comes with three directives to place limits on the amount of CPU, memory, and processes the server can use.
    i) RLimitCPU - restricts CPU usage (in seconds per process).
    Example: RLimitCPU 10 20

    ii) RLimitNPROC - restricts the number of processes run simultaneously.
    Example: RLimitNPROC 3 5

    iii) RLimitMEM - restricts the memory (in bytes) used by processes run on the server.
    Example: RLimitMEM 200000 200000

    You can use the above three directives in a vhost or in the main server configuration. The first value in each example is the soft limit and the second is the hard limit, which cannot be exceeded by any process. Here is a more practical and realistic example for a mass-hosting server (open your httpd.conf file and add the following three directives):

    A) Set a maximum of 100 CPU seconds per process, so a Perl script may run for no more than 100 CPU seconds; scripts exceeding that are stopped automatically by the system/Apache.
    RLimitCPU 100 100

    B) Set a maximum of 25 processes at any one time
    RLimitNPROC 25 25

    C) Allow roughly 10 MB of memory to be used per process
    RLimitMEM 10000000 10000000

    Once these are added to the httpd.conf file, restart the Apache process.
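    Putting the three directives together, a per-vhost configuration might look like the sketch below. The domain name and paths are placeholders, and the values are the illustrative ones from above, not tuned recommendations:

    ```
    # Hypothetical vhost -- ServerName and DocumentRoot are placeholders.
    <VirtualHost *:80>
        ServerName   example.com
        DocumentRoot /var/www/example

        # soft and hard limits (soft first, hard second)
        RLimitCPU   100 100              # max 100 CPU seconds per process
        RLimitNPROC 25 25                # max 25 processes at any one time
        RLimitMEM   10000000 10000000    # ~10 MB of memory per process
    </VirtualHost>
    ```

    Placing them inside a vhost limits only that site's CGI/SSI children; placing them in the main server config applies them server-wide.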

    Regards
    Silent

  4. #4
    Just Joined!
    Join Date
    Jun 2005
    Location
    Canada, Halifax
    Posts
    86
    Absolutely great advice, Silent. You're right, the default /etc/httpd.conf settings are more generous than what is required for a small web server. In addition to properly configuring Apache, I'd still suggest that the kernel be configured to limit the access of www-data (the default Apache user account) to system/user resources, so that a vulnerability exploited in Apache would be confined to a small pocket of system/user space.

    mummaneni used Apache as an example, and while we're on the subject of web-serving resource limits, one would also be wise to implement bandwidth-throttling controls via the firewall configuration to limit the effects of a SYN flood attack and the like. Network bandwidth is as quantifiable a resource as processor time and memory space. The gurus can do this with iptables directly; personally I use the Shorewall front-end for these matters.
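    For those going the direct route, a rough sketch of the kind of SYN throttling I mean, using iptables' limit match (rules must run as root; the rate and burst values are illustrative placeholders, not tuned recommendations):

    ```
    # Accept new TCP connections (SYN packets) at no more than 10/second,
    # allowing a burst of 20; drop anything beyond that rate.
    iptables -A INPUT -p tcp --syn -m limit --limit 10/second --limit-burst 20 -j ACCEPT
    iptables -A INPUT -p tcp --syn -j DROP
    ```

    Shorewall ends up generating rules of this general shape for you; writing them by hand just makes the trade-off (legitimate bursts vs. flood protection) explicit.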
