  1. #1
    Just Joined!
    Join Date
    Mar 2005
    Posts
    4

    Major Memory Problem


    I run into major memory leakage when I run the "find" and "tar" commands on Linux Fedora Core 2. I have 2 GB of RAM on the server, and there is normally 1.4+ GB free. When I run "find" or "tar", the free memory is usually wiped out within 2 minutes, and it is not released when the processes complete. It usually takes over 12 hours for the memory to slowly free itself.

    Does anyone know what may be causing this, and if so, what can I do to fix it?

  2. #2
    Linux Guru techieMoe's Avatar
    Join Date
    Aug 2004
    Location
    Texas
    Posts
    9,496
    I'm not sure exactly what your question is, but what OS were you running where you usually had 1.4 GB free? Linux handles memory differently than MS Windows: it uses otherwise-idle RAM as disk cache so data can be accessed more quickly, so it doesn't leave large chunks "free" the way MS Windows does.
    Registered Linux user #270181
    TechieMoe's Tech Rants
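    The caching behavior described above can be checked directly. A minimal sketch, assuming a Linux box with /proc/meminfo (field names as on 2.6-era kernels, which lack the newer MemAvailable line):

```shell
#!/bin/sh
# Read the raw counters that `top` and `free` themselves report (kB).
total=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
freemem=$(awk '/^MemFree:/ {print $2}' /proc/meminfo)
buffers=$(awk '/^Buffers:/ {print $2}' /proc/meminfo)
cached=$(awk '/^Cached:/ {print $2}' /proc/meminfo)
# Memory genuinely held by processes, excluding reclaimable cache/buffers:
used=$((total - freemem - buffers - cached))
echo "used by processes: ${used} kB of ${total} kB"
```

    If the "used by processes" figure is modest even when MemFree is tiny, the RAM is sitting in reclaimable cache, not leaking.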

  3. #3
    Just Joined!
    Join Date
    Mar 2005
    Posts
    4
    Linux 2.6.5-1.358smp
    psa v7.1.6_build71041118.17 os_FedoraCore 2

    The problem is that when these utilities run, they chew through the free memory and never seem to release it. They reduce both free memory and buffer memory.

  5. #4
    Linux Guru techieMoe's Avatar
    Join Date
    Aug 2004
    Location
    Texas
    Posts
    9,496
    Hmm. Well, this is beyond my realm of expertise, so I'll hand off to someone with more server experience. Good luck.
    Registered Linux user #270181
    TechieMoe's Tech Rants

  6. #5
    Just Joined!
    Join Date
    Mar 2005
    Location
    Seattle, WA
    Posts
    7

    Most likely disk cache

    I've been told that this isn't really a problem. The files being read are probably just being held in the disk cache, which looks like memory in use but will simply be overwritten when the space is needed. Look at the numbers in 'top' to get a better idea of what is really going on.
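    That disk-cache effect is easy to observe. A sketch, assuming a Linux system (here reading a small tree under /etc purely for illustration):

```shell
#!/bin/sh
# Page cache size before and after reading some file data (kB).
before=$(awk '/^Cached:/ {print $2}' /proc/meminfo)
cat /etc/* > /dev/null 2>&1   # read file contents into the page cache
after=$(awk '/^Cached:/ {print $2}' /proc/meminfo)
echo "page cache: ${before} kB before, ${after} kB after"
```

    The Cached figure grows as data is read (or stays flat if it was already cached), and the kernel shrinks it again whenever applications need the RAM.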

  7. #6
    Just Joined!
    Join Date
    Mar 2005
    Posts
    4
    top is how I determined that this is a problem: top is not showing any processes running, but the memory used is close to 100% of total.

  8. #7
    Linux Engineer
    Join Date
    Nov 2004
    Location
    home
    Posts
    796
    Running out of free memory is not a problem. Free memory is wasted memory, so the kernel puts it to use. It's when you get a lot of swapping that you have memory issues. That said, if 1 GB gets taken up by untarring a 2 MB file, then you probably do have a memory problem, and it's probably with tar.
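    The swapping test above can be sketched with the kernel's cumulative swap counters (assumes a Linux 2.6+ kernel, which exposes them in /proc/vmstat):

```shell
#!/bin/sh
# Cumulative pages swapped in/out since boot; if these barely move
# while memory looks "full", the pressure is only page cache.
swpin=$(awk '$1 == "pswpin" {print $2}' /proc/vmstat)
swpout=$(awk '$1 == "pswpout" {print $2}' /proc/vmstat)
echo "pages swapped in: ${swpin:-0}, out: ${swpout:-0}"
```

    Run it before and after a `find` or `tar` job: if the counters are essentially unchanged, no real memory pressure occurred.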

  9. #8
    Linux User
    Join Date
    Jul 2004
    Location
    USA, Michigan, Detroit
    Posts
    329
    What you need to look at is your swap usage: if you are not swapping much out to disk, then you do not have a memory issue. Use the free command.
    Long live the revolution!
    Have a nice day.
    If you want real change vote Libertarian!
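    The `free` check suggested above boils down to comparing swap used against swap total. A minimal sketch reading the same counters `free` reports, straight from /proc/meminfo (assumes Linux; values in kB):

```shell
#!/bin/sh
# Swap accounting: total minus free gives the amount actually in use.
swap_total=$(awk '/^SwapTotal:/ {print $2}' /proc/meminfo)
swap_free=$(awk '/^SwapFree:/ {print $2}' /proc/meminfo)
swap_used=$((swap_total - swap_free))
echo "swap in use: ${swap_used} kB of ${swap_total} kB"
```

    A near-zero "swap in use" figure alongside near-zero free RAM is the normal, healthy state this thread describes.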
