I have a damaged .tar.gz archive, so I decided to follow the steps detailed on this very handy website for repairing corrupt gzip files.

I ran the program gzrecover on my corrupt file, which produced a file Backup.tar.recovered. Using a patched version of tar (see the instructions on the website if interested), I then tried to extract whatever data could be salvaged from Backup.tar.recovered with the command:

 $ /usr/local/bin/tar --recover --ignore-zeros -xvf Backup.tar.recovered
(NB: I had to give the full path to tar because the recovery patch only applies to an obsolete version, not the one installed on my system.)
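For context on the `--ignore-zeros` flag: it tells tar to keep reading past the zero-filled blocks that normally mark the end of an archive, which matters here because gzrecover's output is effectively several archive fragments glued together. A minimal demonstration of the effect with an ordinary (unpatched) GNU tar, using throwaway file names of my own invention:

```shell
# Build two tiny archives and concatenate them: the second one sits after
# the first's end-of-archive zero blocks, mimicking recovered fragments.
tmp=$(mktemp -d)
cd "$tmp"
echo one > a.txt
echo two > b.txt
tar -cf part1.tar a.txt
tar -cf part2.tar b.txt
cat part1.tar part2.tar > glued.tar
mkdir out
cd out
# A plain -x would stop at part1's terminating zero blocks and never see
# b.txt; --ignore-zeros reads through them and extracts both files.
tar --ignore-zeros -xf ../glued.tar
ls    # a.txt  b.txt
```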

Many files were subsequently extracted to the working directory (albeit incredibly slowly!), but then I encountered an error:

 /usr/local/bin/tar: memory exhausted
and I can't for the life of me work out why. The archive I'm dealing with is only ~230 MB, and none of the individual files within it is particularly large. I have 256 MB of RAM and a 20 GB hard disk. Every time I repeat the process, it fails with the same error at exactly the same point.
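For what it's worth, I gather a corrupted tar header can make tar request an absurdly large buffer, which would explain hitting this message on an archive far smaller than RAM. A sketch of how to provoke the same failure deliberately (assuming GNU tar on Linux; the file name, `-b` value, and ulimit cap are all arbitrary choices of mine):

```shell
# Cap the subshell's address space, then ask tar for an oversized record
# buffer (-b 500000 means roughly a 256 MB buffer, well over the ~64 MB cap).
# On GNU tar the failed allocation typically reports "tar: memory exhausted".
echo hello > sample.txt
if ( ulimit -v 65536; tar -b 500000 -cf /dev/null sample.txt 2>/dev/null ); then
  status=0
else
  status=$?    # nonzero: tar aborted when the allocation failed
fi
echo "tar exited with status $status"
```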

Any ideas as to why this might be happening?