I've just done the first deep update of my system. To my surprise and satisfaction, it recompiled specifically for my machine quite a few packages that had originally come with ...
- 04-05-2009 #1
Why does a build create so many temporary files?
While gcc was compiling, I monitored the number of files the build was creating by using df -i periodically (I already knew it was a large number). The number of inodes consumed was about 55 thousand! That is grotesque! What kind of a process needs 55,000 files? I know that the compiler must create an object file for each source file but were there really 27,000 source files in this package? Just what is going on here?
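The monitoring described above can be sketched as a before/after `df -i` sample. This is a minimal illustration, not what the poster ran: the `/tmp` filesystem and the demo directory are assumptions, and the two empty files stand in for build output.

```shell
#!/bin/sh
# Sample used-inode counts before and after some work to see how
# many inodes it consumed, the way `df -i` was used while watching
# the gcc build. Paths here are illustrative.

# Print the used-inode count for the filesystem holding $1.
# -P forces one-line POSIX output so awk sees stable columns.
inodes_used() { df -Pi "$1" | awk 'NR==2 {print $3}'; }

before=$(inodes_used /tmp)
mkdir -p /tmp/inode-demo
touch /tmp/inode-demo/a.o /tmp/inode-demo/b.o   # stand-ins for build files
after=$(inodes_used /tmp)
echo "inodes consumed: $((after - before))"
rm -rf /tmp/inode-demo
```

During a real build you would run the `df -i` sample in a loop (or under `watch`) while `make` is working, rather than once before and once after.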
- 04-05-2009 #2
- 04-06-2009 #3
The source package contains more than 10,000 .c files and over 60,000 files in total. Add to that the intermediate files generated during the build, and you can start to grasp the magnitudes we are talking about.
$ tar xf /var/portage/distfiles/gcc-4.3.3.tar.bz2
$ find gcc-4.3.3/ -name \*.c | wc -l
11926
$ find gcc-4.3.3/ -name \* | wc -l
61659
- first, gcc is compiled using whatever compiler you already have on your system
- second, gcc is re-compiled using the gcc version that you just compiled
- third, gcc is compiled once more using the gcc produced in the second step; the second- and third-stage results are then compared to check that they are byte-for-byte identical, which ensures the new compiler builds itself correctly.
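The stage-2/stage-3 check in the last step boils down to a byte-for-byte file comparison. A minimal, self-contained illustration with `cmp` (the file names are stand-ins for the real stage2/ and stage3/ object directories, not GCC's actual makefile logic):

```shell
#!/bin/sh
# Simulate the bootstrap comparison: if the stage-2 and stage-3
# binaries are identical, the compiler reproduces itself correctly.
printf 'same bits' > stage2_cc1   # stand-in for a stage-2 object
printf 'same bits' > stage3_cc1   # stand-in for a stage-3 object

if cmp -s stage2_cc1 stage3_cc1; then
    echo "bootstrap comparison passed"
else
    echo "stage2 and stage3 differ"
fi
rm -f stage2_cc1 stage3_cc1
```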
Now you can see why the number of intermediate files is so huge.
And don't forget that on amd64, with multilib enabled, two entire compilers are built, one for 32-bit and one for 64-bit. That means gcc is recompiled 6 times in total, and the number of files doubles once again.
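As a back-of-the-envelope check, multiplying the source count from the find output above by the three bootstrap stages already gives tens of thousands of object files, before counting headers, dependency files, libraries, and the multilib doubling (the arithmetic below is illustrative, not an exact accounting):

```shell
#!/bin/sh
# ~11,926 .c files per build (the find count above), compiled once
# in each of the three bootstrap stages.
sources=11926
stages=3
echo "object files from bootstrap alone: $((sources * stages))"
# Add headers, .d dependency files, libtool artifacts and the
# multilib doubling, and the ~55,000 inodes observed in the
# original post is unsurprising.
```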
- 04-06-2009 #4