ftp tar file size limit?
I am trying to back up my Linux box to my Windows box's hard drive. To do this I am using the Knoppix distro to boot my Linux box, then tarring everything and sending it to my Windows box through FTP. (I wanted to tar the files first so I can preserve permissions.) On my Windows XP box I am running FileZilla's FTP server, and I am transferring to an external 320 GB NTFS-formatted hard drive attached to it through USB. I don't have enough space left on my Linux box to tar everything first and then transfer, so I am using the following commands:
ftp 192.168.1.101 21
put |"tar -cvlO *.*" stuff.tar
It always stops transferring just before 2 GB (1,972,460 KB), and the file should be 20 GB or so. What am I doing wrong? Is there some file size limit that I don't know of for FTP or tar? The NTFS file system should allow bigger files, from what I have read. I couldn't find any limit for FileZilla. Is this the right place to ask?
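One common workaround, assuming the 2 GB ceiling is in one of the tools in the chain rather than in NTFS itself, is to make sure no single file ever crosses 2 GB: pipe tar's output through `split` and rejoin the chunks on the other side. This is only a sketch with small demo values; the directory name `backupdir`, the `part-` prefix, and the 64 KB chunk size are stand-ins (for a real backup you would point tar at the real tree and use something like `-b 1024m`):

```shell
# Demo data standing in for the real backup tree.
mkdir -p backupdir
echo "demo file" > backupdir/demo.txt

# -f - writes the archive to stdout; split cuts the stream into
# fixed-size pieces named part-aa, part-ab, ...
tar -cf - backupdir | split -b 64k - part-

# cat restores the exact original byte stream, so tar can unpack it.
mkdir -p restored
cat part-* | tar -xf - -C restored
```

The individual `part-*` files can then be uploaded one at a time over FTP, since each stays safely under the limit.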
I believe NTFS has a 2GB file limitation unless you are running a storage driver with 48-bit LBA support.
Everywhere I have read, the NTFS limit is in the tens-of-terabytes range, and I already have files bigger than 2 GB on that drive now.
tar file size limit
Generally, older versions of tar can't handle files larger than 2 GB. I suggest using an alternative to tar, 'star'. A more comprehensive answer is available here:
By the looks of it, GNU tar versions newer than 1.12.64 can handle large files, but I can't confirm this.
I have a similar problem with big files:
I have a 2.2 GB file on a Linux computer, and I mounted Shared Documents (smbfs) from another (Windows) computer. When I try to copy the file to the share, it stops at 2 GB. I even tried moving the file into Apache so I could download it instead, but Apache won't let me.
I can't archive it either.
Is there any way to move that file?
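If nothing in the chain can move a single file past 2 GB, one way around it is to split the file into chunks below the limit, copy the chunks across, and rejoin them on the far side. A sketch with small demo sizes (`bigfile.bin` and the 40 KB chunk size are placeholders; for a 2.2 GB file you would use something like `-b 1000m`):

```shell
# Demo stand-in for the 2.2 GB file.
dd if=/dev/zero of=bigfile.bin bs=1024 count=100 2>/dev/null

# Cut into chunks named bigfile.part-aa, bigfile.part-ab, ...
# (in practice: split -b 1000m bigfile.bin bigfile.part-)
split -b 40k bigfile.bin bigfile.part-

# Copy bigfile.part-* onto the share, then rejoin:
#   on Linux:   cat bigfile.part-* > bigfile.bin
#   on Windows: copy /b bigfile.part-aa + bigfile.part-ab bigfile.bin
cat bigfile.part-* > rejoined.bin
```

Since `split` just cuts the byte stream and `cat` (or `copy /b`) concatenates it back, the rejoined file is byte-for-byte identical to the original.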
If you are using smbclient, then follow the kbase article:
redhat.com | Knowledgebase