Join Date: Dec 2006
Script to copy 80 GB of files using ftp
I need to copy about 80 GB of files from ftp1 onto ftp2/old.
An FXP client does not work between the two sites on my servers, so I am forced to use a command-line ftp client. I was able to zip a few directories and mput all the zip files, but unfortunately zip cannot create files larger than 2 GB.
Mput does not work on directories, so I am kind of stuck here. It seems I need a script to do the job, but I can't come up with one. I am trying to avoid downloading and re-uploading all the data, so if you can help me out I'd appreciate it.
Linux: infong421 220.127.116.11-20061004a-areca
shell access: on
root access: off
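One common workaround for zip's 2 GB limit is to stream a tar archive through split(1), so each piece stays under the limit and a plain ftp client's mput can send them. A minimal sketch follows; the directory name, chunk size, and paths are illustrative, not taken from the post:

```shell
#!/bin/sh
set -e
# Sketch: split a directory into <2 GB pieces for mput.
# "somedir" and the 1 GB chunk size are placeholders.
mkdir -p somedir /tmp/chunks
echo demo > somedir/file.txt              # stand-in for the real data
# Stream the tar archive straight into split; no single huge file is created.
tar -cf - somedir | split -b 1024m - /tmp/chunks/somedir.tar.
ls /tmp/chunks                            # the pieces to mput
# On the destination: cat somedir.tar.* > somedir.tar && tar -xf somedir.tar
```

The pieces are named somedir.tar.aa, somedir.tar.ab, and so on, and concatenate back into a valid tar archive on the far side.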
Join Date: Jun 2006
If you do want the directories downloaded and then uploaded recursively with all their contents, you can use the lftp client and run mirror directory_name.
You can also write a script with lftp to automate everything, and adapt an example to your requirements.
Of course, read the lftp man page to learn more or to clear up any doubts.
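To make the suggestion above concrete, here is a sketch that generates an lftp script pulling the tree from ftp1 into a local staging directory and pushing it back up to ftp2/old with a reverse mirror. The host names, credentials, and paths are placeholders; mirror and mirror -R are standard lftp commands:

```shell
#!/bin/sh
set -e
# Write an lftp script (hypothetical hosts/paths -- substitute your own)
# that stages the data locally between the two servers.
cat > /tmp/copy80g.lftp <<'EOF'
open -u user,pass ftp1.example.com
mirror /data /tmp/staging
open -u user,pass ftp2.example.com
mirror -R /tmp/staging /old
quit
EOF
echo "Run it with: lftp -f /tmp/copy80g.lftp"
```

Note this still transfers the data through the machine running lftp, which the original poster wanted to avoid, but unlike mput it handles whole directory trees unattended.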