  1. #1
    Just Joined! · Join Date: Dec 2006 · Posts: 1

    Script to copy 80 GB of files using FTP


    Hi,

    I need to copy about 80 GB of files from ftp1 onto ftp2/old.
    FXP does not work between the two sites on my servers, so I am forced to use a command-line FTP client. I was able to zip a few directories and mput all the zip files, but unfortunately zip cannot create files larger than 2 GB.
    Mput does not work on directories, and I am stuck here. It seems I need a script to do the job, but I can't come up with one. I am trying to avoid downloading and re-uploading all the data, so if you can help me out I'd appreciate it.

    Linux: infong421 2.6.17.7-20061004a-areca
    shell access: on
    root access: off


    Thank you

  2. #2
    Linux User · Join Date: Jun 2006 · Posts: 311
    Hi jarekl,
    If you want whole directories to be downloaded and then uploaded recursively with all their contents, you can use the lftp client and run mirror directory_name. Read the following link:

    http://www.linuxforums.org/forum/red...get-cmd.html#4
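    For example (host names, credentials, and paths below are placeholders, not anything from this thread), a minimal lftp command file for the download half might look like this; mirror copies a whole directory tree and has no 2 GB file-size limit:

    ```shell
    # Sketch only: user1, pass1, host name, and paths are placeholders.
    # "mirror" recursively downloads a remote tree to a local directory.
    cat > pull.lftp <<'EOF'
    open -u user1,pass1 ftp1.example.com
    mirror /home/files /tmp/staging
    EOF
    # lftp -f pull.lftp    # uncomment to run against a real server
    ```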

    You can also write a script and automate everything with lftp. Read the following link for an example you can adapt to your requirements:

    http://www.linuxforums.org/forum/red...ftp-fd5.html#2
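    Putting the two halves together, a sketch of an automated transfer script (again, every host, user, and path is an assumed placeholder): pull the tree from ftp1 into a staging directory, then push it to ftp2/old with mirror -R, which uploads instead of downloads:

    ```shell
    #!/bin/sh
    # Sketch with placeholder names: stage locally, then mirror back up.
    STAGE=/tmp/staging
    mkdir -p "$STAGE"
    cat > transfer.lftp <<EOF
    open -u user1,pass1 ftp1.example.com
    mirror /home/files $STAGE
    open -u user2,pass2 ftp2.example.com
    mirror -R $STAGE /old
    EOF
    # lftp -f transfer.lftp    # uncomment to run; schedule via cron if desired
    ```

    Note that this does go through the local machine, which the original poster wanted to avoid; without working FXP between the two servers, a staged copy like this is the usual fallback.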

    Of course, read the lftp man page to learn more or if you have any doubts.

    With Regards,
    Thinker
