  1. #1
    Just Joined!
    Join Date
    Feb 2013
    Posts
    34

    Too much space used


    I have been moving 800 GB of files from one server to another with this command:

    Code:
    wget -r -c -nH -N --cut-dirs=3 ftp://user:pass@serverip/path/to/files
    Now, for some reason my 1 TB hard drive got filled, but that shouldn't be possible: the download isn't finished yet, and my whole site is much smaller than 1 TB. I guess the problem is the way wget resumes files. What do you think? How can I fix it?
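    One way to check where the space actually went before re-downloading anything (a sketch; /path/to/local is a placeholder for wherever wget saved the tree):

    Code:
    df -h                                 # overall usage per filesystem
    du -sh /path/to/local                 # total size of the downloaded tree
    # list duplicate copies wget can save as file.1, file.2, ...
    # (the pattern may also match legitimate names ending in a digit)
    find /path/to/local -name '*.[0-9]'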
    Thank you.

  2. #2
    Just Joined!
    Join Date
    Nov 2012
    Location
    Tunis, Tunisia
    Posts
    9
    Hi,
    this can solve your problem: -nc (--no-clobber) tells wget to skip files that already exist locally instead of re-downloading or duplicating them. That said, I think the second solution is more reliable and secure.

    Code:
    wget -nc -r ftp://user:pass@serverip/path/to/files
    You can use scp for a secure copy, like this:

    Code:
    scp -r directory1 directory2 ... user@IP_Address:/New_Local
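    For this thread's case, pulling the tree down from the remote server could look like this (a sketch; the host and remote path reuse the placeholders from the wget command above, /local/destination is hypothetical, and scp prompts for the password rather than taking it in the URL):

    Code:
    scp -r user@serverip:/path/to/files /local/destination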
    Best Regards

  3. #3
    Just Joined!
    Join Date
    Feb 2013
    Posts
    34
    Do I have to delete all the files and download them again?

  4. #4
    Just Joined!
    Join Date
    Nov 2012
    Location
    Tunis, Tunisia
    Posts
    9
    Yes, and try the second method (scp) instead.
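    A minimal sketch of that, assuming the partial download lives under /local/destination (both paths are placeholders):

    Code:
    rm -rf /local/destination    # remove the incomplete tree first
    scp -r user@serverip:/path/to/files /local/destination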
    Good luck!

    Best Regards
