  1. #1
    Linux User postcd's Avatar
    Join Date
    Apr 2011
    Posts
    325

    Way to back up to a remote server most effectively


    Hello,

    my hosting control panel can back up hosting files in these ways:

    uncompressed
    compressed (tar.gz)
    or incremental, non-compressed

    There is a huge number of files, totalling maybe around 30 GB of data.

    I wish to keep a local backup and also make an external copy (FTP, SCP). Can you tell me the most effective way to do this (least server resource usage)? Mainly I'm concerned about minimal disk I/O and CPU.

    Right now I'm tarring the incremental backup files (12 GB of files) and it takes around 2 hours to back them up, so probably not good? So what is the way to have a local backup + an external copy with the least server resources? Roughly what I'm doing now is sketched below.
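    For reference, what I'm doing now looks roughly like this (the paths and the remote host below are only placeholders, not my real setup):

        # re-archive the control panel's incremental backup directory into one compressed tarball (the local copy)
        tar -czf /backup/local/incremental-$(date +%F).tar.gz /backup/incremental/

        # push that tarball to the external server over scp (the external copy)
        scp /backup/local/incremental-$(date +%F).tar.gz user@remotehost:/backups/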
    "Avoid the Gates of Hell. Use Linux affordable VPS."

  2. #2
    Linux Engineer docbop's Avatar
    Join Date
    Nov 2009
    Location
    Woodshed, CA
    Posts
    943
    Setting up your backups requires a lot of planning.

    What really needs to be backed up? Which files are static and which are being updated constantly?
    How big is my backup window, and at what time does it run?
    How many copies of the backup do I need to keep, and where: local, off-site, multiple off-site locations?
    How much load does the backup put on the server?
    When backing up the database, how long will it be offline?
    How much data can I afford to lose: seconds, minutes, hours?
    What am I backing up for: preventing data loss, disaster recovery, legal reasons?
    What type of backup do I need: just a restore copy, or an archive (back to legal requirements)?

    Work through those and you will be able to start answering your own questions. Doing backups right is a complex job for real production work. I remember being in meetings discussing how many levels of "Murphy" we needed to survive.
    A lion does not lose sleep over the opinion of sheep.

  3. #3
    Linux User postcd's Avatar
    Join Date
    Apr 2011
    Posts
    325
    Let me refine my question: what commands would be most efficient to back up 500,000 files totaling 25 GB?

    The aim is to back up to a remote SCP or FTP server in such a way that an unauthorized person cannot abuse the data even if they gain access to the backup server. Nothing military grade, just enough to keep away casual people.
    "Avoid the Gates of Hell. Use Linux affordable VPS."

  4. #4
    Linux Engineer docbop's Avatar
    Join Date
    Nov 2009
    Location
    Woodshed, CA
    Posts
    943
    Quote Originally Posted by postcd View Post
    Let me refine my question: what commands would be most efficient to back up 500,000 files totaling 25 GB?

    The aim is to back up to a remote SCP or FTP server in such a way that an unauthorized person cannot abuse the data even if they gain access to the backup server. Nothing military grade, just enough to keep away casual people.
    Use the ones you mentioned in your other post: tar to make a compressed tarball, then openssl to encrypt it. I would still say do some planning; it sounds like you are just making one big backup, and that's going to chew up resources and be slow to transfer. If it were me, I would break things down into multiple backups, separating the ones that are key and need to be done daily from the ones that don't need to be backed up that often.
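    A rough sketch of those steps (the paths, archive names, and backup host below are only examples, and the openssl cipher shown is just one common choice for passphrase-based encryption):

        # 1. make a compressed tarball of the data to back up
        tar -czf /backup/site-$(date +%F).tar.gz /var/www

        # 2. encrypt the tarball with a passphrase so someone who gets into the backup server can't just read it
        openssl enc -aes-256-cbc -salt \
            -in  /backup/site-$(date +%F).tar.gz \
            -out /backup/site-$(date +%F).tar.gz.enc

        # 3. copy only the encrypted file to the remote server
        scp /backup/site-$(date +%F).tar.gz.enc user@backuphost:/backups/

    To restore, copy the .enc file back, decrypt it with openssl enc -d -aes-256-cbc -in site.tar.gz.enc -out site.tar.gz, and unpack it with tar -xzf site.tar.gz.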
    A lion does not lose sleep over the opinion of sheep.
