  1. #1
    Just Joined!
    Join Date
    Jun 2012
    Posts
    12

    How to compress an unknown filename


    Hi, I've made a simple backup script. Can anyone help me finish it? I'm lost at the most important part.

    Here's the script:

    Code:
    #!/bin/bash
    cd /path/to/where/backupis/
    ls -lh
    # how do I compress the files shown by ls -lh?
    tar -zcf unknownfilename.tar.gz *missing* # <-- I don't know how
    I hope someone will get my point.

    Thanks
    J

  2. #2
    Just Joined!
    Join Date
    Jul 2012
    Posts
    2
    Pipe the output of ls (just the file names, not ls -lh) to the tar command. I haven't tried it myself. Give it a try and let us know.
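    Something like this might work (untested; it assumes GNU tar, whose -T - option reads the list of file names to archive from stdin, and the archive name is just a placeholder):

    Code:
    cd /path/to/where/backupis/
    # ls prints one name per line when piped; tar -T - reads names from stdin
    # write the archive to /tmp so tar doesn't try to include it in itself
    ls | tar -zcf /tmp/backup.tar.gz -T -
    (this will still break on file names containing newlines, though)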

  3. #3
    Linux Newbie mactruck's Avatar
    Join Date
    Apr 2012
    Location
    City of Salt
    Posts
    187
    Quick question: when you compress the file, will you be moving it? Also, what does the file name format look like? This should be pretty simple; I have to do something like this all the time for work.

  4. #4
    Trusted Penguin
    Join Date
    May 2011
    Posts
    4,353
    Quote Originally Posted by ganitolngyundre View Post
    Hi, I've made a simple backup script. Can anyone help me finish it? I'm lost at the most important part.

    Here's the script:

    Code:
    #!/bin/bash
    cd /path/to/where/backupis/
    ls -lh
    # how do I compress the files shown by ls -lh?
    tar -zcf unknownfilename.tar.gz *missing* # <-- I don't know how
    you can just use an asterisk and let bash expand it into the contents of the dir, e.g.:
    Code:
    cd /to/where/backupis/
    tar -zcvf backups.tar.gz *
    this would tar up all files in that dir, except hidden ones (those starting with a ".").
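    if you need the hidden files too, one option (just a sketch) is to tar the directory contents by path instead of globbing, writing the archive somewhere else so it doesn't end up inside itself:

    Code:
    # -C changes into the dir first; "." picks up everything, dotfiles included
    tar -zcvf /tmp/backups.tar.gz -C /to/where/backupis .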

    if, however, you know there will be just one file in that dir, and you want to use its name in your tarball name, then:

    Code:
    tarname=$(ls /dir/where/backupis/)
    if [ -z "$tarname" ]; then
      echo "dir is empty"
    else
      cd /dir/where/backupis/
      tar -zcvf "$tarname.tar.gz" "$tarname"
    fi

  5. #5
    Just Joined!
    Join Date
    Jun 2012
    Posts
    12
    I got it to work.

    I just followed atreyu's suggestion.

    Thanks for the reply.

  6. #6
    Just Joined!
    Join Date
    Jun 2012
    Posts
    12
    Yes, I'm trying to write an automated backup script that will compress all the files in the backup directory I created. This is in preparation for backing them up to Amazon S3.

    Thanks,

  7. #7
    Just Joined!
    Join Date
    Jun 2012
    Posts
    12
    Thanks Atreyu,
    However, this is what I want to do. Say, for example, I created a directory; let's call it /backup. It has subdirectories (e.g. user1, user2, user3, user4). Each user will save the files they need backed up in their respective subdirectory. My task is to compress each of these subdirectories automatically, one by one. Also, is it possible to do it incrementally? I mean, if a user does not save any new file in their folder, the script should not run for them; but if they save more files, those should just be added to the compressed dir.

    The output will be
    user1.tar.gz
    user2.tar.gz
    user3.tar.gz
    user4.tar.gz
    Code:
    cd /backup/
    # this will list all its subdirectories
    ls -lh
    # I'm lost here: how do I compress what ls -lh shows, one by one?
    I know cron can run this automatically; I will schedule the script on a weekly basis.
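    For example, a weekly crontab entry could look like this (the script path is just a made-up placeholder):

    Code:
    # run every Sunday at 02:00
    0 2 * * 0 /usr/local/bin/backup_users.sh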

    I know it sounds dumb, but I'm new to scripting. Sorry for being a noob here. I hope someone can help me with this.

    Thank You,
    J

  8. #8
    Trusted Penguin
    Join Date
    May 2011
    Posts
    4,353
    Quote Originally Posted by ganitolngyundre View Post
    Thanks Atreyu,
    However, this is what I want to do. Say, for example, I created a directory; let's call it /backup. It has subdirectories (e.g. user1, user2, user3, user4). Each user will save the files they need backed up in their respective subdirectory. My task is to compress each of these subdirectories automatically, one by one. Also, is it possible to do it incrementally? I mean, if a user does not save any new file in their folder, the script should not run for them; but if they save more files, those should just be added to the compressed dir.

    The output will be
    user1.tar.gz
    user2.tar.gz
    user3.tar.gz
    user4.tar.gz
    Code:
    cd /backup/
    # this will list all its subdirectories
    ls -lh
    # I'm lost here: how do I compress what ls -lh shows, one by one?
    I know cron can run this automatically; I will schedule the script on a weekly basis.

    I know it sounds dumb, but I'm new to scripting. Sorry for being a noob here. I hope someone can help me with this.

    Thank You,
    J
    if you want to do incremental backups, you might want to look into a proper backup solution, like Bacula.

    if you want to stick with simple shell scripting, then you'll have to keep a log of what you back up and compare that list to the current contents of the user's dir each time your script runs. Kind of a hassle, but doable; more on that below.

    to back up files per user, you can do something like this:

    Code:
    cd /backup
    users=$(find . -maxdepth 1 -type d -name 'user*')
    for user in $users; do
      tar -zcf "${user}.tar.gz" "$user"
    done
    that would leave a dir structure like:
    /backup/user1/
    /backup/user2/
    /backup/user1.tar.gz
    /backup/user2.tar.gz

    now if you want to back up only new/changed files, you'd have to do something like this for each user dir, after making the tar:

    Code:
    find "/backup/$user" -type f -exec md5sum {} \; | awk '{print $2,$1}' | sort > "/backup/${user}.list"
    then, the next time the backup script runs, before the tar command executes, generate a new file listing using the same command (just write the output to a temp file this time). Then compare the original list to the temp one; if they are the same, there are no new/changed files and thus no need to back up.

    actually, it would probably be better to generate the list first, before backing anything up, and then pass that same list to tar. That way you're sure you get the exact same set of files: no race conditions, etc.
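    putting that together, a rough (untested) sketch of the whole compare-then-tar loop could look like:

    Code:
    #!/bin/bash
    # re-tar a user dir only when its checksum list has changed
    cd /backup || exit 1
    for user in user*/; do
      [ -d "$user" ] || continue          # skip if nothing matched the glob
      user=${user%/}                      # strip the trailing slash
      newlist=$(mktemp)
      find "$user" -type f -exec md5sum {} \; | awk '{print $2,$1}' | sort > "$newlist"
      # cmp -s is silent; it fails if the lists differ or no old list exists yet
      if ! cmp -s "$newlist" "${user}.list" 2>/dev/null; then
        tar -zcf "${user}.tar.gz" "$user"
        mv "$newlist" "${user}.list"      # remember what we just backed up
      else
        rm -f "$newlist"                  # nothing changed, keep the old list
      fi
    done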

  9. #9
    Linux Enthusiast
    Join Date
    Jan 2005
    Location
    Saint Paul, MN
    Posts
    649
    Quote Originally Posted by ganitolngyundre View Post
    Hi, I've made a simple backup script. Can anyone help me finish it? I'm lost at the most important part.

    Here's the script:

    Code:
    #!/bin/bash
    cd /path/to/where/backupis/
    ls -lh
    # how do I compress the files shown by ls -lh?
    tar -zcf unknownfilename.tar.gz *missing* # <-- I don't know how
    I hope someone will get my point.

    Thanks
    J
    One could do:
    Code:
    tar -zcf unknownfilename.tar.gz $(ls)
    (note that $(ls -lh) would pass the whole long listing, permissions and sizes included, to tar; plain ls gives just the file names, though even that breaks on names containing spaces)
    or, if you want the whole directory tree at /path/to/where/backupis/, you can do
    Code:
    cd /path/to/where/
    tar -zcf unknownfilename.tar.gz backupis
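    Equivalently, GNU tar's -C option lets you skip the cd:
    Code:
    tar -zcf unknownfilename.tar.gz -C /path/to/where backupis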

  10. #10
    Just Joined!
    Join Date
    Jun 2012
    Posts
    12
    Thank you so much Atreyu,

    I've learned a lot from your input. I just modified the script you gave: instead of tar, I used rsync so as to have an incremental backup.
    Code:
    cd /backup
    users=$(find . -maxdepth 1 -type d -name 'user*')
    for user in $users; do
      rsync -avz "$user" /backup2
    done
    this will give a directory structure like:
    /backup2/user1
    /backup2/user2


    This solved my problem of backing up new/changed files. However, the backup directories in /backup2 are not compressed yet. Is there a way to compress all the backup dirs using rsync?

    Correct me if I'm wrong: as you can see, I used the -z option with rsync, which the docs say is for compression. But I think it only compresses the data while the rsync transfer is taking place, not once rsync is done.
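    If that's right, maybe I could just compress each dir after the sync finishes, something like this (untested):
    Code:
    cd /backup2
    for user in user*/; do
      user=${user%/}
      tar -zcf "${user}.tar.gz" "$user"
    done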

    Can you kindly give me your input, feedback, and suggestions on this one?

    I really appreciate what you've done for me. Thanks

    J
