Page 2 of 2
Results 11 to 16 of 16
  1. #11
    Just Joined! jippie's Avatar
    Join Date
    May 2006
    Location
    Eindhoven, the Netherlands
    Posts
    76

    Never mentioned cutting-edge, nor filesystem.
    Indeed I am looking for a more mature option than my Perl script, and although I am impressed with what lessfs and opendedup do, I came across too many problems when I was testing them. I really dislike fixed-size containers that I cannot extend nor shrink, and the big difference with opendedup and lessfs is that I only want to be able to store several versions of one single file. I don't need hardlinks, folders, extended attributes, high performance ... So why use a heavy tool if there might be something simpler? The PoC is merely to indicate the simple requirement I have, and my initial question was whether such a simple though robust tool exists.

  2. #12
    Linux Guru
    Join Date
    Nov 2007
    Posts
    1,754
    Generating a hash by file or by block, storing that hash, and then retrieving it to compare changed files/blocks for deduplication purposes *is* a cutting-edge function.

    I don't think too many people would put the effort required into doing it right for less than a full filesystem/storage container design. But that's my .02. Good luck in your search.
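    The per-block hashing described above can be illustrated in a few lines. This is a hedged Python sketch only; the fixed block size and SHA-256 are assumptions, and real designs (like the filesystems mentioned) use variable-size chunking and persistent indexes:

    ```python
    import hashlib

    BLOCK_SIZE = 4096  # assumed fixed block size

    def dedup_blocks(path, seen):
        """Split a file into fixed-size blocks and hash each one.
        `seen` is a dict of digests that persists across calls.
        Returns (new_blocks, duplicate_blocks)."""
        new = dup = 0
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).hexdigest()
                if digest in seen:
                    dup += 1      # block content already known
                else:
                    seen[digest] = True
                    new += 1      # first time this content appears
        return new, dup
    ```

    Doing this *right* — collision handling, crash-safe index storage, reclaiming unreferenced blocks — is where the full filesystem/container designs earn their complexity.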

  3. #13
    Just Joined! jippie's Avatar
    Join Date
    May 2006
    Location
    Eindhoven, the Netherlands
    Posts
    76
    Quote Originally Posted by HROAdmin26 View Post
    Generating a hash [...] for deduplication purposes *is* a cutting-edge function.
    In that case I am searching for a cutting-edge solution.

  4. #14
    Just Joined!
    Join Date
    Apr 2011
    Posts
    1

    ddar

    I know I'm jumping in a little late here, but...

    Have you considered ddar - a de-duplicating archiver?

    It would be interesting to hear any success/failure stories from anyone using ddar to de-dupe and back up 1TB+ Samba shares.

  5. #15
    Just Joined! jippie's Avatar
    Join Date
    May 2006
    Location
    Eindhoven, the Netherlands
    Posts
    76
    Quote Originally Posted by stildalf View Post
    Have you considered ddar - a de-duplicating archiver?
    No I haven't. I'll check it out in a few days.

    Thnx

  6. #16
    Just Joined! jippie's Avatar
    Join Date
    May 2006
    Location
    Eindhoven, the Netherlands
    Posts
    76
    As stated on the ddar homepage: "ddar is new software. Please use it with caution until it has had wider use"
    I've run a few quick tests and it seems to work just fine, but for the moment I wouldn't trust it as the only location for production data.
    I will be using it myself in combination with a regular filesystem backup; if the LV image fails I will still have my files, it will only take more effort to recover from a disaster.
    Also, you'll need to study its workings and think carefully about how and what you want to recover. In my use case I have no requirement for restoring a single file, as I use my file backup for that. It all depends on how you will use this tool: you probably don't want to do a sequential search for a file on a 1TB filesystem, so you'll have to think about how to store your data in the ddar archive.

