  1. #1

    Can I output multiple wget files into an archive?


    Hello, can I somehow pipe multiple wget-downloaded files

    (wget -mk http://domain.tld/folder)
    (wget -r http://domain.tld/folder)

    into a tar archive (tgz, zip, ...)?

    I would prefer the files to be added to the archive directly, rather than downloading everything to disk, tarring it, and then deleting the whole folder.

    Thanks
    Last edited by postcd; 10-09-2015 at 10:16 PM.
    "Avoid the Gates of Hell. Use Linux affordable VPS."

  2. #2
    Linux Guru Rubberman's Avatar
    Join Date
    Apr 2009
    Location
    I can be found either 40 miles west of Chicago, in Chicago, or in a galaxy far, far away.
    Posts
    13,763
    Show your code. Nothing here indicates how you are tarring/gzipping the files.
    Sometimes, real fast is almost as good as real time.
    Just remember, Semper Gumbi - always be flexible!

  3. #3
    Linux Engineer
    Join Date
    Jan 2005
    Location
    Saint Paul, MN
    Posts
    815
    Here is a solution using wget in non-recursive mode:

    Code:
    tar jcf mytarball.tbz2 <(wget -O - http://example.com/folder/page1.html) \
                           <(wget -O - http://example.com/folder/page2.html) \
                           ... \
                           <(wget -O - http://example.com/folder/pageN.html)
    You would not have meaningful names for the files, as the "<( command )" syntax creates a "pipe type file" and passes the STDOUT of the command through that "pipe type file" into the program that expects a file to read.
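    [Editor's note: one related case that does work is a single download stream. A lone page can be compressed in flight, since a bare gzip stream, unlike a tar member, needs no internal file name. A minimal sketch; the URL is a placeholder:]

    ```shell
    # Compress a single page as it downloads; no staging file needed.
    # -q suppresses progress output, -O - writes the body to stdout.
    wget -q -O - http://example.com/folder/page1.html | gzip -c > page1.html.gz
    ```

    This does not generalize to multiple files in one archive, which is exactly the naming limitation described above.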

  5. #4
    Rubberman: I don't have any code, only the example commands I mentioned in my first post.

    alf55: thank you, but unfortunately it does not work; I tried it. I'm not downloading separate files; I'm using the wget -r or wget -mk commands mentioned above, where I specify only a directory that contains many files, even recursively.
    "Avoid the Gates of Hell. Use Linux affordable VPS."

  6. #5
    Linux Guru Rubberman's Avatar
    Join Date
    Apr 2009
    Location
    I can be found either 40 miles west of Chicago, in Chicago, or in a galaxy far, far away.
    Posts
    13,763
    Sorry, but I have never tried to do this inline as you wish. I download the files, then stuff them into the gzipped tarball as needed before removing the source files. You might try using `wget:...` or echo `wget:...`, which would return the file names and files in order as arguments to the tar command.
    Sometimes, real fast is almost as good as real time.
    Just remember, Semper Gumbi - always be flexible!
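    [Editor's note: for the recursive case the thread never quite resolves, wget cannot write a mirrored directory tree to stdout, so some on-disk staging is unavoidable. A minimal sketch under that assumption: stage the download in a throwaway temp directory and let GNU tar delete each file as it archives it. The URL is a placeholder, and --remove-files is a GNU tar option:]

    ```shell
    #!/bin/sh
    # Stage the recursive download in a throwaway directory,
    # archive it, and clean up automatically on exit.
    tmpdir=$(mktemp -d) || exit 1
    trap 'rm -rf "$tmpdir"' EXIT

    # -r mirrors recursively, -nH drops the hostname directory,
    # -P puts everything under the staging directory.
    wget -r -nH -P "$tmpdir" http://example.com/folder/

    # GNU tar: --remove-files deletes each file after adding it;
    # -C keeps paths in the archive relative to the staging dir.
    tar czf mytarball.tgz --remove-files -C "$tmpdir" .
    ```

    The files still touch the disk briefly, but only inside the temp directory, and nothing is left behind once the archive is written.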
