  1. #1
    Just Joined!
    Join Date
    Oct 2011
    Posts
    2

    bash loop wget using variable


    Hi,
    I'm trying to loop wget 100 times to download from a website. The URL format on the server is
    hxxp://server.com/app/browse/DownloadAsset.aspx?id=xxx

    I've tried the following
    Code:
    for i in {1..100}; do wget hxxp://server.com/app/browse/DownloadAsset.aspx?id=$i; done
    Each time I do this I get a 404 error page. I'm thinking wget isn't parsing the variable at the end of the URL correctly. Any ideas what's happening here? I know the files do exist.

  2. #2
    Trusted Penguin Irithori
    Join Date
    May 2009
    Location
    Munich
    Posts
    3,345
    You need to quote the URL.

    No idea what the purpose of this loop is, but you probably also want to send the downloaded content to /dev/null. For example, a minimal sketch using the OP's placeholder hostname (the quotes stop the shell from interpreting the ? and any & in the query string; -q -O /dev/null just discards the output):
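    Code:
    for i in {1..100}; do wget -q -O /dev/null "http://server.com/app/browse/DownloadAsset.aspx?id=${i}"; done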
    You must always face the curtain with a bow.

  3. #3
    Just Joined!
    Join Date
    Oct 2011
    Posts
    2
    I'm trying to download files that exist. Here is a valid one:
    eircomwholesale.ie/WorkArea/DownloadAsset.aspx?id=345
    I want to get all the files between id 340 and 350. I've tried what you suggested, but to no avail: still 404s.

  4. #4
    Trusted Penguin
    Join Date
    May 2011
    Posts
    4,353
    This works for me...
    Code:
    #!/bin/bash
    url='www.eircomwholesale.ie/WorkArea/DownloadAsset.aspx?id'
    #for i in {1..100}; do
    for i in {345..346}; do
       # quote the expansions so the shell doesn't touch the ? in the URL
       wget -O "file${i}.pdf" "${url}=${i}"
    done
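    For the full range the OP is after (ids 340 to 350), the same thing as a one-liner should also work; the .pdf extension is carried over from the script above and is an assumption about what these assets actually are:
    Code:
    for i in {340..350}; do wget -O "file${i}.pdf" "www.eircomwholesale.ie/WorkArea/DownloadAsset.aspx?id=${i}"; done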
