Results 1 to 1 of 1
Enjoy an ad free experience by logging in. Not a member yet? Register.
- Join Date
- May 2005
downloading files and compressing them
Is this code rubbish? we are meant to produce a bash script for our first courswork and it is proving a nightmare!!!!
I'm trying to create a fairly simple script which will search a given site for all of the jpg images on the site...its not working though...any ideas?? Am I even using the right commands for this job!!!!? Or is it the piping?
# a friendly greeting
echo hello $USER, hope all is well
echo What is the website you wish to download your images from? Remember to give the full url including http://
echo And what compression ratio would you like? from 0-100 I would recommend a value of 65 to get a balance of file size and quality
echo A lower number gives less quality but a smaller file size, A higher number gives greater quality but makes a bigger file.
# get the jpegs from the site
wget -O - $SITE
#filter out all of the jpeg files
grep -O - -f "\.jpg$" |
#compress the jpeg file to make it smaller (although lower quality) and put it in a directory
| cjpeg - -quality $COMPR > ~/compressed/