I am very new to linux, and have just enrolled in a course on it.

Is this code rubbish? we are meant to produce a bash script for our first courswork and it is proving a nightmare!!!!

I'm trying to create a fairly simple script which will search a given site for all of the jpg images on the site...its not working though...any ideas?? Am I even using the right commands for this job!!!!? Or is it the piping?

# a friendly greeting

echo hello $USER, hope all is well

echo What is the website you wish to download your images from? Remember to give the full url including http://

read SITE

echo And what compression ratio would you like? from 0-100 I would recommend a value of 65 to get a balance of file size and quality

echo A lower number gives less quality but a smaller file size, A higher number gives greater quality but makes a bigger file.

# get the jpegs from the site

wget -O - $SITE

#filter out all of the jpeg files

grep -O - -f "\.jpg$" |

#compress the jpeg file to make it smaller (although lower quality) and put it in a directory

| cjpeg - -quality $COMPR > ~/compressed/

[/code]