
Re: wget -- for/curl and wget -l/-r



From: bentley rhodes <bntly.rhds@gmail.com>
> k .. i typed in 'wget http://soso.fake.com/file1.ext'
> and it retrieved it.  i read the --help file, and learned that there is 
> a way to recursively search folders too.  but what i can't figure out, 
> is that if there is more than one file i want, like if there is 
> file1.ext through file10.ext, what do i type in to get all those?

If the files are all linked directly from a single page, you can try something like:
  wget -l1 -r http://soso.fake.com/index.html

(assuming index.html was the page with the links to all 10 files).

The "-l1" tells it to go only one level deep, so you don't get links of links of links, only
one depth of links from the page you sent.
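
If that first try pulls down more than you want, wget has a couple of
options to narrow the recursion (both are in its --help output; the URL
is still made up, of course):
  wget -r -l1 -np -A '*.ext' http://soso.fake.com/index.html

"-np" (--no-parent) keeps it from climbing up out of the starting
directory, and "-A '*.ext'" only keeps files matching that pattern.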

BTW, if you know the exact URL to each file, you can use a simple bash for loop
and curl instead.  E.g.,
  for i in `seq 1 10`; do curl -O http://soso.fake.com/file${i}.ext; done

Some people like to use the formal, C-style for loop instead of the
"foreach"-style loop I used above.  But the nice thing about the loop
above is that you can pass a format to seq.
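
For comparison, a C-style version of the same loop in bash would look
something like this (just a sketch, same fake URL as before):
  for ((i=1; i<=10; i++)); do curl -O http://soso.fake.com/file${i}.ext; done

It does the same thing, but with a bare counter you'd have to handle any
zero-padding yourself, which is exactly where seq's format option helps.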

E.g., let's say the files were file001.ext to file010.ext:  
  for i in `seq -f %03g 1 10`; do curl -O http://soso.fake.com/file${i}.ext; done

You can pass a format to the "seq" command, and it will create a sequence
of numbers in that format.  I.e., the above example literally expands to:
  for i in 001 002 003 004 005 006 007 008 009 010; do curl ... (cut)

It's great when you have hundreds of files to fetch, such as packages, disc
images, porn^H^H^H^Hphoto images, etc...  ;->
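
As an aside, curl can also expand numeric ranges itself via URL
"globbing", zero-padding included, so you may not even need the loop
(check your curl's man page to be sure it supports this):
  curl -O "http://soso.fake.com/file[001-010].ext"

curl fetches each URL in the [001-010] range, and -O saves each one
under its remote file name.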



--
Bryan J. Smith   mailto:b.j.smith@ieee.org

