
How can I download files (that are listed in a text file) using wget or some other automatic way?

Sample file list:

www.example.com/1.pdf
www.example.com/2.pdf
www.example.com/3.pdf
rav_kr

8 Answers

wget has a built-in flag for this: wget -i your_list, where your_list is a file containing URLs delimited by linebreaks. You can find this kind of thing by reading man wget.
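For example, assuming the sample list above is saved as your_list:

wget -i your_list

The entries may omit the http:// prefix, as in the sample list; wget assumes http when no scheme is given. Adding -c (wget -c -i your_list) resumes any partially downloaded files on a re-run.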

aureianimus

Get them in parallel with

cat urlfile | parallel --gnu "wget {}"

By default it runs as many jobs as you have CPU cores. If you really want to pull the files down quickly, you can ramp this up by adding "-j 20" after parallel, as in the sketch below.
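A sketch of the ramped-up invocation (urlfile is the same list as above, read from standard input instead of through cat):

parallel --gnu -j 20 "wget {}" < urlfile

Here -j 20 runs 20 downloads at a time, and each {} is replaced by one line of the input, so every URL gets its own wget process.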

meawoppl

parallel has a built-in flag, --arg-file (-a), that reads its input from a file, so you can avoid the cat pipe. You can use

parallel --gnu -a urlfile wget

Or simply parallel --gnu wget < urlfile

yxogenium

xargs -I{} wget 'http://{}' < your_list

Here -I{} inserts each input line in place of {} (it is the current spelling of the deprecated -i flag), and the http:// prefix is added because the sample list omits the scheme.

awk '{print "http://" $0;}' list.txt | xargs -L1 wget

where list.txt is your list file. -L1 runs one wget per input line; it replaces the deprecated -l1 flag.

cbix

I saw Florian Diesch's answer.

I got it to work by adding the options -bqc to the command.

xargs -i wget -bqc 'http://{}' < download.txt

All downloads started in parallel in the background.

  • -b: Background. Go to background immediately after start
  • -q: Quiet. Turn off wget's output
  • -c: Continue. Continue getting a partially-downloaded file
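Because -q silences wget and -b detaches it from the shell, nothing shows up on the terminal. A quick way to check whether downloads are still running (a sketch, assuming pgrep is available):

pgrep -c wget

This prints the number of running wget processes; 0 means all downloads have finished.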
muru

Put your links in a file, e.g. links.txt.

Command to download all the linked files:

wget -i links.txt

or, piping the list through cat:

cat links.txt | wget -i -

(the trailing - tells wget to read the URL list from standard input).
Kulfy

I just tested this:

xargs -a download_file -L1 wget

It works for me. Links inside the txt file must be on separate lines.

Kulfy