How can I download files (that are listed in a text file) using wget or some other automatic way?
Sample file list:
www.example.com/1.pdf
www.example.com/2.pdf
www.example.com/3.pdf
wget has a built-in flag for this: wget -i your_list, where your_list is a file containing URLs delimited by linebreaks.
You can find this kind of thing by reading man wget.
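A minimal sketch of the -i workflow, using the filenames from the question (the actual wget call is left as a comment, since running it would really fetch the files):

```shell
# List of files to download, one URL per line (names from the question)
printf '%s\n' \
  'www.example.com/1.pdf' \
  'www.example.com/2.pdf' \
  'www.example.com/3.pdf' > urls.txt

# The download itself would be:
#   wget -i urls.txt
# Sanity-check the list first: wget -i expects exactly one URL per line
while read -r url; do
  echo "would fetch: $url"
done < urls.txt
```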
Get them in parallel with
cat urlfile | parallel --gnu "wget {}"
By default it will run as many processes as you have cores; if you want to pull the files down more quickly, you can ramp that up to, say, 20 jobs by adding "-j 20" after parallel.
parallel has a built-in flag --arg-file (-a) that will use an input file as the source, so you can avoid the cat pipe. You can use
parallel --gnu -a urlfile wget
Or simply parallel --gnu wget < urlfile
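A small sketch tying the pieces together; the parallel invocations are shown as comments rather than run, since GNU parallel may not be installed and running them would download the files:

```shell
# Put the URLs in a file, one per line (names from the question)
printf '%s\n' 'www.example.com/1.pdf' 'www.example.com/2.pdf' > urlfile

# parallel's default job count equals the number of CPU cores:
nproc

# The downloads themselves (GNU parallel required) would be:
#   parallel --gnu -a urlfile wget          # one job per core
#   parallel --gnu -j 20 -a urlfile wget    # 20 simultaneous downloads
```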
I saw Florian Diesch's answer.
I got it to work by adding the parameters -bqc to the command.
xargs -i wget -bqc 'http://{}' < download.txt
All downloads started in parallel in the background.
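To see what this command expands to without actually downloading anything, echo can stand in for wget (a dry run; -I {} is the current spelling of the deprecated -i flag of xargs):

```shell
# Hypothetical list mirroring the question's format
printf '%s\n' 'www.example.com/1.pdf' 'www.example.com/2.pdf' > download.txt

# echo stands in for wget, so the {} substitution is visible
# without fetching anything
xargs -I {} echo wget -bqc "http://{}" < download.txt > commands.txt
cat commands.txt
```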
-b: Background. Go to background immediately after startup.
-q: Quiet. Turn off wget's output.
-c: Continue. Continue getting a partially-downloaded file.
I just tested this:
xargs -a download_file -L1 wget
It works for me. The links inside the text file must each be on a separate line.
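A quick dry run showing what -L1 does here, with echo substituted for wget so nothing is downloaded (file name as in the answer, URLs hypothetical):

```shell
printf '%s\n' 'www.example.com/1.pdf' 'www.example.com/2.pdf' > download_file

{
  # -L1: one wget invocation per line of the file
  xargs -a download_file -L1 echo wget
  # Without -L1, xargs packs every URL into a single command
  # (wget also accepts multiple URLs, so both forms work)
  xargs -a download_file echo wget
} > out.txt
cat out.txt
```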