After several iterations...
for url in $(cat links.txt); do
  timeout 10 lynx "$url"
done
Lynx is blocking (it has to be, to render pages interactively), so sleeping between requests doesn't work properly, and it also grabs stdin, which makes piping things into the loop "interestingly" difficult. See here for iterating over the lines in a file.
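Because lynx eats stdin, the usual `while read` loop over a file loses lines to it. One workaround (a sketch, assuming bash for `read -u`) is to read the file on a spare file descriptor so lynx can't touch it:

```shell
# Read links.txt on fd 3 instead of stdin, so a stdin-hungry
# program like lynx can't swallow the remaining lines.
# Also handles URLs containing spaces, unlike $(cat links.txt).
while IFS= read -r -u 3 url; do
  timeout 10 lynx "$url"
done 3< links.txt
```

This keeps stdin free for lynx's own interactive prompts.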
Lynx can be a bit annoying with its prompts asking whether to allow cookies. You can either change its settings, if that's a problem, or pass the -accept_all_cookies flag, like so:
for url in $(cat links.txt); do
  timeout 10 lynx -accept_all_cookies "$url"
done
Today I learned about the timeout command, so I'm happy.
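For the curious, GNU timeout signals how the command ended: it exits with status 124 when it had to kill the command, so you can tell a timeout apart from the command failing on its own. A quick demonstration (using sleep as a stand-in for a long-running command):

```shell
# GNU timeout kills the command after the duration and then
# exits with status 124, distinct from the command's own codes.
timeout 1 sleep 5
echo "exit status: $?"   # prints 124: sleep was killed
```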
To print a status at the end, the only way I can see is to check the URL separately with curl, like so:
for url in $(cat links.txt); do
  timeout 10 lynx -accept_all_cookies "$url"
  if [[ $(curl -o /dev/null --silent --head --write-out '%{http_code}' "$url") -eq 200 ]]; then
    echo "Getting $url successful"
  else
    echo "Getting $url unsuccessful"
  fi
done
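An alternative that avoids the second request entirely: since GNU timeout exits 124 when it kills the command, you can report status from the lynx invocation itself. A sketch (it only tells you whether lynx ran and finished in time, not whether the page returned 200):

```shell
for url in $(cat links.txt); do
  timeout 10 lynx -accept_all_cookies "$url"
  status=$?
  if [ "$status" -eq 124 ]; then
    echo "Getting $url timed out"
  elif [ "$status" -ne 0 ]; then
    echo "Getting $url unsuccessful (exit $status)"
  else
    echo "Getting $url successful"
  fi
done
```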