I'm trying to write a Bash script that downloads and trims videos from URLs listed in a .txt file, using ffmpeg and youtube-dl. Searching the Internet I found this https://askubuntu.com/questions/970629/how-to-download-a-portion-of-a-video-with-youtube-dl-or-something-else and this: How can I batch/sequentially download m3u8 files using ffmpeg? Based on those, I wrote this:
#!/bin/bash
# only half of the URLs actually get downloaded
HoldList="/home/user/desktop/dir1/code/bash/web.txt"
index=0
while read -r line ; do
    # get the direct media URL with youtube-dl, then let ffmpeg download just that section
    ffmpeg -ss 00:50:30 -to 00:51:00 -i "$(youtube-dl -f best --get-url "$line")" -c:v copy -c:a copy "output-${index}.mp4"
    ((index=index+1))
done < "$HoldList"
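For reference, web.txt just has one video URL per line, something like this (these are placeholder URLs, not the real ones I'm using):

https://www.youtube.com/watch?v=AAAAAAAAAAA
https://www.youtube.com/watch?v=BBBBBBBBBBB
https://www.youtube.com/watch?v=CCCCCCCCCCC
https://www.youtube.com/watch?v=DDDDDDDDDDD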
This script only downloads half of the videos: it downloads one, skips the next, and then repeats...
How can I make it stop skipping every other URL in the file?
I'm a newbie at Bash scripting (and on this site), and English is not my first language.