
I'm trying to make a bash script to download and trim videos from URLs in a .txt file, using ffmpeg and youtube-dl. From the Internet I found this https://askubuntu.com/questions/970629/how-to-download-a-portion-of-a-video-with-youtube-dl-or-something-else and this How can I batch/sequentially download m3u8 files using ffmpeg?, and based on those I made this:

#!/bin/bash

#only download the half of urls

HoldList="/home/user/desktop/dir1/code/bash/web.txt"

index=0
while read line ; do
    ffmpeg -ss 00:50:30 -to 00:51:00 -i "$(youtube-dl -f best --get-url $line)" -c:v copy -c:a copy output-${index}.mp4
    ((index=index+1))
done < "$HoldList"

This code only downloads half of the videos: it downloads one, skips the next, then repeats...

How can I make it not skip every other URL in the file?

I'm a newbie at Bash scripting (and on this site), and English is not my first language.

  • Or just move your input to a different fd. `while IFS= read -r line <&3; do`, and `done 3<"$HoldList"` to use FD 3, as an example. – Charles Duffy Dec 04 '20 at 00:28
  • If that fixes it, this is a duplicate of [read file as input for a command skipping lines](https://stackoverflow.com/questions/36771443/read-file-as-input-for-a-command-skipping-lines). Indeed, it's _almost certainly_ a duplicate. – Charles Duffy Dec 04 '20 at 00:29
  • Yes! This was the solution, thank you a lot. Now I have to make variables for the time parameters for ffmpeg to make it more dynamic. Again, thanks! – sl4g Dec 04 '20 at 01:54
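The skipping happens because ffmpeg reads from the loop's stdin (it listens for interactive keypresses), so it swallows the next URL before `read` can see it. A minimal sketch of the mechanism and the FD-3 fix from the comments, with a `swallow_stdin` stand-in function (hypothetical, just for the demo) in place of the ffmpeg/youtube-dl pipeline:

```shell
#!/bin/bash
# Demonstrates why the loop skips lines, and the file-descriptor-3 fix
# suggested in the comments. "swallow_stdin" stands in for ffmpeg, which
# by default reads whatever stdin offers.

list=$(mktemp)
printf 'url1\nurl2\nurl3\nurl4\n' > "$list"

swallow_stdin() { cat > /dev/null; }  # consumes all of its stdin

# Broken: the command inside the loop shares stdin with `read`,
# so it eats the remaining URLs after the first iteration.
broken=0
while IFS= read -r line; do
    swallow_stdin
    broken=$((broken + 1))
done < "$list"

# Fixed: the URL list is on FD 3, so the inner command cannot touch it.
# (Stdin is pointed at /dev/null so the demo terminates; with ffmpeg you
# would get the same effect from its -nostdin flag or `< /dev/null`.)
fixed=0
while IFS= read -r line <&3; do
    swallow_stdin
    fixed=$((fixed + 1))
done 3< "$list" < /dev/null

echo "broken=$broken fixed=$fixed"  # broken=1 fixed=4
rm -f "$list"
```

Applied to the script in the question, the change is just `while IFS= read -r line <&3; do` and `done 3< "$HoldList"`; adding `-nostdin` to the ffmpeg invocation is a belt-and-braces alternative.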
