I hope you can help me out on this one. I am fairly new to bash, so I search for examples to work from, but I am a bit stuck here.
I have two scripts.
script1.sh works fine: it downloads the RSS feed of a YouTube playlist containing the title, artist and URL of each track, extracts those items and stores them in playlist.txt.
The thing is, I have other YouTube playlists, and to use a different RSS feed page I need to manually change the playlist URL in script1.sh.
So I am writing script2.sh, which downloads the page listing all my playlists and then sends their RSS feeds one after the other to script1.sh.
The playlist URLs look something like this:
http://gdata.youtube.com/feeds/api/playlists/hashtag_playlist?&max-results=50&fields=entry%28title,link%29&prettyprint=true
So far, in script2.sh, I have managed to gather the names and playlist IDs of my 5 playlists into playlists.txt, like this:
- "playlist1" PLYNrAD9Jn2WDWZFp23G3a2tkOBtdeNAbc6
- "playlist2" PLYNrAD9Jn2WCf54CoS22kn3pBX1XUFinE
I have also managed to create one text file per playlist, named after the playlist. What I want to do now is feed the playlist IDs one after the other to script1.sh and have it output the URL, title and artist of each track into the matching file. I need to turn each line of playlists.txt into a variable and substitute it into the URL in script1.sh, but I don't really understand how to do that.
I hope that is clear; I can answer any questions to clarify anything I missed.
Thank you in advance.
EDIT: I have only just started writing bash, so these scripts must look really ugly to you. Sorry in advance :)
script1.sh
url="http://gdata.youtube.com/feeds/api/playlists/PLYNrAD9Jn2WDmpu3gNVxIVO8bAiOcQkx7?&max-results=50&fields=entry%28title,link%29&prettyprint=true"
name="SlickSlickSound"
# download the RSS feed page to utube
wget -O "$HOME/Music/Youtube Playlist/utube" "$url" 2>&1
# extract titles from the downloaded page into temp0 (xml_grep comes from xml-twig-tools)
xml_grep 'title' utube --text_only > temp0
# extract URLs to temp
grep -o 'href=['"'"'"][^"'"'"']*&feature=youtube_gdata' utube > temp
# strip the URLs down to the bare minimum for clarity: www.example.com
sed "s/href='http\?:\/\///" temp > temp1
sed "s/&feature=youtube_gdata//" temp1 > temp2
# merge the URL and title temp files into output.txt
paste temp2 temp0 > output.txt
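One way to make script1.sh reusable might be to take the playlist ID and name from the command line instead of hard-coding them. A minimal sketch (the defaults below are only for demonstration; the rest of the original script would follow unchanged):

```shell
#!/bin/bash
# Sketch of a parameterized script1.sh: playlist ID and output name come
# from positional parameters; fall back to sample values if none given.
id=${1:-PLYNrAD9Jn2WDWZFp23G3a2tkOBtdeNAbc6}
name=${2:-playlist1}

url="http://gdata.youtube.com/feeds/api/playlists/${id}?&max-results=50&fields=entry%28title,link%29&prettyprint=true"

# The original download/extract steps would go here, e.g.:
#   wget -O "$HOME/Music/Youtube Playlist/utube" "$url" 2>&1
#   ... and finally:  paste temp2 temp0 > "$name".txt
echo "would fetch $url into ${name}.txt"
```

It would then be called as `./script1.sh PLYNrAD9Jn2WDWZFp23G3a2tkOBtdeNAbc6 playlist1`.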
script2.sh
url="http://www.youtube.com/user/SterylMreep/videos?flow=grid&view=1"
# download the page source with all the playlist info
wget -O "$HOME/Music/Youtube Playlist/Playlists/collection" "$url" 2>&1
# extract the playlist names
grep -o 'data-context-item-title=......................' collection > tmp0
# strip down to the bare names
sed 's/data-context-item-title=//' tmp0 > tmp2
sed 's/" d...*$/"/' tmp2 > tmp3
# remove leading whitespace and the stray slash
sed 's/^[ \t]*//' tmp3 > tmp4
sed 's/\///' tmp4 > tmp5
# extract the playlist IDs to tmp
grep -o 'href="/playlist?list=..................................' collection > tmp
# strip down to the bare IDs
sed 's/href="\/playlist?list=//' tmp > tmp1
# create a file pairing playlist names with their IDs
paste tmp4 tmp1 > playlists.txt
# create one empty file per playlist name
xargs touch < tmp4
What comes next? How do I pass each playlist ID from playlists.txt into script1.sh and get the titles, artists and URLs into the right file?
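One possible direction (a sketch, not tested against the real feed): read playlists.txt line by line with the `read` builtin and substitute each ID into the feed URL. Here the call to script1.sh is replaced by an `echo` so the loop can be seen in isolation, and the sample data is hypothetical; `read` splits on any whitespace, so it handles the tabs that `paste` produces as well as the spaces below.

```shell
#!/bin/bash
# Hypothetical playlists.txt as produced by script2.sh.
cat > playlists.txt <<'EOF'
"playlist1" PLYNrAD9Jn2WDWZFp23G3a2tkOBtdeNAbc6
"playlist2" PLYNrAD9Jn2WCf54CoS22kn3pBX1XUFinE
EOF

# Read the name and ID from each line of playlists.txt.
while read -r name id; do
    name=${name//\"/}   # strip the surrounding quotes from the name
    url="http://gdata.youtube.com/feeds/api/playlists/${id}?&max-results=50&fields=entry%28title,link%29&prettyprint=true"
    # A real version would call a parameterized script1.sh here instead:
    echo "script1.sh would fetch $url into ${name}.txt"
done < playlists.txt
```

If script1.sh accepted the ID and name as `$1` and `$2`, the `echo` line would become `./script1.sh "$id" "$name"`.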