1

I spent the whole night trying to get this process working, but all my attempts ended in failure.

I wrote a very simple script to show what I'm trying to do; please copy it and try running it.

    #!/bin/bash
    set -x
    urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
    #urls="http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3"

    for letsgo in `curl -OLJg  "'${urls}'"` ; do
        echo "GOT TRIED OF TRYING"
    done

    # for letsgo in `curl -OLJg $urls` ; do
    # echo "GOT TRIED OF TRYING"
    # done

The result I got after running it:

First loop:-

    ./ap2.sh
    + urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
    ++ curl -OLJg ''\''http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'\'''

    curl: (1) Protocol 'http not supported or disabled in libcurl
    + for letsgo in '`curl -OLJg  "'\''${urls}'\''"`'
    + echo 'GOT TRIED OF TRYING'
    GOT TRIED OF TRYING

Second loop:-

    ./ap2.sh
    + urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
    ++ curl -OLJg http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine '(Original' 'Mix).mp3'
    curl: option -: is unknown
    curl: try 'curl --help' or 'curl --manual' for more information

The problem is that something, I don't know what, is escaping the URL behind my back and keeping things from working properly.

Update

I got rid of it by using:

    for letsgo in `curl -OLJg  "${urls}"` ; do
        echo "Working Fine But We Still Have Problem When We Are Using More Than 1 URL"
    done

The problem is that when the script has more than one URL, each of them must be in quotes (at least in my case) to get curl working properly. I can do that manually in the Linux console without any problem, but when it comes to doing it from a bash script, the result is this:

    #!/bin/bash
    set -x
    urls="'http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3' -OLJg 'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1' -OLJg "

    for letsgo in `curl -OLJg "${urls}"` ; do
        echo "Working Fine But We Still Have Problem When We Are Using More Than 1 URL"
    done

Results:-

    + urls=''\''http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'\'' -OLJg '\''http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1'\'' '
    ++ curl -OLJg ''\''http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'\'' -OLJg '\''http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1'\'' '

    curl: (1) Protocol 'http not supported or disabled in libcurl
    + for letsgo in '`curl -OLJg "${urls}"`'
    + echo 'Working Fine But We Still Have Problem When We Are Using More Than 1 URL'
    Working Fine But We Still Have Problem When We Are Using More Than 1 URL

I just want it to work the same way as when I enter it in the Linux console, without the interference caused by bash escaping the strings. Like this:

    curl -OLJg 'http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3' -OLJg 'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1' -OLJ
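For comparison, a minimal sketch of the same two downloads driven from a bash array and a loop; this is only an illustration of the array form with the URLs above, not a verified fix:

    #!/bin/bash
    set -x
    # One array element per URL; bash keeps the spaces inside each element intact.
    urls=('http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
          'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1')

    # "${u}" expands to exactly one word, so curl sees each URL as a single argument.
    for u in "${urls[@]}"; do
        curl -OLJg "${u}"
    done
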
M.A.G

6 Answers

2

You should remove the single quotes inside the argument to curl - the way you have written it, they become part of the URL.
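
A minimal sketch of what that looks like with the URL from the question; the double quotes around the expansion keep the whole URL together as one argument, and no literal quote characters end up in the value:

    urls='http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3'
    # No single quotes inside the double quotes; curl receives the bare URL as one argument.
    curl -OLJg "${urls}"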

Blagovest Buyukliev
1

Have you ever seen a blank in the browser address bar? They must be converted to %20 (corrected by Lucas' comment, thanks) and, maybe, similar special characters, too.

    wellurl=$(echo "$urls" | sed 's/ /%20/g')

I don't know curl - it's something similar to wget, isn't it?

    wget -np "$wellurl"
    2011-04-10 16:55:28 (17,2 MB/s) - »An-Beat - Mentally Insine (Original Mix).mp3« saved [191]

worked for me.

Update:

To get multiple URLs from a script, use an array:

    #!/bin/bash
    #
    declare -a urls
    urls=('http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3' 'http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1')

    # the last valid index is one less than the element count
    for i in $(seq 0 $(( ${#urls[@]} - 1 )) )
    do
        wellurl=$(echo "${urls[i]}" | sed 's/ /%20/g')
        # echo "$wellurl"
        curl -OLJg "$wellurl"
    done
  • ${#urls[@]} returns the number of elements in the array
  • don't put options into the array
  • use smaller urls in your next questions, please :)
user unknown
    Just `%20`. ;) Some servers will allow spaces, but it is indeed a violation of the standards. If arbitrary URLs need to be escaped like that, it can be a bit of a challenge from bash (see http://stackoverflow.com/questions/296536/urlencode-from-a-bash-script). – Lucas Jones Apr 10 '11 at 14:42
  • @user unknown could you please review my question again? – M.A.G Apr 10 '11 at 20:49
  • @M.A.G if you review my answer again, yes. – user unknown Apr 10 '11 at 22:37
  • @user unknown. Can you please help me find any way to make it work the way I'm doing it? I wrote another question to explain it further: http://stackoverflow.com/questions/5624687/escaping-bash-really-need-help ... I'm really looking for your help – M.A.G Apr 11 '11 at 19:46
  • Does the bash call php, and use the result, or is the bash called from php? I don't see how php is involved here. You have to show what is making the problem. It's not concrete enough. How does the php produce the urls? Why don't you use the php to grab the websites, or why do you use php at all? – user unknown Apr 11 '11 at 20:33
  • @user unknown I use php as a front-end for users. The php sends the URLs to bash as normal. Then bash needs to visit each page to produce a new link or to grab a new link; up to this point everything works fine. But when bash grabs a URL containing chars like "( - ) '" and many more, like the example above, it ruins the whole process. – M.A.G Apr 13 '11 at 15:45
  • Please don't be so vague! What is 'sent'? Bash isn't a mail account. Is it a parameter (script.sh url1 url2 url3) or is it a pipe from stdin: (echo url1 url2 url3 | script.sh)? Something else? Usually, word splitting is parameter splitting, so it is essentially, how you invoke the shell exactly, and from where. – user unknown Apr 13 '11 at 16:42
  • @user unknown .. Thanks for your reply and I'm sorry for the delay in updating my question. I use php as a front-end for users to submit links, and those links are handled by bash, which downloads them. Here is a sample of the php exec – M.A.G Apr 24 '11 at 01:38
  • @user unknown the example exec("/bin/download \"$command\" \"$userid\" \"$username\" \"$yesno\" \"$queueid\" \"$hml\" \"$uploading\" >/dev/null 2>&1 & echo $!"); => Real /bin/download "'http://www.duckload.com/dl/19Nf2' '-sOLJ' 'http://www.duckload.com/dl/T9Nf2' '-sOLJ' " "1" "whatever" "0" "1819818575" "2" "" and here how bash got it curl -sOLJg -w ',%{url_effective},%{size_download},%{content_type}\n' ''\''http://www.duckload.com/dl/19Nf2'\'' '\''-sOLJ'\'' '\''http://www.duckload.com/dl/T9Nf2'\'' '\''-sOLJ'\'' ' – M.A.G Apr 24 '11 at 01:41
  • @user unknown I just want bash to treat it as if it came from a normal Linux terminal, without doing any kind of escaping to it. – M.A.G Apr 24 '11 at 01:42
  • Please put such code into your question, for layout reasons, and just note in a comment that your question was updated. – user unknown Apr 24 '11 at 02:20
0

Try this:

urls="http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3"
curl -OLJg  "${urls}" | while read results
do
 ...
done
kurumi
0

The use of a variable called urls suggests that there will be more than one URL in there. If so, you might consider BASH arrays. Also, the echo message "GOT TRIED OF TRYING" suggests that the curl might fail. If so, you might consider checking for the error more explicitly.

Check out and try running the following:

    set -x
    list_of_urls=('http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3')

    for url in "${list_of_urls[@]}"
    do
        curl -s -OLJg  "${url}"
        if [ $? -gt 0 ]; then
            echo "$url is a PROBLEM! (return code: $?)"
        fi
    done

One thing I noticed when running this: the server "succeeds" (in other words, $? is equal to 0), but curl returns a file called error.html. This could be another error condition that you trap for. Good luck!

rickumali
  • you can't use `$?` twice like that: the 2nd one will have value 0, the return value for `[ $? -gt 0]`. You have to save `$?` in a variable if you want to use it more than once. – glenn jackman Apr 10 '11 at 23:11
  • @M.A.G: I saw the update to your question. You're passing in all the URLs that you want CURL to process into one CURL command. I'm afraid you'll have to assemble the command line to include the "-OLJg" before each URL. I think this way is "not ideal", and using arrays would be the better approach. Try the script that 'user unknown' or I supplied (but with the warning glenn jackman provided). Good luck! – rickumali Apr 11 '11 at 00:48
0

Doesn't curl -K help you? (You can put the URLs in a file exactly as you see them in the navigation bar.) http://curl.haxx.se/docs/manpage.html
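
For reference, a minimal sketch of what such a config file might look like for the two URLs from the question (the file name urls.txt is only an assumption); each line holds one option, and quoting a URL keeps its embedded spaces intact:

    # urls.txt -- run as: curl -K urls.txt
    url = "http://bellatrix.oron.com/jrmxp36wf36yew4veg4fp53kmwlogjeecmopy3n2ja5yqkyzekxwqx2pckq6dtd5hb7duvgk/An-Beat - Mentally Insine (Original Mix).mp3"
    -O
    url = "http://webprod15.megashares.com/index.php?d01=3109985&lccdl=9e8e091ef33dd103&d01go=1"
    -O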

Adrian Sevcenco
  • Thanks for your help. I tried to read the man page but I didn't understand how to use curl -K. I searched on the internet but found nothing. Could you please explain how I could use it in my situation? – M.A.G Apr 26 '11 at 02:31
  • As far as I understood, you want to download some URLs. Make a file, let's say urls, with the content: url = link/1.txt -O url = link/2.txt -O url = link/3.txt -O url = link/4.txt -O. Using curl -K urls will download all the files with their names; alternatively you can use wget -i urls_file, where the urls file can be a simple list of URLs without any other specification. – Adrian Sevcenco Apr 26 '11 at 19:08
  • Crappy editor of this site!!! On the first line is url = link and on the second line is -O – Adrian Sevcenco Apr 26 '11 at 19:09
  • That -O can just as well be your -OLJg – Adrian Sevcenco Apr 26 '11 at 19:10