For some reason the output always gets printed to the terminal, regardless of whether I redirect it via 2> or > or |. Is there a way to get around this? Why is this happening?
9 Answers
Add the `-s` (silent) option to remove the progress meter, then redirect stderr to stdout so the verbose output lands on the same fd as the response body:
curl -vs google.com 2>&1 | less
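The effect of `2>&1` can be sketched without any network access; here a small function stands in for `curl -v` (the function name and messages are invented for illustration):

```shell
# fake_curl stands in for `curl -v`: response body on stdout,
# verbose connection info on stderr
fake_curl() {
  echo "response body"
  echo "* Connected to host" >&2
}

# without 2>&1 the pipe only sees stdout, so grep finds nothing
fake_curl 2>/dev/null | grep "Connected" || echo "stderr bypassed the pipe"

# with 2>&1 the verbose line travels through the pipe as well
fake_curl 2>&1 | grep "Connected"
```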

- This works for most websites, but for some reason the local server on my machine still prints the full output, even if I do ` 2>&1 | grep asdfasdfasdfasdfdfs` or some such thing. The full output including headers is still displayed on the console. Is there some other stream that I can pipe into grep to extract some data that I need? – jonderry Mar 25 '11 at 03:51
- What information are you actually trying to extract, and what information do you want to throw away? I understood your question to mean that you want all of the output of -v directed to stdout. – SingleNegationElimination Mar 25 '11 at 13:44
- I want to process some of the cookies (basically grep some info from the cookies and do some other stuff). Yes, I want everything to go to stdout, so I can process whatever I want via pipes. Currently some of the output just displays on the console and seems impossible to redirect, and I'm not sure why. – jonderry Mar 25 '11 at 17:39
- Can you post a screenshot of the output appearing on screen that you wish to capture? I don't know what kind of output you could possibly be seeing that could be missed by `2>&1`. – SingleNegationElimination Mar 26 '11 at 03:40
- It's just the same type of output as with any other website. The only difference is that the server is running locally. Is there some way for a program to print to the console without that text being captured by stdout/stderr? – jonderry Mar 26 '11 at 03:55
- Could it be that the output you are seeing is not being generated by curl, but rather by a different process (say, your locally running web server)? – SingleNegationElimination Mar 26 '11 at 15:57
- Probably not. The web server is being run from Eclipse. Also, I would expect something to be captured by 2>&1, but it is not. – jonderry Mar 26 '11 at 17:39
- It's too many bytes to put here (and formatting multi-line sucks), but I'm getting the behaviour @jonderry mentions; the cut-down version looks like this: `$ curl -sL http://localhost:7004/api/v1/ 2>&1 | grep -q "…" && echo OK || echo FAIL`, which prints `[1] 23265` followed by `FAIL`. The response doesn't go through grep, yet it definitely matches what I'm looking for. It kind of looks like it's spawning a child process for some reason. Can't imagine why. – Vala Dec 07 '17 at 12:12
- @Thor84no if you are not using the `-v` option to `curl`, then you also don't want/need `2>&1`; if that doesn't help, please post a new question https://stackoverflow.com/questions/ask – SingleNegationElimination Dec 08 '17 at 18:03
- You can also use the [`--stderr -`](https://unix.stackexchange.com/a/210294/209677) curl parameter to redirect stderr to stdout. – Pablo Bianchi Mar 21 '18 at 20:30
Your URL probably has ampersands in it. I had this problem, too, and I realized that my URL was full of ampersands (from CGI variables being passed), so everything after the first `&` was being sent to the background in a weird way and thus not redirecting properly. If you put quotes around the URL it will fix it.
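The splitting can be reproduced with `echo` instead of a live request (the URL below is a made-up illustration):

```shell
# Unquoted, the shell treats '&' as a control operator: everything before it
# is backgrounded, and 'b=2' becomes a separate (no-op) variable assignment.
bash -c 'echo http://example.com/page?a=1&b=2'    # prints only ...?a=1

# Quoted, the whole URL reaches the command as a single argument.
bash -c 'echo "http://example.com/page?a=1&b=2"'  # prints the full URL
```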

- I had the same problem. No need for the 2>&1, so I can keep the output and connection log separate. Thanks roadnottaken. – quornian Oct 07 '12 at 00:49
- The quotes did the trick for me. I felt like curl was executing in other threads. Thanks a lot! – vdolez Mar 25 '15 at 10:55
- Had to search the web for five minutes before you saved my evening :) – Shautieh Jul 09 '16 at 14:30
- Jesus, what a terrible bug in curl; at the very least it should fail or give a warning. Your 2012 answer helped me in 2018. Took me 30 mins to solve this until I came across your answer. Thank you! – Mauvis Ledford Jun 08 '18 at 18:57
- @MauvisLedford I don't think cURL can even know that it's being backgrounded; it's the shell that's doing it. In a Bash shell, run `python -c "import sys;print(sys.argv)" somearg=ls` and you'll see Python (easy way to see the argv the process gets) print out `['-c', 'somearg=ls']`. Replace the `=` with a `&` and the process will never see the `&` because Bash split the commands up. – Nick T Aug 21 '19 at 18:11
- @MauvisLedford Not a curl bug, but the syntax of your shell. Use quotes if you are unsure. – qwr Dec 27 '19 at 01:18
The answer above didn't work for me, what did eventually was this syntax:
curl https://${URL} &> /dev/stdout | tee -a ${LOG}
tee puts the output on the screen, but also appends it to my log.

If you need the output in a file you can use a redirect:
curl https://vi.stackexchange.com/ -vs >curl-output.txt 2>&1
Be careful not to flip `>curl-output.txt` and `2>&1`: redirections are processed left to right, so `2>&1 >curl-output.txt` points stderr at the terminal before stdout is moved to the file, and the verbose output never reaches it.
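The ordering can be sketched with `echo` standing in for curl (file names here are invented):

```shell
# 2>&1 after the file redirect: stderr follows stdout into the file
{ echo "body"; echo "verbose" >&2; } >both.txt 2>&1

# 2>&1 before the file redirect: stderr was duplicated onto the terminal
# *before* stdout moved to the file, so "verbose" never reaches it
{ echo "body"; echo "verbose" >&2; } 2>&1 >only-body.txt

cat both.txt        # body + verbose
cat only-body.txt   # body only
```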

Just my 2 cents. The below command should do the trick, as answered earlier
curl -vs google.com 2>&1
However, if you need to get the output to a file,
curl -vs google.com > out.txt 2>&1
should work.

- This adds no real value to the accepted answer of 2011. If anything, this could be a comment to that answer. – trincot Aug 18 '16 at 18:29
I found the same thing: curl by itself would print to STDOUT, but could not be piped into another program.
At first, I thought I had solved it by using xargs to echo the output first:
curl -s ... <url> | xargs -0 echo | ...
But then, as pointed out in the comments, it also works without the xargs part, so -s
(silent mode) is the key to preventing extraneous progress output to STDOUT:
curl -s ... <url> | perl -ne 'print $1 if /<sometag>([^<]+)/'
The above example grabs the simple <sometag>
content (containing no embedded tags) from the XML output of the curl statement.

- In your examples the `xargs -0 echo |` is unnecessary. As long as you have `curl -s` you can pipe the output to another program. – Ryan Horrisberger Apr 29 '13 at 20:40
The following worked for me:
Put your curl statement in a script named abc.sh
Now run:
sh abc.sh 1>stdout_output 2>stderr_output
You will get your curl results in stdout_output and the progress info in stderr_output.
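A self-contained sketch of the same idea, with a stand-in script instead of a real curl call (file names as in the answer, script contents invented):

```shell
# abc.sh stands in for a script wrapping curl: result on stdout, progress on stderr
cat > abc.sh <<'EOF'
echo "curl result"
echo "progress info" >&2
EOF

sh abc.sh 1>stdout_output 2>stderr_output
cat stdout_output   # curl result
cat stderr_output   # progress info
```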

This simple example shows how to capture curl output and use it in a bash script
test.sh
function main
{
\curl -vs 'http://google.com' 2>&1
# note: add -o /tmp/ignore.png if you want to ignore binary output, by saving it to a file.
}
# capture output of curl to a variable
OUT=$(main)
# search output for something using grep.
echo
echo "$OUT" | grep 302
echo
echo "$OUT" | grep title

Solution = curl -vs google.com 2>&1 | less
BUT, if you redirect the output to a file and some of it still appears on the screen, the URL response may contain a newline character \n that confused your shell.
To avoid this, put everything in a variable:
result=$(curl -v . . . . )
