
My Logitech C920 webcam provides a video stream encoded in h264. I'm using this "capture" tool to access the data.

So I can view live video:

/usr/local/bin/capture -d /dev/video0 -c 100000 -o | \
  gst-launch-1.0 -e filesrc location=/dev/fd/0 \
                    ! h264parse \
                    ! decodebin\
                    ! xvimagesink sync=false

...or record the stream to an MP4 file:

/usr/local/bin/capture -d /dev/video0 -c 100000 -o | \
  gst-launch-1.0 -e filesrc location=/dev/fd/0 \
                    ! h264parse \
                    ! mp4mux \
                    ! filesink location=/tmp/video.mp4

...but I can't for the life of me figure out how to do both at the same time. Having a live feed on screen while recording can be useful sometimes, so I'd like to make this work. I've spent hours looking for a way to display and record simultaneously, but no luck. No amount of messing around with tee and queue elements is helping.

It would be a bonus to get ALSA audio (hw:2,0) into this as well, though I could work around that in an ugly, hacky way if needed. For now, I get the errors below even though hw:2,0 is a valid input in Audacity or arecord, for example:

Recording open error on device 'hw:2,0': No such file or directory
Recording open error on device 'plughw:2,0': No such file or directory

So to recap: would love to put those two video bits together, bonus if audio would work too. I feel like such a newbie.

Thanks in advance for any help you can provide.

Edit: non-working code:

/usr/local/bin/capture -d /dev/video1 -c 100000 -o | \
     gst-launch-1.0 -e filesrc location=/dev/fd/0 ! tee name=myvid ! h264parse ! decodebin \
     ! xvimagesink sync=false myvid. ! queue ! mux. alsasrc device=plughw:2,0 ! \
     audio/x-raw,rate=44100,channels=1,depth=24 ! audioconvert ! queue ! mux. mp4mux \
     name=mux ! filesink location=/tmp/out.mp4 

...leads to this:

WARNING: erroneous pipeline: could not link queue1 to mux 

Edit: Tried umlaeute's suggestion, but got a nearly empty video file and a single frozen frame of live video. With or without audio made no difference, after fixing two small errors in the audio-enabled code (a stray double quotation mark, and the audio not being encoded to anything MP4-compatible; adding avenc_aac after audioconvert fixed that). Error output:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstAudioSrcClock
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstMP4Mux:mux: Could not multiplex stream.
Additional debug info:
gstqtmux.c(2530): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstMP4Mux:mux:
DTS method failed to re-order timestamps.
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2809): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming task paused, reason error (-5)
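(For the record, the audio branch after the avenc_aac fix mentioned above looked roughly like this; a fragment, not a complete pipeline, and mux.audio_0 assumes a muxer named "mux" as in the suggestion. The DTS error persisted regardless:)

```shell
# audio branch with avenc_aac inserted after audioconvert,
# so the muxer receives AAC rather than raw audio
alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1,depth=24" \
    ! audioconvert ! avenc_aac ! queue ! mux.audio_0
```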

EDIT: Okay, umlaeute's corrected code works perfectly, but only if I use v4l2src instead of the capture tool. And for now, that means grabbing the MJPEG stream rather than the H264 one. No skin off my nose, though I guess I'd prefer a more modern workflow. So anyway, here's what actually works, outputting an MJPEG video file and a real-time "viewfinder". Not perfectly elegant, but very workable. Thanks for all your help!

gst-launch-1.0 -e v4l2src device=/dev/video1 ! videorate ! 'image/jpeg, width=1280, height=720, framerate=24/1' ! tee name=myvid \
      ! queue ! decodebin ! xvimagesink sync=false \
      myvid. ! queue ! mux.video_0 \
      alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1,depth=24" ! audioconvert ! lamemp3enc ! queue ! mux.audio_0 \
      avimux name=mux ! filesink location=/tmp/out.avi
    `tee` should work fine, please post a non-working pipeline – umläute Apr 16 '13 at 18:12
  • Right, each of the two parts works fine on its own. What I can't do is make them work together. Non-working example (my best shot) added above. – Anders Bylund Apr 16 '13 at 19:10
  • out of curiosity, why are you not using v4l2src instead of the capture tool? – ensonic Apr 17 '13 at 15:43
  • Great question. It's because I got the camera, went looking for Linux software that could handle it, and found the capture thing before learning about gstreamer. And I still don't know how to grab the h264-encoded data from the cam in gstreamer without that tool. Total newbie in gstreamer here, learning as I go. – Anders Bylund Apr 18 '13 at 10:18
  • Also, recent posts here at stackoverflow indicate that "capture" is still the way to go: http://stackoverflow.com/questions/15787967/capturing-h-264-stream-from-camera-with-gstreamer – Anders Bylund Apr 18 '13 at 14:51

1 Answer


gstreamer is often a bit dumb when it comes to automatically combining multiple different streams (e.g. using mp4mux). in this case you should usually send a stream not only to a named element, but to a specific pad (using the elementname.padname notation; the element. notation is really just a shorthand for "any" pad in the named element).
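for illustration, a self-contained pipeline using explicit pad names (test sources instead of your capture setup, obviously, and assuming x264enc and avenc_aac are available):

```shell
# both branches request specific pads on the muxer (mux.video_0 / mux.audio_0)
# instead of letting gst-launch pick "any" pad
gst-launch-1.0 -e \
    videotestsrc num-buffers=100 ! x264enc ! queue ! mux.video_0 \
    audiotestsrc num-buffers=100 ! avenc_aac ! queue ! mux.audio_0 \
    mp4mux name=mux ! filesink location=/tmp/padtest.mp4
```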

also, it seems that you forgot the h264parse for the mp4muxer (if you look at the path the video takes, it really boils down to filesrc ! queue ! mp4mux which is probably a bit rough).

while i cannot test the pipeline, i guess something like the following should do the trick:

 /usr/local/bin/capture -d /dev/video1 -c 100000 -o | \
   gst-launch-1.0 -e filesrc location=/dev/fd/0 ! h264parse ! tee name=myvid \
     ! queue ! decodebin ! xvimagesink sync=false  \
     myvid. ! queue  ! mp4mux ! filesink location=/tmp/out.mp4

with audio it's probably more complicated, try something like this (obviously assuming that you can read audio using the alsasrc device="plughw:2,0" element)

 /usr/local/bin/capture -d /dev/video1 -c 100000 -o | \
   gst-launch-1.0 -e filesrc location=/dev/fd/0 ! h264parse ! tee name=myvid \
     ! queue ! decodebin ! xvimagesink sync=false  \
     myvid. ! queue ! mux.video_0 \
     alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1,depth=24"" ! audioconvert ! queue ! mux.audio_0 \
     mp4mux name=mux ! filesink location=/tmp/out.mp4
  • Very promising, thanks for the suggestion. It doesn't quite work yet but maybe there's light at the end of the tunnel. The new code still errors out, producing a 48-byte file with rudimentary mp4 headers and one frame of live video. Relevant error output added in the last edit above. – Anders Bylund Apr 17 '13 at 11:05