I am using GStreamer (version 1.14.1) on Android and already have an implementation (in C++ code) that works on Windows and Linux.
I followed the Android tutorial from the GStreamer website to build my own gstreamer-android.so library to use in Android Studio. In my C++ code I added the GST_PLUGIN_STATIC_DECLARE and GST_PLUGIN_STATIC_REGISTER macros to register the static plugins.
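The registration itself looks roughly like this (trimmed down; the plugin names below are only a few examples from my real list):

#include <gst/gst.h>

// Declarations for the statically linked plugins ...
GST_PLUGIN_STATIC_DECLARE(coreelements);
GST_PLUGIN_STATIC_DECLARE(app);
GST_PLUGIN_STATIC_DECLARE(openh264);

// ... and their registration, called once after gst_init().
static void register_static_plugins()
{
    GST_PLUGIN_STATIC_REGISTER(coreelements);
    GST_PLUGIN_STATIC_REGISTER(app);
    GST_PLUGIN_STATIC_REGISTER(openh264);
}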
At runtime all GstElements can be created successfully. (I am using gst_element_factory_make() calls to do that.)
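For reference, the elements are created roughly like this (simplified; error handling omitted, the factory names correspond to the member variables):

m_pipeline      = gst_pipeline_new("pipeline");
m_decoder       = gst_element_factory_make("decodebin",    "decoder");
m_video_flip    = gst_element_factory_make("videoflip",    "flip");
m_queue         = gst_element_factory_make("queue",        "queue");
m_video_convert = gst_element_factory_make("videoconvert", "convert");
m_app_sink      = gst_element_factory_make("appsink",      "sink");
// 'source' is created separately depending on the input type (see below).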
I build the pipeline like this:
// Caps for the appsink and its signal setup
GstCaps* video_caps = gst_caps_new_full(
    gst_structure_new("video/x-raw",  "format", G_TYPE_STRING, "RGBA", NULL),
    gst_structure_new("video/x-h264", "format", G_TYPE_STRING, "RGBA", NULL),
    NULL);
g_object_set(m_app_sink, "emit-signals", TRUE, "caps", video_caps, NULL);
g_signal_connect(m_app_sink, "new-sample", G_CALLBACK(cb_new_sample), this);

// Add all elements, then link the two static parts of the pipeline;
// decodebin's source pad is linked later in cb_pad_added.
gst_bin_add_many(GST_BIN(m_pipeline), source, m_decoder, m_video_flip, m_queue, m_video_convert, m_app_sink, NULL);
if (!gst_element_link_many(source, m_decoder, NULL)) {
    gst_object_unref(m_pipeline);
    return false;
}
if (!gst_element_link_many(m_video_flip, m_queue, m_video_convert, m_app_sink, NULL)) {
    gst_object_unref(m_pipeline);
    return false;
}
g_signal_connect(m_decoder, "pad-added", G_CALLBACK(cb_pad_added), this);
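The new-sample callback just pulls the sample out of the appsink; trimmed down, it does roughly this (the actual frame handling is left out):

#include <gst/app/gstappsink.h>

static GstFlowReturn cb_new_sample(GstElement* sink, gpointer user_data)
{
    GstSample* sample = gst_app_sink_pull_sample(GST_APP_SINK(sink));
    if (!sample)
        return GST_FLOW_ERROR;

    GstBuffer* buffer = gst_sample_get_buffer(sample);
    GstMapInfo map;
    if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
        // ... hand map.data / map.size (one RGBA frame) to the application ...
        gst_buffer_unmap(buffer, &map);
    }
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}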
Later in the code I start the pipeline with gst_element_set_state(m_pipeline, GST_STATE_PLAYING);
I am doing this with three different elements as source: filesrc, udpsrc, and videotestsrc.
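The source itself is set up per case, roughly like this (the path, the port, and the use_file/use_udp flags are just placeholders for illustration):

GstElement* source = NULL;
if (use_file) {
    source = gst_element_factory_make("filesrc", "source");
    g_object_set(source, "location", "/sdcard/Movies/sample.mp4", NULL);
} else if (use_udp) {
    source = gst_element_factory_make("udpsrc", "source");
    g_object_set(source, "port", 5000, NULL);
} else {
    source = gst_element_factory_make("videotestsrc", "source");
}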
When I switch the pipeline into the playing state, I expect the pad-added callback from the decodebin to be triggered, and there I do the linking of the pad. All of this already works on Windows and Linux for all three sources.
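The handler is essentially this (simplified; no caps checking shown, and MyPlayer stands in for my actual class):

static void cb_pad_added(GstElement* decodebin, GstPad* new_pad, gpointer user_data)
{
    MyPlayer* self = static_cast<MyPlayer*>(user_data);  // the 'this' passed to g_signal_connect()

    GstPad* sink_pad = gst_element_get_static_pad(self->m_video_flip, "sink");
    if (!gst_pad_is_linked(sink_pad)) {
        if (GST_PAD_LINK_FAILED(gst_pad_link(new_pad, sink_pad)))
            g_printerr("Failed to link the decodebin pad\n");
    }
    gst_object_unref(sink_pad);
}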
On Android, the callback only gets triggered for videotestsrc, but not if I use one of the other sources.
So what am I missing? Why does it work for videotestsrc but not with files and UDP streams? (Permissions in the Android app are set correctly and the static plugins are registered.)
Does anyone know of a sample implementation on Android where the pipeline is built manually and with callbacks? (So far, I could only find examples with playbin, which I cannot use because I need to grab frames from the video stream.)
I am really out of ideas as to why I don't get this callback. Any help or suggestion is much appreciated. Thanks!
Update:
After creating my own debug function (following this post), I can see that there is a problem with the decoder plugin for H.264:
2019-07-30 09:56:29.034 27551-27611/at.myapp.player E/GStreamer.cpp:: gstdecodebin2.c,gst_decode_bin_expose: error: no suitable plugins found:
Missing decoder: H.264 (Constrained Baseline Profile) (video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)constrained-baseline, codec_data=(buffer)0142c028ffe1001c6742c028db01e0089f97016a020202800000030080015f90078c197001000568ca8132c8, width=(int)1920, height=(int)1080, framerate=(fraction)25/1, pixel-aspect-ratio=(fraction)1/1)
But I don't know how to solve it yet. I thought that by adding GST_PLUGIN_STATIC_REGISTER(openh264); the decoder should be there, shouldn't it?
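For completeness, the debug hook mentioned above looks roughly like this (simplified; the level handling in my real code is a bit more elaborate):

#include <gst/gst.h>
#include <android/log.h>

// Forwards GStreamer debug output to logcat; installed once after gst_init()
// with gst_debug_add_log_function(android_log_func, NULL, NULL);
static void android_log_func(GstDebugCategory* category, GstDebugLevel level,
                             const gchar* file, const gchar* function, gint line,
                             GObject* object, GstDebugMessage* message, gpointer user_data)
{
    if (level > gst_debug_category_get_threshold(category))
        return;
    __android_log_print(ANDROID_LOG_ERROR, "GStreamer.cpp", "%s,%s: %s",
                        file, function, gst_debug_message_get(message));
}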