I'm very new to gstreamer and Rust and am trying to render a video made from sections of other videos. Based on the docs, the gstreamer-rs examples, and this question about doing the same thing in Python, I think my code looks pretty good, but it throws errors.

This is my code:

use gstreamer as gst;
use gstreamer::{ElementExt, ElementExtManual, GstObjectExt};
use gstreamer_editing_services as ges;
use gstreamer_editing_services::{GESPipelineExt, LayerExt, TimelineExt};
use gstreamer_pbutils as gst_pbutils;
use gstreamer_pbutils::{EncodingProfileBuilder};

pub fn clip_video() {
    match gst::init() {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }
    match ges::init() {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    let timeline = ges::Timeline::new_audio_video();
    let layer = timeline.append_layer();

    let pipeline = ges::Pipeline::new();
    match pipeline.set_timeline(&timeline) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    let video_profile = gst_pbutils::EncodingVideoProfileBuilder::new()
        .name("h.264")
        .description("h.264-profile")
        .format(&gst::caps::Caps::new_simple("video/x-h264", &[]))
        .build()
        .unwrap();

    let audio_profile = gst_pbutils::EncodingAudioProfileBuilder::new()
        .name("mp3")
        .description("mp3-profile")
        .format(&gst::caps::Caps::new_simple(
            "audio/mpeg",
            &[("mpegversion", &"1"), ("layer", &"3")],
        ))
        .build()
        .unwrap();

    let container_profile = gst_pbutils::EncodingContainerProfileBuilder::new()
        .name("default-mp4-profile")
        .description("mp4-with-h.264-mp3")
        .format(&gst::caps::Caps::new_simple(
            "video/quicktime",
            &[("variant", &"iso")],
        ))
        .enabled(true)
        .add_profile(&video_profile)
        .add_profile(&audio_profile)
        .build()
        .unwrap();

    let asset = ges::UriClipAsset::request_sync("file:///home/ryan/repos/auto-highlighter-processing-service/input/test-video.mp4").expect("Failed to create asset");

    match layer.add_asset(
        &asset,
        0 * gst::SECOND,
        10 * gst::SECOND,
        10 * gst::SECOND,
        ges::TrackType::CUSTOM,
    ) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    match pipeline.set_render_settings("file:///home/ryan/repos/auto-highlighter-processing-service/output/test-video.mp4", &container_profile){
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    match pipeline.set_mode(ges::PipelineFlags::RENDER) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    match pipeline.set_state(gst::State::Playing) {
        Err(e) => eprintln!("{:?}", e),
        _ => (),
    }

    let bus = pipeline.get_bus().unwrap();

    for msg in bus.iter_timed(gst::CLOCK_TIME_NONE) {
        use gst::MessageView;

        match msg.view() {
            MessageView::Eos(..) => break,
            MessageView::Error(err) => {
                println!(
                    "Error from {:?}: {} ({:?})",
                    err.get_src().map(|s| s.get_path_string()),
                    err.get_error(),
                    err.get_debug()
                );
                break;
            }
            _ => (),
        }
    }
}

The errors that I am getting:

BoolError { message: "Failed to set render settings", filename: "/home/ryan/.cargo/registry/src/github.com-1ecc6299db9ec823/gstreamer-editing-services-0.16.5/src/auto/pipeline.rs", function: "gstreamer_editing_services::auto::pipeline", line: 228 }

StateChangeError

I'm struggling to find what to do about these errors or what the problem could be. As far as I can tell, I'm using the set_render_settings() and set_mode() functions correctly.

Ryan Callahan

2 Answers


I didn't try running your code, but one problem I noticed while reading is the following:

        .format(&gst::caps::Caps::new_simple(
            "audio/mpeg",
            &[("mpegversion", &"1"), ("layer", &"3")],
        ))

The "mpegversion" and "layer" fields of the caps are not strings but integers. If you pass them as integers it should work (or at least work better):

        .format(&gst::caps::Caps::new_simple(
            "audio/mpeg",
            &[("mpegversion", &1i32), ("layer", &3i32)],
        ))

Everything else looks correct to me.

You can find more details about such errors by making use of the GStreamer debugging system. You can enable it via the GST_DEBUG environment variable, e.g. by setting it to 6.
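A quick sketch of how that looks from a shell (your-app is a placeholder for your binary name):

```shell
# GStreamer reads GST_DEBUG at process start; level 6 (LOG) is very
# verbose, so it is usually scoped to a single run of the application:
#   GST_DEBUG=6 ./your-app                # hypothetical binary name
# Logging can also be limited to specific categories, e.g. only GES:
#   GST_DEBUG=ges:6 ./your-app
# It is a plain environment variable, set with the usual shell syntax:
GST_DEBUG=6 sh -c 'printenv GST_DEBUG'
```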

Sebastian Dröge
    Thank you so much for the help, that was the problem preventing me from getting any output video. Now I'm trying to figure out why the output video is just ten seconds of black screen. BTW, thanks for all your work on the Rust bindings! – Ryan Callahan May 19 '21 at 15:30
    I'd ask that on the GStreamer mailing list. The GES developers are there and can probably immediately tell you :) I don't really know the GES API myself. – Sebastian Dröge May 19 '21 at 19:02

Although this answer is over a year late, I thought I'd post anyway, as Rust examples for GES are sparse, with only a single (though good) example on the gstreamer-rs repo, which applies an 'agingtv' effect. Additionally, the OP's sample code above renders 10 seconds of black video and does not, as the OP mentioned in the comments, produce the desired output.

Using the example listed in the original question above:

On gstreamer-rs 0.19:

  1. Build the video profile (note that the caps must now be passed via builder()):
let video_profile = gstreamer_pbutils::EncodingVideoProfile::builder(
    &gst::Caps::builder("video/x-h264").build(),
)
.name("video_profile")
.build();

  2. Build the audio profile:
let audio_profile = gstreamer_pbutils::EncodingAudioProfile::builder(
    &gstreamer::Caps::new_simple("audio/x-aac", &[]),
)
.name("audio_profile")
.build();

  3. Build the container profile:
let container_profile = gstreamer_pbutils::EncodingContainerProfile::builder(
    &gstreamer::Caps::new_simple("video/x-matroska", &[]),
)
.name("container_profile")
.add_profile(&audio_profile)
.add_profile(&video_profile)
.build();

Note: as an alternative, you can build the whole encoding profile in one go from the DiscovererInfo if you run discovery on the media first. This will result in an output file very similar to the input file in its encoding settings.

let encoding_profile =
    gstreamer_pbutils::EncodingProfile::from_discoverer(&m_info.discover_info)?;
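
As a side note, m_info.discover_info in the snippet above comes from the answer author's own surrounding code, which is not shown. A minimal sketch of obtaining a DiscovererInfo with gstreamer-pbutils (the function name and the 5-second timeout are my own choices) might look like:

```rust
use gstreamer as gst;
use gstreamer_pbutils as gst_pbutils;

// Hypothetical helper: discover a media file and derive an encoding
// profile that mirrors its container and codecs.
fn profile_from_media(
    uri: &str,
) -> Result<gst_pbutils::EncodingProfile, Box<dyn std::error::Error>> {
    gst::init()?;
    // Discover the media's streams, giving up after 5 seconds.
    let discoverer = gst_pbutils::Discoverer::new(gst::ClockTime::from_seconds(5))?;
    let info = discoverer.discover_uri(uri)?;
    // Build an encoding profile matching what was discovered.
    let profile = gst_pbutils::EncodingProfile::from_discoverer(&info)?;
    Ok(profile)
}
```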

The following example will clip the video, add a transition, and merge in an additional clip to fade to:

let timeline = ges::Timeline::new_audio_video();
timeline.set_auto_transition(true);
let layer = timeline.append_layer();

let pipeline = ges::Pipeline::new();
pipeline.set_timeline(&timeline)?;

let audio_profile = gstreamer_pbutils::EncodingAudioProfile::builder(
    &gstreamer::Caps::new_simple("audio/x-aac", &[]),
)
.name("audio_profile")
.build();

let video_profile = gstreamer_pbutils::EncodingVideoProfile::builder(
    &gst::Caps::builder("video/x-h264").build(),
)
.name("video_profile")
.build();

let container_profile = gstreamer_pbutils::EncodingContainerProfile::builder(
    &gstreamer::Caps::new_simple("video/x-matroska", &[]),
)
.name("container_profile")
.add_profile(&audio_profile)
.add_profile(&video_profile)
.build();

/* alternatively 
let encoding_profile = gstreamer_pbutils::EncodingProfile::from_discoverer(&m_info.discover_info)?;
*/

/* original video */
let clip = ges::UriClip::new("file:///home/ryan/repos/auto-highlighter-processing-service/input/test-video.mp4")?;
layer.add_clip(&clip)?;
clip.set_inpoint(gst::ClockTime::from_seconds(0));
clip.set_duration(gst::ClockTime::from_seconds(10));

/* video to transition to with a fade */
let clip_transition_to = ges::UriClip::new("/some/2/second/video/file.mp4")?;
clip_transition_to.set_start(gst::ClockTime::from_seconds(9)); //this should overlap the original video clip, but not completely
clip_transition_to.set_inpoint(gst::ClockTime::from_seconds(0));
clip_transition_to.set_duration(gst::ClockTime::from_seconds(2));
layer.add_clip(&clip_transition_to)?;

pipeline.set_render_settings("file:///home/ryan/repos/auto-highlighter-processing-service/output/test-video.mp4", &container_profile)?; //or &encoding_profile
pipeline.set_mode(ges::PipelineFlags::RENDER)?;

pipeline.set_state(gst::State::Playing)?;
let bus = pipeline
    .bus()
    .expect("Pipeline without bus. Shouldn't happen!");

for msg in bus.iter_timed(gst::ClockTime::NONE) {
    use gst::MessageView;

    match msg.view() {
        MessageView::Eos(..) => break,
        MessageView::Error(err) => {
            pipeline.set_state(gst::State::Null)?;

            match err.details() {
                Some(details) if details.name() == "error-details" => details
                    .get::<&ErrorValue>("error")
                    .unwrap()
                    .clone()
                    .0
                    .lock()
                    .unwrap()
                    .take()
                    .map(Result::Err)
                    .expect("error-details message without actual error"),
                _ => Err({
                    let err_src = msg
                        .src()
                        .map(|s| String::from(s.path_string()))
                        .unwrap_or_else(|| String::from("None"));
                    log!(
                        Level::Error,
                        "A GStreamer Error was Encountered {}",
                        &err_src
                    );
                    ErrorMessage {
                        src: err_src,
                        error: err.error().to_string(),
                        debug: err.debug(),
                        source: err.error(),
                    }
                    .into()
                }),
            }?;
        }
        MessageView::StateChanged(state_changed) => {
            // if state_changed.src().map(|s| s == decodebin).unwrap_or(false)
            //     && state_changed.current() == gst::State::Playing
            // {
            //     // Generate a dot graph of the pipeline to GST_DEBUG_DUMP_DOT_DIR if defined
            //     let bin_ref = decodebin.downcast_ref::<gst::Bin>().unwrap();
            //     bin_ref.debug_to_dot_file(gst::DebugGraphDetails::all(), "PLAYING");
            // }
            let msg = format!(
                "State changed from {:?}: {:?} -> {:?} ({:?})",
                state_changed.src().map(|s| s.path_string()),
                state_changed.old(),
                state_changed.current(),
                state_changed.pending()
            );
            log!(Level::Debug, "{}", msg)
        }
        _ => (),
    }
}
let log_msg = "Play run complete, changing state...";
log!(Level::Info, "{}", &log_msg);
pipeline.set_state(gst::State::Null)?;

The result will be a 10 second video, with a fade out to a 2 second video (e.g. something that says "the end", etc).

It took some reading and research to achieve the desired effect, and hopefully this will help someone else on the same journey.

gstreamer is excellent software and, although it's been a bit of work to get things functional, it's been great to work with the Rust bindings.

Chris Andrew