
I am not sure whether this is Xamarin-specific or a native problem, too.

I am creating my ViewRenderer, and in OnElementChanged I create my UIImageView:

        base.OnElementChanged(e);

        Foundation.NSError error;
        var session = AVFoundation.AVAudioSession.SharedInstance();
        session.SetCategory(AVFoundation.AVAudioSession.CategoryPlayAndRecord, out error);

        if (error != null)
        {
            ClientLogger.Instance.Log("Error in MediaViewRenderer creating AV session, error code: " + error.Code, ClientLogger.LogLevel.Error);
        }

        //_control = e.NewElement as CustomMediaView;
        UIKit.UIImageView surface = new UIKit.UIImageView();
        if (surface != null)
        {
            this.SetNativeControl(surface);
        }

I create my video layer if it is null, and set its bounds and frame each time I render:

            if (_surface != null)
            {
                if (_videoLayer == null && IsRunning)
                {
                    _videoLayer = new AVSampleBufferDisplayLayer();
                    _videoLayer.VideoGravity = AVLayerVideoGravity.ResizeAspect.ToString();

                    _timeBase = new CMTimebase(CMClock.HostTimeClock);

                    _videoLayer.ControlTimebase = _timeBase;
                    _videoLayer.ControlTimebase.Time = CMTime.Zero;
                    _videoLayer.ControlTimebase.Rate = 1.0;

                    _surface.Layer.AddSublayer(_videoLayer);
                }

                if (_videoLayer != null)
                {
                    //if (_videoLayer.VisibleRect == null || _videoLayer.VisibleRect.Height == 0 || _videoLayer.VisibleRect.Width == 0)
                    //    ClientLogger.Instance.Log("Error iOS H264Decoder rect", ClientLogger.LogLevel.Error);
                    _videoLayer.Frame = _surface.Frame;
                    _videoLayer.Bounds = _surface.Bounds;
                }
            }

I receive my RTP stream, then decode and display my video as described here:

How to use VideoToolbox to decompress H.264 video stream

If I want to stop the video, I set the video layer to null, and later the surface too.

            _videoLayer.Flush();
            _videoLayer.Dispose();
            _videoLayer = null;

            _surface.Dispose();
            _surface = null;

That works great and gives me a nice H.264 video for around 15 runs.

After that, only a blank background is shown and no video is visible. The decoder works fine and seems to render; surface and video layer are not null.

There seems to be no memory leak, or at least not one large enough to be a problem.

This happens on both iOS 9 and 10. I think there is something wrong with the video layer?

Any idea why it works only around 15 times?

Thanks a lot for some help or ideas!

user739611

1 Answer


Since you don't provide all of the code for your "stop the video" process, I'm going to assume that you are not calling the removeFromSuperlayer method on your video layer and not calling the removeFromSuperview method on your surface?

This will result in those objects still being present in the view hierarchy and the layer tree, and very likely still holding onto lower-level VT resources. You need to remove all references to those objects by removing them from the view hierarchy and layer tree.
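As a minimal sketch of that teardown in Xamarin.iOS (the bindings name these methods `RemoveFromSuperLayer` and `RemoveFromSuperview`; the `_videoLayer` and `_surface` fields are taken from the question, so treat this as an assumption about your stop routine):

```csharp
// Sketch only: detach the layer and the view from the layer tree / view
// hierarchy before disposing, so they stop holding lower-level resources.
if (_videoLayer != null)
{
    _videoLayer.Flush();
    _videoLayer.RemoveFromSuperLayer(); // detach from _surface.Layer
    _videoLayer.Dispose();
    _videoLayer = null;
}

if (_surface != null)
{
    _surface.RemoveFromSuperview();     // detach from the view hierarchy
    _surface.Dispose();
    _surface = null;
}
```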

Chris Edgington
  • Welcome to Stackoverflow! This does not provide an answer to the question. Once you have sufficient reputation, you will be able to [comment](http://stackoverflow.com/help/privileges/comment) on any post. If you could pinpoint the reason why the asker does not use `removeFromSuperlayer()` or `removeFromSuperview()` properly, your answer would become interesting. – francis Jan 30 '17 at 20:23