I have a WPF app where I need to handle a DHAV (.dav) video stream at runtime from a Digital Video Recorder (DVR). I'm using an SDK that can be found here: Dahua SDK search

SDK: General_NetSDK_Eng_Win64_IS_V3.052.0000002.0.R.201103

I need to handle every single frame from the video stream, convert it to a BitmapImage and then display it in a WPF Image control. Something like: MJPEG Decoder

The problem is that I can't find any documentation on how to handle that data, and the samples from the SDK don't show it either. Instead, they are built with WinForms and simply pass the window handle of a PictureBox control to an exported DLL function, which 'magically' shows the video stream:

    [DllImport(LIBRARYNETSDK)]
    public static extern IntPtr CLIENT_RealPlayEx(IntPtr lLoginID, int nChannelID, IntPtr hWnd, EM_RealPlayType rType);

Note: the 'hWnd' parameter is the window handle to display the video in.

The problem with this approach is that I don't have any control over the video stream.
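For completeness, this is roughly how the stream is started and the data callback is registered in my code. Treat it as a sketch: the `CLIENT_SetRealDataCallBackEx` signature is copied from my SDK version's C# wrapper and may differ in other versions, and the `EM_RealPlayType.Realplay` value and the `0xF` flag (which data types the SDK should deliver) are what my wrapper uses.

```csharp
// Delegate matching the SDK's real-data callback. The exact signature comes
// from my SDK version's wrapper — check your NetSDK header, it may differ.
[UnmanagedFunctionPointer(CallingConvention.StdCall)]
public delegate void fRealDataCallBackEx(IntPtr lRealHandle, uint dwDataType,
    IntPtr pBuffer, uint dwBufSize, IntPtr param, IntPtr dwUser);

[DllImport(LIBRARYNETSDK)]
public static extern bool CLIENT_SetRealDataCallBackEx(IntPtr lRealHandle,
    fRealDataCallBackEx cbRealData, IntPtr dwUser, uint dwFlag);

// Keep a reference to the delegate so the GC doesn't collect it while
// native code still holds the function pointer.
private fRealDataCallBackEx _callback;

private void StartStream(IntPtr loginId, int channel, IntPtr hWnd)
{
    IntPtr realHandle = CLIENT_RealPlayEx(loginId, channel, hWnd, EM_RealPlayType.Realplay);
    _callback = RealDataCallback;
    // 0xF: bitmask selecting which data types (0-3 below) to receive, per my SDK docs.
    CLIENT_SetRealDataCallBackEx(realHandle, _callback, IntPtr.Zero, 0xF);
}
```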

I have tried many FFmpeg wrappers for .NET, but they can only parse the data if I first write it to disk; only then can I convert it to some type I can handle.
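One workaround I'm considering to avoid the disk round-trip: pipe the raw .dav bytes straight into ffmpeg's stdin and read decoded raw BGRA frames back from stdout. This is a sketch, not tested end-to-end here — it assumes an ffmpeg build that includes the DHAV demuxer (recent versions do), and the width/height would have to come from the stream itself.

```csharp
using System.Diagnostics;

static class DavPipeline
{
    // Read DHAV from stdin (pipe:0), decode, write raw BGRA frames to stdout (pipe:1).
    public static string BuildFfmpegArgs(int width, int height) =>
        $"-f dhav -i pipe:0 -f rawvideo -pix_fmt bgra -s {width}x{height} pipe:1";

    public static Process StartFfmpeg(int width, int height) =>
        Process.Start(new ProcessStartInfo
        {
            FileName = "ffmpeg",
            Arguments = BuildFfmpegArgs(width, height),
            RedirectStandardInput = true,   // the SDK callback writes .dav bytes here
            RedirectStandardOutput = true,  // decoded frames are read back here
            UseShellExecute = false,
            CreateNoWindow = true,
        });
}
```

The callback would then write each buffer to `StandardInput.BaseStream`, while a reader loop on another thread pulls exactly `width * height * 4` bytes per frame from `StandardOutput.BaseStream`.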

This is the callback function that is called constantly during the application's runtime with the data I need to handle:

    private void RealDataCallback(IntPtr lRealHandle, uint dwDataType, IntPtr pBuffer, uint dwBufSize, IntPtr param, IntPtr dwUser)
    {
        switch (dwDataType)
        {
            case 0: // original data
                break;
            case 1: // frame data
                HandleFrameData(lRealHandle, dwDataType, pBuffer, dwBufSize, param, dwUser);
                break;
            case 2: // yuv data
                break;
            case 3: // pcm audio data
                break;
        }
    }

    private void HandleFrameData(IntPtr lRealHandle, uint dwDataType, IntPtr pBuffer, uint dwBufSize, IntPtr param, IntPtr dwUser)
    {
        // The pBuffer parameter holds data in DHAV (.dav) format
        byte[] buff = new byte[dwBufSize];
        Marshal.Copy(pBuffer, buff, 0, (int)dwBufSize);

        using (var ms = new MemoryStream(buff))
        {
            // TODO: decode the DHAV frame here — this is the part I can't figure out
        }
    }

UPDATE

I'm able to convert the YUV data provided in the callback function to RGB, but that is not the ideal solution. It would be much better (and faster) if I could convert the original (.dav) data directly.
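For reference, the YUV→RGB conversion I'm doing looks roughly like this. It assumes the SDK hands over planar I420/YV12-style data (Y plane followed by quarter-size chroma planes) and uses BT.601 full-range coefficients — both assumptions on my part, since the SDK doesn't document the exact layout.

```csharp
using System;

static class Yuv
{
    // Convert planar I420 (Y plane, then quarter-size U and V planes) to BGRA.
    public static byte[] I420ToBgra(byte[] yuv, int width, int height)
    {
        byte[] bgra = new byte[width * height * 4];
        int ySize = width * height;
        int uvWidth = width / 2;
        int uOffset = ySize;              // U plane starts after Y
        int vOffset = ySize + ySize / 4;  // V plane starts after U

        for (int row = 0; row < height; row++)
        {
            for (int col = 0; col < width; col++)
            {
                int y = yuv[row * width + col];
                int u = yuv[uOffset + (row / 2) * uvWidth + (col / 2)] - 128;
                int v = yuv[vOffset + (row / 2) * uvWidth + (col / 2)] - 128;

                // BT.601 full-range, 16.16 fixed-point approximation.
                int r = y + (91881 * v >> 16);
                int g = y - ((22554 * u + 46802 * v) >> 16);
                int b = y + (116130 * u >> 16);

                int i = (row * width + col) * 4;
                bgra[i + 0] = (byte)Math.Clamp(b, 0, 255);
                bgra[i + 1] = (byte)Math.Clamp(g, 0, 255);
                bgra[i + 2] = (byte)Math.Clamp(r, 0, 255);
                bgra[i + 3] = 255; // opaque alpha
            }
        }
        return bgra;
    }
}
```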

The RealDataCallback, in fact, only returns one frame per callback, but I don't know how to convert that frame to a Bitmap. Any help would be appreciated.
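Once I do have a BGRA pixel buffer for a frame, getting it into the WPF Image control is the easy part — something along these lines (a sketch; `Bgra32` at 96 DPI is assumed):

```csharp
using System.Windows.Media;
using System.Windows.Media.Imaging;

static class FrameDisplay
{
    // Stride = bytes per row; Bgra32 is 4 bytes per pixel.
    public static int BgraStride(int width) => width * 4;

    public static BitmapSource ToBitmapSource(byte[] bgra, int width, int height) =>
        BitmapSource.Create(width, height, 96, 96,
            PixelFormats.Bgra32, null, bgra, BgraStride(width));
}

// In the SDK callback thread: create and Freeze the bitmap there, then hand
// the frozen (thread-safe) bitmap to the UI thread:
//
//     var bmp = FrameDisplay.ToBitmapSource(pixels, w, h);
//     bmp.Freeze();
//     Application.Current.Dispatcher.BeginInvoke((Action)(() => imageControl.Source = bmp));
```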

Mateus Henrique
  • You can try to use [YUV](https://en.wikipedia.org/wiki/YUV) data supplied to the callback. Converting it to RGB is quite easy. – Dark Daskin Mar 22 '21 at 05:26
  • Thanks for your suggestion. I was able to convert YUV[] to RGB[] but converting it from RGB[] to Bitmap doesn't seem to be so easy because I'm having some issues in the result of the images... But I'm going to open a separate question for them... – Mateus Henrique Mar 23 '21 at 16:20
  • The best way to mess with video frames in WPF would be with [FFmpeg.Autogen](https://github.com/Ruslan-B/FFmpeg.AutoGen) bindings for FFmpeg. I have already a WPF video player .NET library which includes snapshots that might be a quick solution for you [Flyleaf](https://github.com/SuRGeoNix/Flyleaf) – SuRGeoNix May 05 '21 at 08:30
  • Maybe that can help https://www.programmersought.com/article/39157248216/ – Simon Mourier May 06 '21 at 06:49
  • Two suggestions: 1) don't allocate your buffer every frame, you'll bring the garbage collector to its knees, and 2) use a YUV->RGB pixel shader to do the conversion, it'll literally be thousands of times faster on even modest hardware. – Mark Feldman May 07 '21 at 06:01
  • @MarkFeldman That was just an example. Like I said, I'm already doing this conversion of YUV > RGB. One problem (which I didn't mention in the post), is that the SDK only provides the YUV data if I pass the display window handle as a parameter to the function that starts the stream. That's why I wanted to handle the original data instead of the YUV. So I'll probably have to convert the .DAV > YUV somehow. – Mateus Henrique May 07 '21 at 13:30
  • Ah, ok then. I take it you've already looked at the [ffmpeg decoder code](https://ffmpeg.org/doxygen/trunk/dhav_8c_source.html)? – Mark Feldman May 07 '21 at 22:38
  • Not sure if I correctly understand your problem, but FFMPEG is a powerful tool. So maybe it will be better to use it as a usual system command call? FFMPEG can read a stream from input https://stackoverflow.com/questions/45899585/pipe-input-in-to-ffmpeg-stdin So you can pipe `curl` output, for example. And, of course, FFMPEG can convert input video to frame images https://stackoverflow.com/questions/10957412/fastest-way-to-extract-frames-using-ffmpeg – rzlvmp May 11 '21 at 11:21

0 Answers