I'm using DearPyGUI to make a simple media player that can play a video file (mp4, etc.) together with its audio. The prerequisite is that DearPyGUI is a must; however, a built-in video feature will not exist until v2.0, which is still far in the future.

Currently I can only render the frames using the OpenCV library for Python. The problem is: how can I also play the audio, and keep it in sync with the output video frames?

For context, I'm quite new to Python and don't know much about video and audio streaming, but by looking through help posts online I've thought of some approaches to this problem (I still have no idea how to implement any of them seamlessly):

  1. Use OpenCV for the video frames, and some library like ffmpeg-python or miniaudio to play the sound... (How...? I've put a rough, untested sketch of this idea right after the code below.)

  2. Extract the video frames and the audio up front, then play both from the raw data (How...?)

  3. This example here is pretty close to what I want, except for the part that actually plays the video and audio, but I have no idea where to go from there. The video stream and the audio stream are instances of ffmpeg.nodes.FilterableStream, and they appear to be just references to the streams rather than decoded data. (No idea...)

  4. Another very close idea is using ffpyplayer, with which I was able to get the video frames. However, the code below gives the video a blueish-purple color tint, and the frame rate is very slow compared to the original (So close...)

import time
import numpy as np
import cv2 as cv
import dearpygui.dearpygui as dpg
from ffpyplayer.player import MediaPlayer


# Adapted from:
# https://github.com/Kazuhito00/Image-Processing-Node-Editor/blob/main/node_editor/util.py
def cv2dpg(frame):
    # Convert an OpenCV-style frame into the flat, normalized float32 RGB
    # buffer that the DearPyGUI raw texture expects.
    data = cv.resize(frame, (VIDEO_WIDTH, VIDEO_HEIGHT))
    data = np.flip(data, 2)        # swap the channel order (BGR <-> RGB)
    data = data.ravel()            # flatten to a 1-D buffer
    data = np.asarray(data, dtype=np.float32)

    return np.true_divide(data, 255.0)  # scale 0-255 down to 0.0-1.0


# https://stackoverflow.com/questions/59611075/how-would-i-go-about-playing-a-video-stream-with-ffpyplayer
# https://matham.github.io/ffpyplayer/examples.html#examples
def play_video(loaded_file_path):

    global player, is_playing
    player = MediaPlayer(loaded_file_path)

    while is_playing:

        frame, val = player.get_frame()

        if val == 'eof':
            is_playing = False
            break

        elif frame is None:
            # No frame is ready yet; wait a little before polling again
            time.sleep(0.01)

        else:
            img, t = frame
            w, h = img.get_size()
            # Rebuild a numpy image from the raw bytes returned by ffpyplayer
            cv_mat = np.uint8(np.asarray(list(img.to_bytearray()[0])).reshape((h, w, 3)))
            texture_data = cv2dpg(cv_mat)
            dpg.set_value(VIDEO_CANVAS_TAG, texture_data)

    dpg.set_value(VIDEO_CANVAS_TAG, DEFAULT_VIDEO_TEXTURE)
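
For approaches 1 and 2, the rough direction I'm imagining is to have ffmpeg-python decode the audio track to raw PCM and hand that buffer to an audio library, while OpenCV (or ffpyplayer) keeps supplying the frames. Below is a completely untested sketch of that idea; note that I've assumed sounddevice for playback (rather than the miniaudio I mentioned above, since sounddevice accepts a NumPy buffer directly) and hard-coded the sample rate and channel count:

# Untested sketch: decode the audio track with ffmpeg-python and play it with
# sounddevice while the video loop keeps rendering frames. The sample rate and
# channel count below are assumptions, not values read from the file.
import ffmpeg
import numpy as np
import sounddevice as sd


def play_audio(loaded_file_path, sample_rate=44100, channels=2):
    # Ask ffmpeg to decode the audio stream to raw signed 16-bit PCM on stdout
    out, _ = (
        ffmpeg
        .input(loaded_file_path)
        .output('pipe:', format='s16le', acodec='pcm_s16le',
                ac=channels, ar=sample_rate)
        .run(capture_stdout=True, capture_stderr=True)
    )

    # Interpret the raw bytes as interleaved int16 samples
    samples = np.frombuffer(out, dtype=np.int16).reshape(-1, channels)

    # Non-blocking playback, so the video loop can run in parallel
    sd.play(samples, samplerate=sample_rate)

Even if that works, starting the audio and the frame loop at the same time and keeping them in sync over the whole video is exactly the part I don't know how to do.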

I still need to do more research, but any pointers to a good place to start (either handling raw data or using different libraries) would be greatly appreciated!

EDIT: For more context, I'm using a raw texture, like this example from the DearPyGUI official documentation, to render the video frames that are extracted in the while loop.
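
The texture setup looks roughly like this (a minimal sketch of my own setup; the tag name, window layout, and dimensions are placeholders I chose, not anything prescribed by DearPyGUI):

# Minimal raw-texture setup sketch; play_video() then only has to overwrite the
# texture's contents each frame with dpg.set_value(VIDEO_CANVAS_TAG, ...)
import dearpygui.dearpygui as dpg
import numpy as np

VIDEO_WIDTH, VIDEO_HEIGHT = 640, 360
VIDEO_CANVAS_TAG = "video_canvas"

# Flat float32 RGB buffer (3 values per pixel), used as the blank/default frame
DEFAULT_VIDEO_TEXTURE = np.zeros(VIDEO_WIDTH * VIDEO_HEIGHT * 3, dtype=np.float32)

dpg.create_context()

with dpg.texture_registry():
    dpg.add_raw_texture(VIDEO_WIDTH, VIDEO_HEIGHT, DEFAULT_VIDEO_TEXTURE,
                        format=dpg.mvFormat_Float_rgb, tag=VIDEO_CANVAS_TAG)

with dpg.window(label="Player"):
    dpg.add_image(VIDEO_CANVAS_TAG)

dpg.create_viewport(title="Media player", width=800, height=450)
dpg.setup_dearpygui()
dpg.show_viewport()
dpg.start_dearpygui()
dpg.destroy_context()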

  • @Сергей Кох I'm sorry, but I think I have stated the problem quite clearly: it is to play video together with its audio in DearPyGUI. The details are there to give everyone as much information as possible about what I have tried so far. If you have a solution, it is much appreciated! – Vi Tiet Mar 01 '23 at 15:17
  • For FPS, you need to control it yourself while reading the frame data in each loop iteration. The reason it may be inconsistent is that sometimes your loop takes more time to process than the desired fps allows. – xesf Mar 02 '23 at 07:08
  • For the bluish colour, make sure both ffmpeg and OpenCV use the same colour channel order. In OpenCV, Blue and Red are swapped: BGR instead of RGB. – xesf Mar 02 '23 at 07:10
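
A rough sketch of how those two suggestions could be applied to the loop in the question (untested; it assumes ffpyplayer's default rgb24 output and reuses VIDEO_WIDTH, VIDEO_HEIGHT, VIDEO_CANVAS_TAG, DEFAULT_VIDEO_TEXTURE, and is_playing from the code above). MediaPlayer plays the audio track itself, so pacing the video frames with the delay returned by get_frame() is what keeps the picture in step with the sound:

# Untested sketch based on the comments above: no BGR<->RGB flip (ffpyplayer
# already returns RGB by default) and frames are paced using the delay value
# that get_frame() reports, instead of being shown as fast as possible.
import time
import numpy as np
import cv2 as cv
import dearpygui.dearpygui as dpg
from ffpyplayer.player import MediaPlayer


def play_video_paced(loaded_file_path):
    global is_playing
    player = MediaPlayer(loaded_file_path)  # decodes video and plays the audio

    while is_playing:
        frame, val = player.get_frame()

        if val == 'eof':
            is_playing = False
            break

        if frame is None:
            time.sleep(0.01)   # nothing decoded yet, poll again shortly
            continue

        # val is the time (in seconds) to wait before showing this frame
        time.sleep(val)

        img, t = frame
        w, h = img.get_size()

        # Keep ffpyplayer's RGB byte order as-is; just resize and normalize
        rgb = np.frombuffer(img.to_bytearray()[0], dtype=np.uint8).reshape((h, w, 3))
        rgb = cv.resize(rgb, (VIDEO_WIDTH, VIDEO_HEIGHT))
        dpg.set_value(VIDEO_CANVAS_TAG, rgb.ravel().astype(np.float32) / 255.0)

    dpg.set_value(VIDEO_CANVAS_TAG, DEFAULT_VIDEO_TEXTURE)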
