
I'm trying to read the frames from .bag files with pyrealsense2. I followed Intel's read_bag_example. Here's the full sample of the code I'm using.

import numpy as np
import pyrealsense2 as rs
import os
import time
import cv2

i = 0
try:
    # Configure the pipeline to play back the recorded .bag file instead of a live camera
    config = rs.config()
    rs.config.enable_device_from_file(config, "D:/TEST/test_4.bag", repeat_playback=False)
    pipeline = rs.pipeline()
    pipeline.start(config)

    while True:
        # Wait for the next frameset from the playback
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue
        depth_image = np.asanyarray(depth_frame.get_data())

        # Colorize the depth image and save it to disk
        color_image = cv2.applyColorMap(cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)

        cv2.imwrite("D:/TEST/image/" + str(i) + ".png", color_image)
        i += 1
finally:
    pass

The code works. However, when I check the number of frames via realsense-viewer, it reports 890 frames, while the number of frames this code reads varies between roughly 500 and 770 before it raises the error:

RuntimeError: Frame didn't arrive within 5000

I have searched for hours, but I was not able to find a solution that resolves my problem.

I'm also using

  • Intel Firmware version - 5.11.15.0
  • Python - 3.6.8
  • pyrealsense2 - 2.24.0.965
  • D435 with 848x480, 90 FPS images

I can add more information if you need it. Any help or other suggestions would be greatly appreciated!


1 Answer


The problem is pyrealsense2's playback timing: by default the playback device behaves as if it were a real-time camera, so frames are dropped whenever processing falls behind. Getting the playback device from the pipeline profile and disabling real-time playback resolved the problem. Below is sample code that works with the 848x480, 90 FPS recording.

import numpy as np
import pyrealsense2 as rs
import cv2

i = 0
try:
    config = rs.config()
    rs.config.enable_device_from_file(config, "D:/TEST/test_4.bag", repeat_playback=False)
    pipeline = rs.pipeline()
    profile = pipeline.start(config)

    # Get the playback device from the profile and disable real-time playback
    playback = profile.get_device().as_playback()
    playback.set_real_time(False)

    while True:
        frames = pipeline.wait_for_frames()
        # Pause playback so the current frame can be processed before the next one is taken
        playback.pause()

        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            playback.resume()
            continue
        depth_image = np.asanyarray(depth_frame.get_data())

        # Colorize the depth image and save it to disk
        color_image = cv2.applyColorMap(cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)
        cv2.imwrite("D:/TEST/image/" + str(i) + ".png", color_image)
        i += 1

        playback.resume()

except RuntimeError:
    print("There are no more frames left in the .bag file!")

finally:
    pass

As you can see above, the while loop changed slightly: playback.pause() and playback.resume() ensure that the current frame is fully processed before the next frame is taken.

TL;DR:

Call playback.set_real_time(False) if you are getting an inconsistent number of frames from a .bag file.
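
For reference, here is a minimal sketch that uses only set_real_time(False) without pause()/resume(), which some setups may prefer (see the comments below). The file and output paths are the same placeholders as above, and the counter reports how many frames were actually read.

import numpy as np
import pyrealsense2 as rs
import cv2

# Minimal variant: disable real-time playback and read until the bag runs out of frames
config = rs.config()
rs.config.enable_device_from_file(config, "D:/TEST/test_4.bag", repeat_playback=False)
pipeline = rs.pipeline()
profile = pipeline.start(config)
profile.get_device().as_playback().set_real_time(False)

i = 0
try:
    while True:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue
        depth_image = np.asanyarray(depth_frame.get_data())
        color_image = cv2.applyColorMap(cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)
        cv2.imwrite("D:/TEST/image/" + str(i) + ".png", color_image)
        i += 1
except RuntimeError:
    # wait_for_frames() times out once playback reaches the end of the file
    print("Read", i, "frames from the .bag file")
finally:
    pipeline.stop()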

  • Thanks for this, just what I was looking for. Surprisingly, using `playback.pause()` and `playback.resume()` breaks my code (it only reads one frame), and I can get by just using `playback.set_real_time(False)`. BTW, where in realsense-viewer can I see the number of frames in the bag file? – Rafay Khan Feb 22 '20 at 17:41
  • Actually, I could not find anything that shows the number of frames in realsense-viewer. While using pyrealsense2, I have found that the same code may not work on some PCs. Maybe that's why you are having problems with pause and resume. Another possible reason is that your `playback.pause()` and `playback.resume()` are not inside the same while loop. – Physicing Feb 24 '20 at 08:24
  • Thanks for replying. You're right, there is no way to see the number of frames in a bag file in the viewer. For that we need to use `rs-rosbag-inspector`. – Rafay Khan Feb 27 '20 at 07:26
  • I'll check my code to see if `.pause()` and `.resume()` are in the same code block – Rafay Khan Feb 27 '20 at 07:38