
I'm trying to decode a video in real time (30 fps) and display/modify it with OpenGL. On an iPod touch, if I decode a video that I took with the camera, decoding a single frame can take over 1 s, while 30 fps leaves a budget of only about 0.033 s per frame. So the result is not very good.

Is it possible to achieve that with AVAssetReader? For example, Instagram applies filters (GLSL shaders, I think) in real time on a video, and you can even scrub through the video. Instagram works fine on the iPod touch.

The code to decode can be found in the answer here: Best way to access all movie frames in iOS

And more specifically here: Hardware accelerated h.264 decoding to texture, overlay or similar in iOS
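
Roughly, the decoding loop looks like this (a minimal Swift sketch of that approach, not my exact code; the function name and setup are illustrative):

```swift
import AVFoundation

// Minimal sketch: decode frames from a movie file with AVAssetReader
// and hand each CVPixelBuffer to the renderer.
func readFrames(from url: URL) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                            kCVPixelFormatType_32BGRA]  // BGRA maps directly to an RGBA texture
    )
    reader.add(output)
    guard reader.startReading() else { return }

    while reader.status == .reading,
          let sampleBuffer = output.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            // Upload pixelBuffer to an OpenGL ES texture here
            // (e.g. through a CVOpenGLESTextureCache) and draw it with a GLSL shader.
            _ = pixelBuffer
        }
    }
}
```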

Thank you in advance


2 Answers


Given the very limited information you provided, I have to assume that your video is stored in a YUV pixel format and that you configured the AVAssetReader output settings with a different format such as kCVPixelFormatType_32BGRA, which forces iOS to convert the colour space for you; that is why it feels slow. I suggest you set no output settings at all and just use the original pixel format.
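
A minimal Swift sketch of that suggestion, assuming the source uses the usual camera YUV format (the function name is mine):

```swift
import AVFoundation

// Ask for the camera's native bi-planar YUV format instead of
// kCVPixelFormatType_32BGRA, so the reader does not have to convert the
// colour space. (Passing nil as outputSettings would instead vend samples
// in whatever format the track stores them in.)
func makeNativeYUVOutput(for track: AVAssetTrack) -> AVAssetReaderTrackOutput {
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]
    return AVAssetReaderTrackOutput(track: track, outputSettings: settings)
}
```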


Actually, my app was just doing too much work on the CPU: I had another process analyzing images. When I removed it, the decoding was really fast.
