
I'm trying to record screenshots of content drawn by OpenGL ES on iOS. I know it is possible to read the pixels with glReadPixels.

That approach is described in Apple's article here: https://developer.apple.com/library/ios/qa/qa1704/_index.html

However, glReadPixels is slow and it blocks the main loop.

Is there a better way to save screenshots?

It is okay to drop some recorded frames when the renderer is busy, but I don't want to affect the displayed frame rate.

Thanks!

user3363732

1 Answer


There is a faster way introduced on iOS; try this post.

In any case, data retrieval from the GPU is always slow, but in many cases you can break the operation down into many smaller ones, or even move it to a background thread. Which option fits depends on the project you are working on: if you need the frames in real time, as when recording a screenshot video, there is not much you can do (other than dropping the video's FPS or resolution); on the other hand, if you are only making a single complex screenshot and want your scene to continue uninterrupted, there are a few ways.

For a screenshot, you could create another frame buffer and continue rendering into the new one (scene uninterrupted). Then, at the end of each drawn frame, bind the original buffer and use glReadPixels, but copy only a portion of the data: for instance, copy a quarter of the original buffer each frame, so you get the whole image after 4 frames while cutting the per-frame cost of the read to a quarter. The same trick can be used when making a video (copy half of the buffer each frame, dropping the video's FPS to half).
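The staggered read-back above can be sketched as follows. To keep the sketch runnable without a GL context, a plain memcpy from a simulated frame buffer stands in for the actual glReadPixels call (shown in the comment); all names and sizes here are illustrative, not from the answer.

```c
#include <string.h>

enum { WIDTH = 64, HEIGHT = 64, BPP = 4 };   /* RGBA */

/* Copy rows [part*HEIGHT/4, (part+1)*HEIGHT/4) from src into dst.
 * With a bound frame buffer, this memcpy would instead be roughly:
 *   glReadPixels(0, part * HEIGHT / 4, WIDTH, HEIGHT / 4,
 *                GL_RGBA, GL_UNSIGNED_BYTE, dst + offset);
 * Call it with part = 0..3 on four consecutive frames to assemble
 * one full screenshot while paying only a quarter of the read cost
 * per frame. */
static void copy_quarter(const unsigned char *src, unsigned char *dst, int part)
{
    size_t rows   = HEIGHT / 4;
    size_t offset = (size_t)part * rows * WIDTH * BPP;
    memcpy(dst + offset, src + offset, rows * WIDTH * BPP);
}
```

The same shape works for the half-buffer video variant: two parts per frame instead of four, at half the capture FPS.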

Another way is to create a custom frame buffer with POT (power-of-two) dimensions and attach a texture to it, so that you draw into the texture instead of directly into the render buffer. You can then create another thread with a second, shared context, and make screenshots from this texture on that thread while the main thread draws it to the screen. This way you can control how much time the application spends on making a screenshot, but note that using this for something like a video is very unlikely to look nice.
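A sketch of the texture-backed frame buffer setup: the `next_pot` helper (an illustrative name, not from the answer) rounds a screen dimension up to a power of two, since older iOS GPUs restrict non-POT texture dimensions; the GL calls that would consume it need a live context, so they appear as a comment.

```c
/* Round a dimension up to the next power of two (valid for v >= 1).
 * Older iOS devices require POT texture sizes unless NPOT extensions
 * are available. */
static unsigned next_pot(unsigned v)
{
    v--;
    v |= v >> 1;  v |= v >> 2;  v |= v >> 4;
    v |= v >> 8;  v |= v >> 16;
    return v + 1;
}

/* With a live GL ES context, the texture-backed FBO would be set up
 * roughly like this (standard GLES2 API):
 *   GLuint fbo, tex;
 *   glGenFramebuffers(1, &fbo);
 *   glBindFramebuffer(GL_FRAMEBUFFER, fbo);
 *   glGenTextures(1, &tex);
 *   glBindTexture(GL_TEXTURE_2D, tex);
 *   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, next_pot(w), next_pot(h),
 *                0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
 *   glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
 *                          GL_TEXTURE_2D, tex, 0);
 * The texture can then be sampled from a second, shared context on the
 * screenshot thread. */
```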

Matic Oblak
  • Thanks. It seems CoreVideo needs to be linked. I'd like to save it as an image, and unfortunately it is not good for us to link it. – user3363732 Feb 28 '14 at 09:31
  • For the 1st one, using CoreVideo, I investigated and tried it but I cannot get the code working. Do you have more specific code? In fact, I don't understand how to convert the data in the render buffer or frame buffer to a CVImageBufferRef. Can you paste actual code? – user3363732 Mar 10 '14 at 08:04
  • I also tried the 2nd solution. It doesn't work on the device: it worked on the iOS simulator, but on the device the display stopped updating. What I did was: 1) create a new framebuffer, 2) attach the current renderbuffer to the new framebuffer, 3) bind the original framebuffer, 4) start a new thread and pass the framebuffer id, 5) set a new (shared) context and call glReadPixels on the new thread. – user3363732 Mar 10 '14 at 08:13
  • 1.: You create the full-sized CV buffer, then use CVPixelBufferGetBaseAddress to get the buffer's data pointer. Afterwards you can use memcpy or anything else to copy the raw RGB data. Do not forget to lock and unlock the buffer in the process, though, using CVPixelBufferLockBaseAddress. – Matic Oblak Mar 10 '14 at 08:29
  • The second approach might be a bit harder to implement. After you have checked that all the buffer and texture bindings on both threads are 100% correct (and set the current GL context), understand that you will need some locking mechanism between the threads. You might want to use double buffering, where the background thread requests the swap to be made on the foreground thread and then reads from the back buffer. – Matic Oblak Mar 10 '14 at 08:42
  • I just wanted to know how to copy the current screen image before getting the raw RGB data. If I understand correctly, it's possible to get data from a CVPixelBufferRef, but I probably need to draw the data into a CVOpenGLESTextureRef first. I only have a reference to a UIView, so I probably need to copy the current render buffer into the CVOpenGLESTextureRef, but I don't know how. Please let me know. – user3363732 Mar 13 '14 at 04:05
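The copy step described in the comments above can be sketched like this. CoreVideo is Apple-only, so a plain byte array stands in for the locked base address to keep the sketch runnable; the row-by-row loop matters because CVPixelBufferGetBytesPerRow can report padded rows wider than width * 4. The function name is illustrative.

```c
#include <string.h>

/* In real code this would sit between CVPixelBufferLockBaseAddress and
 * CVPixelBufferUnlockBaseAddress, with
 *   dst               = CVPixelBufferGetBaseAddress(buf)
 *   dst_bytes_per_row = CVPixelBufferGetBytesPerRow(buf)
 * Pixel-buffer rows may be padded, so copy one row at a time rather
 * than with a single memcpy of width * height * 4 bytes. */
static void copy_rgba_rows(const unsigned char *rgba, int width, int height,
                           unsigned char *dst, size_t dst_bytes_per_row)
{
    size_t src_bytes_per_row = (size_t)width * 4;   /* tightly packed RGBA */
    for (int y = 0; y < height; y++)
        memcpy(dst + (size_t)y * dst_bytes_per_row,
               rgba + (size_t)y * src_bytes_per_row,
               src_bytes_per_row);
}
```

The `rgba` input here would be the pixels obtained from the frame buffer (e.g. via glReadPixels or the partial read-back described in the answer).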