
A while ago I asked this question, which received an answer.

I have implemented an intermediary Surface as the answer suggested, but now I've run into another problem. At certain points while my application is running, my VirtualDisplay can change resolution, so I'd like to update the size of my intermediary Surface to match the new resolution of the VirtualDisplay. I was hoping this would be a simple call to setDefaultBufferSize on the Surface's underlying SurfaceTexture, but that doesn't appear to work.
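For reference, the setup looks roughly like this (simplified; `texId`, `displayWidth`/`displayHeight`, `newWidth`/`newHeight`, and `virtualDisplay` are placeholder names):

```java
// Consumer side: an intermediary SurfaceTexture wrapped in a Surface,
// handed to the VirtualDisplay as its output surface.
SurfaceTexture intermediateTexture = new SurfaceTexture(texId);
intermediateTexture.setDefaultBufferSize(displayWidth, displayHeight);
Surface intermediateSurface = new Surface(intermediateTexture);
virtualDisplay.setSurface(intermediateSurface);

// Later, when the VirtualDisplay resolution changes, I had hoped this would suffice:
intermediateTexture.setDefaultBufferSize(newWidth, newHeight);
```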

I've poked around at releasing my intermediary Surface and SurfaceTexture and making new ones, but then I have to set the output surface for the VirtualDisplay to be null and do some other synchronization steps which I'd like to avoid if possible.

Is there a way to dynamically update the size of a Surface/SurfaceTexture after creation?

UPDATE:

I've tried calling VirtualDisplay.setSurface(null) along with VirtualDisplay.resize(newSize.width, newSize.height), then sending a message to the thread that handles the callbacks for the intermediary SurfaceTexture to resize the texture via setDefaultBufferSize, having the main thread poll the secondary thread until that call finishes, and then calling VirtualDisplay.setSurface(surfaceFromSecondaryThread).
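In rough outline the sequence is the following (simplified; a CountDownLatch stands in for the polling I actually do, and `mVirtualDisplay`, `mTextureHandler`, `mIntermediateTexture`, `mIntermediateSurface` are placeholder names — note that `VirtualDisplay.resize` also takes a densityDpi argument):

```java
private void resizeVirtualDisplay(final int newWidth, final int newHeight, int densityDpi) {
    // Detach the output surface before resizing the display.
    mVirtualDisplay.setSurface(null);
    mVirtualDisplay.resize(newWidth, newHeight, densityDpi);

    // Ask the thread that owns the SurfaceTexture callbacks to resize its buffer.
    final CountDownLatch resized = new CountDownLatch(1);
    mTextureHandler.post(new Runnable() {
        @Override
        public void run() {
            mIntermediateTexture.setDefaultBufferSize(newWidth, newHeight);
            resized.countDown();
        }
    });

    // Main thread waits until the secondary thread has finished the resize.
    try {
        resized.await();
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }

    // Reattach the (now resized) intermediary surface.
    mVirtualDisplay.setSurface(mIntermediateSurface);
}
```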

This works sometimes. Other times the texture is all green with a gray bar across it (green is also my glClearColor, not sure if that is related, as seen here). Sometimes the current screen image appears duplicated and/or shrunken in my VirtualDisplay. So it seems like a timing issue, but what timing I should wait for, I am unsure. The documentation for setDefaultBufferSize states:

For OpenGL ES, the EGLSurface should be destroyed (via eglDestroySurface), made not-current (via eglMakeCurrent), and then recreated (via eglCreateWindowSurface) to ensure that the new default size has taken effect.

The problem is that my code does not create an EGLSurface from the SurfaceTexture/Surface, so I have no way of destroying it. I'm assuming that the producer (VirtualDisplay) does, but there are no public APIs for me to get at that EGLSurface.
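For completeness, the destroy/recreate dance the docs describe would look roughly like this in code that did own the EGLSurface (which mine doesn't; `mEglDisplay`, `mEglConfig`, `mEglContext`, `mEglWindowSurface`, and `mOutputSurface` are placeholders):

```java
// Make the old surface not-current, destroy it, and recreate it so the new
// default buffer size takes effect.
EGL14.eglMakeCurrent(mEglDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
        EGL14.EGL_NO_CONTEXT);
EGL14.eglDestroySurface(mEglDisplay, mEglWindowSurface);

int[] surfaceAttribs = { EGL14.EGL_NONE };
mEglWindowSurface = EGL14.eglCreateWindowSurface(mEglDisplay, mEglConfig,
        mOutputSurface, surfaceAttribs, 0);
EGL14.eglMakeCurrent(mEglDisplay, mEglWindowSurface, mEglWindowSurface, mEglContext);
```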

UPDATE 2:

When I see the problem (green screen with a bar, corruption, perhaps because my glClearColor is green), if I do a glReadPixels before I call eglSwapBuffers to write to the Surface for the MediaCodec, I read green pixels. This tells me it isn't a MediaCodec problem: either the information written to the Surface from the VirtualDisplay is corrupt (and remains corrupt), or the conversion from YUV space to RGBA space when going from Surface -> OpenGL texture is broken somehow. I'm leaning towards there being a problem with VirtualDisplay.
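The debug check is roughly this, right before the swap onto the MediaCodec input surface (`TAG`, `mEglDisplay`, and `mEncoderEglSurface` are placeholder names):

```java
// Read back one pixel from the framebuffer about to be swapped onto the MediaCodec
// input surface; if it's already green here, MediaCodec never saw good data.
ByteBuffer pixel = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, 1, 1, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixel);
Log.d(TAG, String.format("RGBA = %d %d %d %d",
        pixel.get(0) & 0xff, pixel.get(1) & 0xff, pixel.get(2) & 0xff, pixel.get(3) & 0xff));

EGL14.eglSwapBuffers(mEglDisplay, mEncoderEglSurface);
```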

  • FWIW, I wouldn't expect `setDefaultBufferSize` to be necessary or useful. The producer sends frames to the consumer by handle, so unless the producer queries SurfaceTexture for its desired size, it's just going to pass the full frame without scaling. The size is useful for Canvas or GLES rendering, where you're just attaching a renderer to a Surface, but for output from MediaCodec or VirtualDisplay I'd expect the size to be determined by the producer based on the configuration of those objects. Also, YUV 0,0,0 is medium green, so I'd recommend a different clear color (red) to avoid confusion. – fadden Mar 31 '16 at 00:15
  • @fadden you seem to be correct. `setDefaultBufferSize` does not appear to be needed, *BUT* there seems to be a strange "transition" time when the `VirtualDisplay` is resized and the `OpenGL` texture is filled with some strange data. So, maybe I should prevent my `draw` loop from triggering at those times (I draw every X seconds, not every time a new frame is available, since `onFrameAvailable` isn't called if nothing has changed on screen). – EncodedNybble Mar 31 '16 at 01:09
  • Unfortunately I have no explanation for the weirdness you're seeing with the virtual display. Changes in incoming buffer sizes are expected and should be handled gracefully by the underlying BufferQueue mechanism (this is why SurfaceView has a "surface changed" callback that passes width/height), but there's a lot of moving parts, and it's possible something is broken along the way. You shouldn't be getting junk textures from SurfaceTexture. VirtualDisplay is the newest component in the chain so I'd tend to blame that as well. – fadden Mar 31 '16 at 01:17
  • Also worth noting that I changed my `glClear` color to not stomp on YUV 0,0,0 and when the "badness" happens, the framebuffer is filled with my clear color + occasionally some random corruption (smaller, duplicate screen images, etc). So, it's not a YUV 0,0,0 error; I wonder why it's remotely related to the clear color... I don't mind that there may be a bug, I just wish I knew a way to work around this as I don't want to ship this feature if it has this problem. I'll continue investigating to see if I can find a workaround. – EncodedNybble Mar 31 '16 at 01:31
  • Sounds like you're getting uninitialized data. On a chip with a tiled graphics architecture you typically get repeated chunks at regular intervals (see e.g. the image in http://stackoverflow.com/questions/33794384/). Does it clear out if you send a bunch of frames through? If so you could send N redundant frames after a size change and have the receiver drop N frames. (I don't really understand what's going on, so I'm just throwing out random ideas. You might be better off ignoring them. :-| ) – fadden Mar 31 '16 at 05:33
  • Thanks for the input @fadden. Unfortunately I can't figure out a way to not have the "uninitialized data" so, I've developed a workaround that no longer requires me to call `VirtualDisplay.resize`. Maybe I'll make a quick Android Studio project that shows the problem I was having. It only "clears out" if I call `resize` again and get lucky enough not to hit the same issue. It's pretty vexing. – EncodedNybble Mar 31 '16 at 23:47

0 Answers