
Update: This looks like it's related to Image data from Android camera2 API flipped & squished on Galaxy S5. I consider this a bug, since the Nexus 5/6 works correctly, and it makes no sense to have to request the full sensor size and then crop manually to reach the desired aspect ratio; at that point you might as well not use the "supported" output sizes at all!

Problem:

  1. Get the characteristics of a camera using the Camera2 API, and extract the output sizes suitable for a MediaCodec.class target.
  2. Create a MediaCodec input surface with one of those suitable camera output sizes, and feed the codec's output to a MediaMuxer (or anything else) to inspect the result.
  3. Start camera capture requests using the codec's created surface as the target (see the sketch after this list).
  4. The codec output has the correct size, but the result differs by device:

    • Nexus 5/6: everything is OK on Android 5/6.
    • Samsung tablet with Android 5.1: for some resolutions the image is obviously stretched, indicating that the camera output resolution does not match the surface size. This becomes very obvious when rotating the camera: the image gets more and more skewed, since it is no longer aligned with the X/Y axes. For other resolutions the output is fine, and there is no pattern related to either the size or the aspect ratio.
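
For reference, a minimal sketch of steps 1-3, assuming a CameraDevice `camera` already opened via `cameraManager`/`cameraId`; encoder parameters are illustrative and checked-exception handling is mostly omitted:

    // Step 1: sizes the camera claims to support for a MediaCodec target
    CameraCharacteristics chars = cameraManager.getCameraCharacteristics(cameraId);
    StreamConfigurationMap map =
            chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size[] codecSizes = map.getOutputSizes(MediaCodec.class);
    Size chosen = codecSizes[0]; // one of the reported sizes

    // Step 2: encoder whose input surface will be the capture target
    MediaFormat format = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_AVC, chosen.getWidth(), chosen.getHeight());
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 10_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    final Surface codecSurface = codec.createInputSurface();
    codec.start();

    // Step 3: use the codec surface as the only capture target
    camera.createCaptureSession(Collections.singletonList(codecSurface),
            new CameraCaptureSession.StateCallback() {
                @Override public void onConfigured(CameraCaptureSession session) {
                    try {
                        CaptureRequest.Builder builder =
                                camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                        builder.addTarget(codecSurface);
                        session.setRepeatingRequest(builder.build(), null, null);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }
                @Override public void onConfigureFailed(CameraCaptureSession session) { }
            }, null);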

No problem, one would say: maybe the surface is not created at exactly the specified width and height, or something along those lines (even though the output sizes were extracted specifically for a MediaCodec.class target).

So, I created an OpenGL context, generated a texture, created a SurfaceTexture for it, set its default buffer size to the camera output size, and created a Surface from the texture. I won't go into the gory details of drawing that to a TextureView or back into the MediaCodec's EGL surface. The result is the same: the Camera2 capture request outputs a distorted image, but only for some resolutions. Digging deeper and calling getTransformMatrix on the SurfaceTexture immediately after updateTexImage, the matrix is always the identity matrix, as expected.
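
Roughly what that setup looks like, assuming an EGL context is already current on this thread and `chosen` is the camera output size picked earlier:

    // External OES texture backing the SurfaceTexture
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
    surfaceTexture.setDefaultBufferSize(chosen.getWidth(), chosen.getHeight());
    Surface cameraTarget = new Surface(surfaceTexture); // passed to addTarget()

    // On every frame callback:
    surfaceTexture.updateTexImage();
    float[] texMatrix = new float[16];
    surfaceTexture.getTransformMatrix(texMatrix); // identity here, as noted above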

So, the real problem here is that the camera is NOT capturing at the size of the provided target surface. The solution would therefore be to get the actual size the camera is capturing at; the rest is pure GL matrix transforms to draw correctly. But HOW DO I GET THAT?

Note: using the old Camera API with exactly the same "preview size" and the same target surface (either the MediaCodec's or the custom one), ALL IS FINE! But I can't use the old Camera API, since it's deprecated and also seems to cap capture at 1080p, while the Camera2 API goes beyond that and I need to support 4K recording.
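
For comparison, this is the legacy path that behaves correctly on the same device; a rough sketch, reusing the `surfaceTexture` from above, with `width`/`height` being the same size that gets distorted with Camera2:

    android.hardware.Camera oldCamera = android.hardware.Camera.open();
    android.hardware.Camera.Parameters params = oldCamera.getParameters();
    params.setPreviewSize(width, height); // the size that distorts with Camera2
    oldCamera.setParameters(params);
    try {
        oldCamera.setPreviewTexture(surfaceTexture);
    } catch (IOException e) {
        e.printStackTrace();
    }
    oldCamera.startPreview(); // output is NOT distorted here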

Adrian Crețu
  • Were you able to find a fix? – Edmund Rojas Aug 07 '16 at 22:53
  • No. I learned to live with it as a BUG and moved on. Google will always say it's a Samsung issue, and Samsung doesn't care about obscure platform bugs or their Camera2 API implementation as long as people blindly buy their latest flagships. – Adrian Crețu Aug 09 '16 at 17:24
  • I wound up whitelisting Samsung devices under Marshmallow to use camera1 instead; it seems like Samsung didn't bother to update its camera to work with camera2 until Marshmallow. – Edmund Rojas Aug 09 '16 at 21:33
  • @AdrianCrețu Hey, can I know how you created the camera2 preview over OpenGL? I need help. – Raut Darpan Jan 11 '17 at 10:34
  • @RautDarpan I generated an OpenGL texture and used it to create the SurfaceTexture/Surface fed into the camera capture request. The rest is the usual OpenGL operations. Of course the details may get complicated if you're asking how to manipulate it further, but that's the core. – Adrian Crețu Jan 12 '17 at 16:52
  • @AdrianCrețu I just want to know how you managed to draw over the camera2 preview... I tried but failed to do so. Can you help me? – Raut Darpan Jan 13 '17 at 05:32
  • @AdrianCrețu Can you give me your mail ID? – Raut Darpan Jan 13 '17 at 06:07
  • @AdrianCrețu Are you here? Help. – Raut Darpan Jan 17 '17 at 06:31
  • Possibly related: https://stackoverflow.com/questions/35709754/camera2-aspect-ratio-on-samsung – glenatron Dec 12 '18 at 00:47

2 Answers


I encountered a similar issue on an SM-A7009 running API level 21, a LEGACY camera2 device.

The preview is stretched, and surfaceTexture.setDefaultBufferSize has no effect: the framework overrides the value once the preview starts.

The preview sizes reported by StreamConfigurationMap.getOutputSizes(SurfaceTexture.class) are not all actually supported.

Only three of them are.

    $ adb shell dumpsys media.camera | grep preview-size
    preferred-preview-size-for-video: 1920x1080
    preview-size: 1440x1080
    preview-size-values: 1920x1080,1440x1080,1280x720,1056x864,960x720,880x720,800x480,720x480,640x480,528x432,352x288,320x240,176x144

The system dump lists many preview sizes, but after checking all of them I found that only 1440x1080, 640x480, and 320x240 actually work.

The supported preview sizes all have a 1.333 (4:3) aspect ratio, the same ratio as the one reported by CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE.
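
Based on that observation, a sketch of a filter that keeps only the reported sizes whose ratio matches the sensor's active array; `characteristics` and `map` are assumed from the usual camera setup:

    Rect active = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
    float sensorRatio = (float) active.width() / active.height();

    List<Size> safeSizes = new ArrayList<>();
    for (Size s : map.getOutputSizes(SurfaceTexture.class)) {
        float ratio = (float) s.getWidth() / s.getHeight();
        if (Math.abs(ratio - sensorRatio) < 0.01f) {
            safeSizes.add(s); // 1440x1080, 640x480, 320x240 on this device
        }
    }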

So I think it's a bug in some Samsung devices with the LEGACY camera2 implementation on API 21.

The workaround is to make these devices use the deprecated Camera API.
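
One possible way to decide when to fall back, purely illustrative; the exact manufacturer/SDK/hardware-level conditions are up to you:

    boolean useLegacyCamera =
            Build.MANUFACTURER.equalsIgnoreCase("samsung")
            && Build.VERSION.SDK_INT < Build.VERSION_CODES.M;

    Integer hwLevel = characteristics.get(
            CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    if (hwLevel != null
            && hwLevel == CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
        useLegacyCamera = true; // LEGACY camera2 devices are the affected ones here
    }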

Hope this is helpful for anyone who ends up here.

alijandro
  • As the accepted answer points out, it's a bug on the vendor's part that usually occurs when the camera also needs to output a *different* size (like a preview with another aspect ratio), messing up the scaling. It gets worse as some devices even crash with an exception inside the Camera2 worker thread (invalid parameters trying to crop with bad arguments). **This is definitely the #1 unsolvable crash I've ever encountered.** – Adrian Crețu Jan 12 '17 at 16:58
  • In my scenario, even when the preview and image reader aspect ratios are the same, the preview buffer size set on the `SurfaceTexture` still doesn't work. The camera2 framework still overrides it with one of the sizes `1440x1080, 640x480, 320x240`. – alijandro Jan 13 '17 at 02:50

So yes, this is a bug on those Samsung devices.

Generally this happens when you ask for multiple different aspect ratios on output, and the device-specific camera code trips over itself on cropping and scaling all of them correctly. You may be able to avoid it by ensuring all requested sizes have the same aspect ratio.
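
A sketch of what that looks like in practice: both outputs requested at one aspect ratio (16:9 here), so the device never has to crop to two different ratios. The sizes, `previewTexture`, `codecSurface`, `sessionCallback`, and `backgroundHandler` are assumptions from a typical preview+encode setup:

    // Both outputs share one aspect ratio (16:9 here); sizes are examples only.
    Size encodeSize = new Size(3840, 2160);   // the encoder surface is configured at this size
    Size previewSize = new Size(1920, 1080);  // preview surface at the same 16:9 ratio
    previewTexture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
    Surface previewSurface = new Surface(previewTexture);

    // Hand both surfaces to one session and add both as targets of the same
    // repeating request, exactly as before, just with matching aspect ratios.
    List<Surface> outputs = Arrays.asList(previewSurface, codecSurface);
    camera.createCaptureSession(outputs, sessionCallback, backgroundHandler);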

The resolution is probably actually what you asked for, but it's been incorrectly scaled (you could test this with an ImageReader at the problematic size, where you get an explicit buffer you can poke at).
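
A sketch of that check, with hypothetical names (`problemWidth`/`problemHeight` are the size that comes out distorted, `backgroundHandler` is whatever handler your camera callbacks run on): add the reader's surface as a capture target and inspect what lands in the buffer.

    ImageReader reader = ImageReader.newInstance(
            problemWidth, problemHeight, ImageFormat.YUV_420_888, 2);
    reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
        @Override public void onImageAvailable(ImageReader r) {
            Image image = r.acquireLatestImage();
            if (image == null) return;
            // The reported size will match the request; the interesting part is
            // whether the pixel data in the planes looks stretched or not.
            Log.d("SizeCheck", "buffer " + image.getWidth() + "x" + image.getHeight()
                    + ", Y plane row stride " + image.getPlanes()[0].getRowStride());
            image.close();
        }
    }, backgroundHandler);
    // reader.getSurface() goes into the capture session and request as a target.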

We are adding additional testing to the Android compliance tests to try to ensure these kinds of stretched outputs don't continue to happen.

Eddy Talvala
  • @ArkadyGamza - how do you know what "camera chooses", since we're talking about a Surface after all? And 3264x2448 is the same aspect ratio as 640x480, so there's no skewing. Nexus 5 works fine in this regard for me. – Adrian Crețu Oct 07 '16 at 13:16
  • @EddyTalvala - so I understand this is a bug in AOSP (or samsung?) when outputting to more than one surface, with different output aspect ratios. Unfortunately, this is exactly what I'm doing, for preview+encode purposes. I think you're the first person that actually explained this right, so I'll mark it as an answer. – Adrian Crețu Oct 07 '16 at 13:27
  • It's not an AOSP bug; that'd be easier to fix. It's a bug in some device-specific camera implementations (AOSP just tells the camera hardware 'give us these X outputs at sizes A, B, and C'). Typically it's because, under the hood, they only have so many image scaling units in their camera pipeline, so the mapping of how to produce a given set of outputs can get complicated, and they missed testing some of the permutations. – Eddy Talvala Oct 07 '16 at 20:36