
I've been porting the following Android test example over to run in a simple Xamarin.Android project.

https://bigflake.com/mediacodec/ExtractMpegFramesTest_egl14.java.txt

I'm running a video captured by the camera (on the same device) through this pipeline, but the PNGs I'm getting out the other end are distorted. I assume this is due to the minefield of Android camera color spaces.
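
For reference, the decode side of my port boils down to roughly the following (a simplified sketch of the Xamarin.Android equivalent of the test's setup; `videoPath` and `surface` are stand-ins for my actual file path and the Surface wrapping the offscreen EGL SurfaceTexture):

    using Android.Media;
    using Android.Views;

    // Find and select the video track, then decode straight into the GL surface.
    MediaExtractor extractor = new MediaExtractor();
    extractor.SetDataSource(videoPath); // stand-in: path to the recorded .mp4

    int trackIndex = -1;
    MediaFormat format = null;
    for (int i = 0; i < extractor.TrackCount; i++)
    {
        MediaFormat f = extractor.GetTrackFormat(i);
        string mime = f.GetString(MediaFormat.KeyMime);
        if (mime != null && mime.StartsWith("video/"))
        {
            trackIndex = i;
            format = f;
            break;
        }
    }
    extractor.SelectTrack(trackIndex);

    MediaCodec decoder = MediaCodec.CreateDecoderByType(format.GetString(MediaFormat.KeyMime));
    decoder.Configure(format, surface, null, 0); // surface = the EGL-backed output Surface
    decoder.Start();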

Here are the images I'm getting when running a camera video through the pipeline...

https://i.stack.imgur.com/Ufwtd.jpg

It's hard to tell, but it 'kinda' looks like a single line of the actual image stretched across the frame. But I honestly wouldn't want to bank on that being the issue, as it could be a red herring.

However, when I run a 'normal' video that I grabbed online through the same pipeline, it works completely fine.

I used the first video found here (the Lego one): http://techslides.com/sample-webm-ogg-and-mp4-video-files-for-html5

And I get frames like this...

https://i.stack.imgur.com/YxkzK.jpg

Checking the ffprobe data, both this video and my camera video have the same pixel format (pix_fmt=yuv420p), but there are differences in color_range and the related color fields.

The video that works has...

color_range=tv
color_space=bt709
color_transfer=bt709
color_primaries=bt709

And the camera video just has...

color_range=unknown
color_space=unknown
color_transfer=unknown
color_primaries=unknown
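
(For what it's worth, the same metadata can be checked on-device from the track's MediaFormat - a sketch, assuming API 24+ for the color keys and reusing `extractor`/`trackIndex` from the setup above; on the camera video these keys are presumably just absent, matching the 'unknown' values:)

    // Sketch: probe the color metadata the container exposes (API 24+ keys).
    MediaFormat trackFormat = extractor.GetTrackFormat(trackIndex);
    if (trackFormat.ContainsKey(MediaFormat.KeyColorRange))
    {
        int colorRange = trackFormat.GetInteger(MediaFormat.KeyColorRange); // 1 = full, 2 = limited/'tv'
    }
    if (trackFormat.ContainsKey(MediaFormat.KeyColorStandard))
    {
        int colorStandard = trackFormat.GetInteger(MediaFormat.KeyColorStandard); // 1 = BT.709
    }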

The media format of the camera video appears to be semi-planar YUV; at least, the codec output format gets updated to that. I get an OutputBuffersChanged message, at which point the MediaCodec's output format reports the following:

{
    mime=video/raw,
    crop-top=0, 
    crop-right=639, 
    slice-height=480,
    color-format=21,
    height=480, 
    width=640, 
    what=1869968451, 
    crop-bottom=479, 
    crop-left=0, 
    stride=640
}
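
(That dump is what I see when reading the codec's output format after the decode loop reports a change - roughly like this in the Xamarin bindings, where `bufferInfo` and `timeoutUs` are the usual decode-loop locals:)

    // Sketch: pick up the updated output format in the decode loop.
    int index = decoder.DequeueOutputBuffer(bufferInfo, timeoutUs);
    if (index == (int)MediaCodecInfoState.OutputFormatChanged)
    {
        MediaFormat newFormat = decoder.OutputFormat;
        int colorFormat = newFormat.GetInteger(MediaFormat.KeyColorFormat); // 21 = COLOR_FormatYUV420SemiPlanar
        int stride = newFormat.GetInteger("stride");                        // 640 here
    }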

I can also point the codec output at a TextureView instead of the OpenGL surface and just grab the Bitmap that way (obviously slower), and those frames look fine. So maybe it's the OpenGL display of the raw codec output? Does Android's TextureView do its own decoding?
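
(The TextureView path, for comparison, is just a SurfaceTextureListener plus GetBitmap - a hypothetical sketch, with an instance assigned to the view's SurfaceTextureListener property:)

    using Android.Graphics;
    using Android.Views;

    // Hypothetical grabber for the slower TextureView path.
    class FrameGrabber : Java.Lang.Object, TextureView.ISurfaceTextureListener
    {
        readonly TextureView _view;
        public FrameGrabber(TextureView view) { _view = view; }

        public void OnSurfaceTextureUpdated(SurfaceTexture surface)
        {
            // Fires once per displayed frame; GetBitmap copies the frame out.
            Bitmap frame = _view.GetBitmap();
            // ... process/save the frame ...
        }

        public void OnSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) { }
        public void OnSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }
        public bool OnSurfaceTextureDestroyed(SurfaceTexture surface) => true;
    }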

Note - The reason I'm looking into all this is that I need to run some form of image processing on a raw camera feed at as close to 30fps as possible. Obviously this is not possible on some devices, but recording a video at 30fps and then processing it after the fact is a possible workaround I'm investigating. I'd rather process the image in OpenGL for the improved speed than take each frame as a Bitmap from the TextureView output.

In researching this, I've seen someone else with pretty much the exact same issue here: "How to properly save frames from mp4 as png files using ExtractMpegFrames.java?", although they didn't seem to have much luck finding out what might be going wrong.

EDIT - ffprobe outputs for both videos...

Video that works - https://justpaste.it/484ec
Video that fails - https://justpaste.it/55in0

  • This isn't a colorspace issue - that would just result in different colors, not distorted image definition. This is related to pixel format i.e. data layout, although not being an Android/Java dev, can't help you directly. – Gyan May 25 '18 at 10:39
  • Cheers, that helps me narrow it down! So both videos (apparently) have the same pixel format, 'yuv420p', according to ffprobe. Is there any other metadata parameter I should look at to determine what's really different between them? Here are the full ffprobe results: Video that works - https://justpaste.it/484ec . Video that doesn't work - https://justpaste.it/55in0 . – Iain Stanford May 25 '18 at 10:46
  • The issue occurs at an earlier stage so ffprobe won't help. My guess is that the stride or component interleaving is wrongly communicated or set somewhere. – Gyan May 25 '18 at 11:01
  • The ffprobe was run on the raw video file before it was passed to anything. The video plays fine on my PC in VLC or QuickTime or whatever. The video that fails was recorded by the camera on the phone but plays back 100% fine. – Iain Stanford May 25 '18 at 11:29
  • If you play it with ffplay, are any warnings issued? – Gyan May 25 '18 at 12:32
  • Doesn't look like it. This is the ffplay output for video 2, which was recorded by the camera but fails in the pipeline... https://justpaste.it/69bx8 If I'm reading it right, it looks pretty normal. (Also plays 100% fine.) – Iain Stanford May 25 '18 at 12:58
  • This is the ffplay output for the video that works fine through the pipeline... https://justpaste.it/77ffm – Iain Stanford May 25 '18 at 13:00
  • Try with the results of `ffmpeg -i failed.mp4 -c copy remux.mp4` and `ffmpeg -i failed.mp4 -profile:v baseline rotated.mp4` – Gyan May 25 '18 at 13:10
  • Ok, so interesting results. I'll double-check these, but both commands ran fine (no errors) and produced two videos that look visually identical when played back in VLC. When run on the phone through the pipeline, remux.mp4 came out the same, with the corrupt horizontal lines. But rotated.mp4 came out looking correct! I need to double-check everything to make sure, but it seemed fine. – Iain Stanford May 25 '18 at 13:48
  • Ok, one last try: `ffmpeg -i failed.mp4 -c copy -metadata:s:v rotate=0 remux2.mp4` – Gyan May 25 '18 at 13:50
  • Ok. So that remux2 video also worked fine! It *is* rotated 90 degrees to the original video (this isn't an issue, I don't care about orientation). I assume the camera on this device is orientated 90 degrees. – Iain Stanford May 25 '18 at 14:01
  • Likely. It looks like your pipeline is expecting HxW but the video is stored as WxH. – Gyan May 25 '18 at 14:09
  • Yeah, thanks for all your help! I've "fixed" it by setting the MediaFormat rotation to 0 (a quick sketch of this tweak is just below the comments). I don't know why this is happening yet; the videos are recorded with a rotation of 90 in the video metadata, but the encoder needs 0... so it's odd, but it's certainly got me on the right track for sorting it! Cheers again. I'll add an answer with this fix once I've investigated more, if I can figure out *why* this is happening. – Iain Stanford May 25 '18 at 14:18
  • @IainStanford Just food for thought: On `Xamarin.Android` I pipe the camera (video) output to a ffmpeg process that is extracting `bmp` data (fastest compared to jpg/png) to an output pipe in order to produce on-the-fly thumbnail timelines, format conversions, etc.. due to keeping this process out of both of the Java/CIL VMs, I can keep the UI updated at a 30+fps (on armv7 w/neon) and 50/60fps on arm64... – SushiHangover May 25 '18 at 19:07
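
For anyone landing here, this is a sketch of the workaround from the comment thread above: clear the 90-degree rotation hint on the track format before configuring the decoder ("rotation-degrees" is MediaFormat.KEY_ROTATION, API 23+), so the pipeline stops receiving frames stored as WxH while it expects HxW.

    // Sketch of the fix from the comments: zero out the rotation hint
    // before handing the track format to the decoder.
    MediaFormat format = extractor.GetTrackFormat(trackIndex);
    if (format.ContainsKey("rotation-degrees")) // MediaFormat.KeyRotation in the Xamarin bindings
    {
        format.SetInteger("rotation-degrees", 0);
    }
    decoder.Configure(format, surface, null, 0);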
