I am developing a library for video/photo processing (adding filters like Instagram/Snapchat). So far the core features work very well.
However, I am finding the processing of a video (re-encoding an input video) hugely frustrating: there seem to be a number of edge cases and device-specific issues that prevent the library from working 100% of the time.
I would like to know how to select/create a MediaFormat that will work on a device.
Currently, I'm setting up the MediaFormat that will be used to encode a video as follows:
// assume that "extractor" is a media extractor wrapper, which holds a
// reference to the MediaFormat of the input video
fun getOutputVideoFormat(): MediaFormat {
val mimeType = MediaFormat.MIMETYPE_VIDEO_H263
var width = -1
var height = -1
var frameRate = 30
var bitrate = 10_000_000
val colorFormat = MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
if (extractor.videoFormat.containsKey(MediaFormat.KEY_WIDTH)) {
width = extractor.videoFormat.getInteger(MediaFormat.KEY_WIDTH)
}
if (extractor.videoFormat.containsKey(MediaFormat.KEY_HEIGHT)) {
height = extractor.videoFormat.getInteger(MediaFormat.KEY_HEIGHT)
}
if(extractor.videoFormat.containsKey(MediaFormat.KEY_FRAME_RATE)){
frameRate = extractor.videoFormat.getInteger(MediaFormat.KEY_FRAME_RATE)
}
if(extractor.videoFormat.containsKey(MediaFormat.KEY_BIT_RATE)){
bitrate = extractor.videoFormat.getInteger(MediaFormat.KEY_BIT_RATE)
}
val format = MediaFormat.createVideoFormat(mimeType, width, height)
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat)
format.setInteger(MediaFormat.KEY_BIT_RATE, bitrate)
format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
format.setInteger(MediaFormat.KEY_CAPTURE_RATE, frameRate)
// prevent crash on some Samsung devices
// http://stackoverflow.com/questions/21284874/illegal-state-exception-when-calling-mediacodec-configure?answertab=votes#tab-top
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height)
format.setInteger(MediaFormat.KEY_MAX_WIDTH, width)
format.setInteger(MediaFormat.KEY_MAX_HEIGHT, height)
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 0)
return format
}
So far this works on all of the major devices I have tested, but some devices, such as the Samsung A5, have been reported to fail silently with this format: they produce corrupted output videos from an input video that is processed correctly on every other device.
How can I tell if a MediaFormat will actually succeed on a given device?
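For reference, the closest API I have found for this kind of check is MediaCodecList.findEncoderForFormat, which returns the name of an encoder that claims to support a given format (or null if there is none). Below is a minimal sketch of the check I have in mind; note that it only consults the advertised capabilities, and the frame-rate juggling works around the documented API 21 restriction that the format passed to findEncoderForFormat must not contain a frame rate:

import android.media.MediaCodecList
import android.media.MediaFormat

// Returns true if some encoder on this device claims to support "format".
// This only checks advertised capabilities; it does not guarantee that
// encoding will actually succeed (or produce valid output) on the device.
fun hasEncoderFor(format: MediaFormat): Boolean {
    val codecList = MediaCodecList(MediaCodecList.REGULAR_CODECS)
    // Per the docs, on API 21 the format passed to findEncoderForFormat must
    // not contain a frame rate, so clear it temporarily and restore it after.
    val frameRate = if (format.containsKey(MediaFormat.KEY_FRAME_RATE)) {
        format.getInteger(MediaFormat.KEY_FRAME_RATE)
    } else {
        null
    }
    format.setString(MediaFormat.KEY_FRAME_RATE, null)
    val encoderName: String? = codecList.findEncoderForFormat(format)
    if (frameRate != null) {
        format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
    }
    return encoderName != null
}

Even where this returns a non-null encoder name, the Samsung A5 behaviour above suggests the advertised capabilities are not always trustworthy, which is exactly what I am trying to pin down.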
The only logs I have from the Samsung A5 indicate that when the MediaCodec reports the "INFO_OUTPUT_FORMAT_CHANGED" signal, the following media format is returned:
csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8],
mime=video/avc,
frame-rate=30,
remained_resource=2549760,
height=480,
width=480,
max_capacity=3010560,
what=1869968451,
bitrate=10000000,
csd-0=java.nio.ByteArrayBuffer[position=0,limit=17,capacity=17]
This format seems invalid to me, given that the input video has a resolution of 1280x720, while the reported output is only 480x480.
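To narrow this down further, my plan is to dump the advertised VideoCapabilities of every encoder for the target mime type and compare them against the input resolution. A rough sketch of that (the function name and log tag are just placeholders):

import android.media.MediaCodecList
import android.util.Log

// Logs the advertised video capabilities of every encoder for the given mime
// type, so I can check whether 1280x720 at 30 fps is actually within the
// supported ranges on a particular device.
fun logEncoderCapabilities(mimeType: String) {
    val codecInfos = MediaCodecList(MediaCodecList.REGULAR_CODECS).codecInfos
    for (info in codecInfos) {
        if (!info.isEncoder) continue
        if (info.supportedTypes.none { it.equals(mimeType, ignoreCase = true) }) continue
        val video = info.getCapabilitiesForType(mimeType).videoCapabilities
        Log.d(
            "CapabilityCheck",
            "${info.name}: widths=${video.supportedWidths}, " +
                "heights=${video.supportedHeights}, " +
                "bitrates=${video.bitrateRange}, " +
                "1280x720@30 supported=${video.areSizeAndRateSupported(1280, 720, 30.0)}"
        )
    }
}

But even if the reported ranges include 1280x720, that would not explain the 480x480 output, so I am still looking for a reliable way to validate a MediaFormat up front.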