
This is my first question so please let me know if I missed anything!

I'm using the MediaCodec API introduced in Android API 16 to decode a video so that I can send frames to be applied as a texture (the texture part is already done). I've come up with the following code with some help from Stack Overflow, but in runOutputBuffer() my outputBufIndex comes back as -1 (or the call blocks indefinitely, since I've passed -1 as the timeout). Can anyone help with this, and/or offer advice on where to go from there?

Thanks for your help and here is my code:

public MediaDecoder( BPRenderView bpview )
{

    surface = bpview;
    extractor = new MediaExtractor( );
    extractor.setDataSource( filePath );
    format = extractor.getTrackFormat( 0 );
    mime = format.getString( MediaFormat.KEY_MIME );
    createDecoder( );
    runInputBuffer( );

}

public void createDecoder( )
{

    codec = MediaCodec.createDecoderByType( "video/avc" );
    // format =extractor.getTrackFormat( 0 );
    Log.d( LOG_TAG, "Track Format: " + mime );
    // format.setInteger( MediaFormat.KEY_BIT_RATE, 125000 );
    // format.setInteger( MediaFormat.KEY_FRAME_RATE, 15 );
    // format.setInteger( MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar );
    // format.setInteger( MediaFormat.KEY_I_FRAME_INTERVAL, 5 );
    codec.configure( format, null, null, 0 );
    codec.start( );

    codecInputBuffers = codec.getInputBuffers( );
    codecOutputBuffers = codec.getOutputBuffers( );
    extractor.selectTrack( 0 );
}

public void runInputBuffer( )
{
    // This should take in the entire video and put it in the input buffer
    int inputBufIndex = codec.dequeueInputBuffer( -1 );
    if( inputBufIndex >= 0 )
    {
        ByteBuffer dstBuf = codecInputBuffers[ inputBufIndex ];

        int sampleSize = extractor.readSampleData( dstBuf, 0 );
        Log.d( "Sample Size", String.valueOf( sampleSize ) );
        long presentationTimeUs = 0;
        if( sampleSize < 0 )
        {
            sawInputEOS = true;
            sampleSize = 0;
        }
        else
        {
            presentationTimeUs = extractor.getSampleTime( );
        }
        Log.d( LOG_TAG, "Input Buffer" );
        Log.d( "InputBufIndex:", String.valueOf( inputBufIndex ) );
        Log.d( "PresentationTimeUS", String.valueOf( presentationTimeUs ) );
        codec.queueInputBuffer( inputBufIndex, 0, // offset
                sampleSize, presentationTimeUs, sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0 );
        if( !sawInputEOS )
        {
            Log.d( "Extractor", " Advancing" );
            extractor.advance( );

        }
    }
    runOutputBuffer( );
}

public void runOutputBuffer( )
{
    BufferInfo info = new BufferInfo( );

    final int res = codec.dequeueOutputBuffer( info, -1 );

    Log.d( "RES: ", String.valueOf( res ) );
    if( res >= 0 )
    {
        int outputBufIndex = res;
        ByteBuffer buf = codecOutputBuffers[ outputBufIndex ];
        final byte[ ] chunk = new byte[ info.size ];
        buf.get( chunk ); // Read the buffer all at once
        buf.clear( ); // ** MUST DO!!! OTHERWISE THE NEXT TIME YOU GET THIS SAME BUFFER BAD THINGS WILL HAPPEN

        if( chunk.length > 0 )
        {
            Log.d( "Chunk: ", String.valueOf( chunk.length ) );

            surface.setTexture( chunk, 320, 240 );

            // mAudioTrack.write( chunk, 0, chunk.length );
            // do the things
        }
        codec.releaseOutputBuffer( outputBufIndex, false /* render */);

        if( ( info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM ) != 0 )
        {
            sawOutputEOS = true;
        }
    }
    else if( res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED )
    {
        codecOutputBuffers = codec.getOutputBuffers( );
    }
    else if( res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED )
    {
        final MediaFormat oformat = codec.getOutputFormat( );
        Log.d( LOG_TAG, "Output format has changed to " + oformat );
        // mAudioTrack.setPlaybackRate( oformat.getInteger( MediaFormat.KEY_SAMPLE_RATE ) );
    }

}

}

James

1 Answer


James, welcome to Stack Overflow (as a questioner)!

I have toyed with the MediaCodec class; it is terribly limited and poorly documented. However, check out this pretty solid post (and the linked GitHub project) by Cedric Fung. His GitHub project should work out of the box on an API 17 (Jelly Bean 4.2)+ device.

I'm sure you can determine what you need to change from there, although as I alluded to before, you have limited flexibility here with the current level of the API.

Regarding your specific problem, I think you are locking the UI thread with your media decoder calls, which is not recommended. You should take a threaded approach, and rather than passing -1 as your timeout, use a finite timeout of, say, 10000 and allow the call to be retried until output is available.
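To illustrate the pattern, here is a minimal, self-contained sketch of "poll with a finite timeout instead of blocking forever". The SimulatedCodec class is a stand-in I made up for MediaCodec (which only runs on a device); on Android you would call codec.dequeueOutputBuffer(info, 10000) inside a worker thread the same way:

```java
public class DecodePollSketch {
    // Mirrors MediaCodec.INFO_TRY_AGAIN_LATER
    public static final int INFO_TRY_AGAIN_LATER = -1;

    // Stand-in for a decoder that needs a few polls before output is ready.
    public static class SimulatedCodec {
        private int pollsRemaining = 3;

        public int dequeueOutputBuffer(long timeoutUs) {
            // Returns "try again later" until output becomes available,
            // then returns buffer index 0.
            return (--pollsRemaining > 0) ? INFO_TRY_AGAIN_LATER : 0;
        }
    }

    // Poll with a short timeout until an output buffer index comes back,
    // rather than blocking indefinitely with a -1 timeout.
    public static int pollForOutput(SimulatedCodec codec, int maxAttempts) {
        for (int i = 0; i < maxAttempts; i++) {
            int index = codec.dequeueOutputBuffer(10000); // 10 ms timeout
            if (index >= 0) {
                return index; // a real frame: process it, then releaseOutputBuffer()
            }
            // INFO_TRY_AGAIN_LATER: loop and try again; the UI thread stays free
        }
        return INFO_TRY_AGAIN_LATER;
    }

    public static void main(String[] args) {
        int index = pollForOutput(new SimulatedCodec(), 10);
        System.out.println("got output buffer index: " + index);
    }
}
```

On a real device the same loop would also handle INFO_OUTPUT_BUFFERS_CHANGED and INFO_OUTPUT_FORMAT_CHANGED, as your runOutputBuffer() already does.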

Hope this helps (although it has been months since you asked the question)!

kOrc
  • Hi kOrc, thanks for your answer. Yeah, I have it fixed now, and it works very well, but there's more than a few problems with the class, particularly, as you say, with its flexibility. Fortunately our app is primarily for use on the S3, but if you were to use this class across several devices (and APIs) it's more or less useless given the number of codecs you have to allow for, especially when you try to do some post-processing on the data and you don't know what buffer format (YUV420 etc.) you will get back until you try it on the device! Madness! – James Mar 27 '13 at 15:59
  • Yup! For post-processing (one of the reasons I was exploring the MediaCodec class), a solution I got to work was to write the buffer straight to a surface (similar to Cedric's code), except to use a `TextureView` instead of a `SurfaceView` (I can direct you to code that lets you do that). Once you have that, you can use the TextureView's nifty "getBitmap" operation to fill an Android `Bitmap` object -- which has a known format that you can set, normally `ARGB_8888` -- which you can operate on. It's not as efficient or elegant, but can get the job done. – kOrc Apr 02 '13 at 21:51
  • 2
    MediaCodec has been improved a bit in 4.3 (API 18). Some sample code is linked from here: http://bigflake.com/mediacodec/ – fadden Jul 24 '13 at 19:27
  • kOrc can you please provide a link to the example that uses TextureView? – Nativ Nov 03 '13 at 14:51
  • 1
    Some additional examples, in an API 18 SDK app (including a video player that uses TextureView): https://github.com/google/grafika – fadden Dec 22 '13 at 17:11
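kOrc's TextureView suggestion in the comments above can be sketched roughly as follows. This is an Android-only, untested outline under the assumption that the decoder renders to a Surface backed by the TextureView; names like mTextureView and R.id.texture_view are placeholders:

    // Decode straight to a TextureView-backed Surface, then pull frames
    // back as Bitmaps in a known format (ARGB_8888) for post-processing.
    TextureView mTextureView = (TextureView) findViewById(R.id.texture_view);
    mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {
            Surface surface = new Surface(st);
            // configure the decoder to render directly to the TextureView
            codec.configure(format, surface, null, 0);
            codec.start();
        }

        @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) { }
        @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture st) {
            // called once per decoded frame; grab it in a known pixel format
            Bitmap frame = mTextureView.getBitmap();
            // ... post-process the Bitmap here ...
        }
    });

As kOrc notes, going through getBitmap() is not the most efficient path, but it sidesteps the per-device YUV buffer format problem entirely.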