
I am trying to use the Android NDK to develop a simple decoder/player application. I created a project using the Android SDK and then created a folder named jni in my project directory. Inside the jni directory I created an omx.cpp file, in which I want to write my own class that inherits from Android's MediaSource in stagefright. I have also included the stagefright header files in my project, and I am loading libstagefright.so with dlopen in omx.cpp.

The code I am using is as follows:

#include <jni.h>
#include <dlfcn.h>
#include <stdio.h>
#include <unistd.h>

#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/MetaData.h>
#include <media/stagefright/MediaDefs.h>
#include <media/stagefright/OMXClient.h>
#include <media/stagefright/OMXCodec.h>
#include <media/stagefright/MPEG4Writer.h>

using android::sp;

namespace android
{

class ImageSource : public MediaSource {
public:
    ImageSource(int width, int height, int colorFormat)
        : mWidth(width),
          mHeight(height),
          mColorFormat(colorFormat) {
    }

    int mWidth;
    int mHeight;
    int mColorFormat;

    virtual status_t start(MetaData *params = NULL) { return OK; }

    virtual status_t stop() { return OK; }

    // Returns the format of the data output by this media source.
    virtual sp<MetaData> getFormat() { return new MetaData; }

    virtual status_t read(
            MediaBuffer **buffer, const MediaSource::ReadOptions *options) {
        // Stub: no frames produced yet.
        *buffer = NULL;
        return ERROR_END_OF_STREAM;
    }

protected:
    // Refcounted objects must not be deleted directly, so keep the
    // destructor protected.
    virtual ~ImageSource() {}
};



extern "C" void Java_com_exampleomxvideodecoder_MainActivity(JNIEnv *env, jobject obj, jobject surface)
{

    void *dlhandle;

    dlhandle = dlopen("d:\\libstagefright.so", RTLD_NOW);
    if (dlhandle == NULL) {
        printf("Service Not Found: %s\n", dlerror());
    }
    int width = 720;
    int height = 480;
    int colorFormat = 0;
    // These constants were undefined in the original snippet; the values
    // below are only examples.
    int kFramerate = 30;
    int kVideoBitRate = 512 * 1024;
    int kIFramesIntervalSec = 1;
    int64_t kDurationUs = 10 * 1000000LL;  // 10 seconds

    sp<MediaSource> img_source = new ImageSource(width, height, colorFormat);

    sp<MetaData> enc_meta = new MetaData;
   // enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_H263);
   // enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_MPEG4);
   enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
   enc_meta->setInt32(kKeyWidth, width);
   enc_meta->setInt32(kKeyHeight, height);
   enc_meta->setInt32(kKeyFrameRate, kFramerate);
   enc_meta->setInt32(kKeyBitRate, kVideoBitRate);
   enc_meta->setInt32(kKeyStride, width);
   enc_meta->setInt32(kKeySliceHeight, height);
   enc_meta->setInt32(kKeyIFramesInterval, kIFramesIntervalSec);
   enc_meta->setInt32(kKeyColorFormat, colorFormat);


   OMXClient client;
   client.connect();

   // true => create an encoder for this source
   sp<MediaSource> encoder =
       OMXCodec::Create(
               client.interface(), enc_meta, true, img_source);

   sp<MPEG4Writer> writer = new MPEG4Writer("/sdcard/screenshot.mp4");
   writer->addSource(encoder);

   // you can add an audio source here if you want to encode audio as well

   // sp<MediaSource> audioEncoder =
   //     OMXCodec::Create(client.interface(), encMetaAudio, true, audioSource);
   // writer->addSource(audioEncoder);

   writer->setMaxFileDuration(kDurationUs);
   CHECK_EQ(OK, writer->start());
   while (!writer->reachedEOS()) {
       fprintf(stderr, ".");
       usleep(100000);
   }
   status_t err = writer->stop();
}
}

I have the following doubts:

1. In the JNI function, is it okay to create class objects and use them to call functions of, say, the MediaSource class, or do we have to create separate .cpp and .h files? If we use separate files, how do we call/reference them from the JNI function?

2. Is making our own wrapper class that inherits from MediaSource the right approach, or is there another way?

Basically I want to make an application that takes an .mp4/.avi file, demuxes it to separate audio and video, then decodes and renders/plays it using Android stagefright and OpenMAX only.

If ffmpeg is suggested for the source and demuxing, how do I integrate it with the Android stagefright framework?

Regards,

Mayank Agarwal
  • Please share some logs and your `Android.mk` or `makefile` and some sources that shows how you are employing `MediaSource` in your code. Based on these information, we can help you. – Ganesh Nov 17 '13 at 14:49

1 Answer


To answer your first question: yes, it is possible to define a class in a source file and instantiate it in a function below. A very good example of such an implementation is the DummySource of the recordvideo utility, which can be found in the cmds directory.

However, your file should include the MediaSource.h header, either directly or indirectly, as is done in the aforementioned example too.

The second question is more of an implementation choice. For some developers, defining a new class that inherits from MediaSource is the right way, as you have tried in your example.

There is an alternative implementation where you create the source and assign it to a MediaSource strong pointer, as shown in the example below:

sp<MediaSource> mVideoSource;
mVideoSource = new ImageSource(width, height, colorFormat);

where ImageSource implements the start and read methods. I feel the recordvideo example above is a good reference.

Regarding the last paragraph: I will respond on your other query, but I feel there is a fundamental mismatch between your objective and your code. The objective is to create a parser (a MediaExtractor) and a corresponding decoder, but the code above instantiates an ImageSource, which I presume produces YUV frames, and creates an encoder, since you are passing true for encoder creation.
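For reference, the decode direction looks roughly like this: a pseudocode-level sketch against the pre-MediaCodec OMXCodec API, with error handling omitted and signatures that vary across Android releases:

```cpp
// Sketch only: demux with MediaExtractor, then create a *decoder*
// by passing false to OMXCodec::Create.
sp<DataSource> file = DataSource::CreateFromURI("/sdcard/input.mp4");
sp<MediaExtractor> extractor = MediaExtractor::Create(file);

// Pick the first video track by mime type.
sp<MediaSource> track;
for (size_t i = 0; i < extractor->countTracks(); ++i) {
    const char *mime;
    extractor->getTrackMetaData(i)->findCString(kKeyMIMEType, &mime);
    if (!strncmp(mime, "video/", 6)) {
        track = extractor->getTrack(i);
        break;
    }
}

OMXClient client;
client.connect();

// false => decoder (your code passes true, which creates an encoder)
sp<MediaSource> decoder = OMXCodec::Create(
        client.interface(), track->getFormat(), false /*createEncoder*/, track);
```

From there, decoder->start() followed by repeated decoder->read() calls pulls decoded frames, which your player would then hand to a renderer.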

I will also add further comments on the NDK possibilities on the other thread.

Ganesh
  • How to integrate an omx codec component as a hardware component in Android code? Also, is it possible to query at the application level whether the framework is using an omx software or hardware component as the decoder? – Mayank Agarwal Feb 16 '14 at 12:20
  • @MayankAgarwal.. To add any `OMX` component, please check this thread: http://stackoverflow.com/questions/15334509/android-how-to-integrate-a-decoder-to-multimedia-framework/15343639#15343639 . When you refer to an application, can you please be more specific which application you are looking at? For ex: In case of example command line utilities like `stagefright`, there is a methodology to prefer the software codec as here:http://androidxref.com/4.4.2_r1/xref/frameworks/av/cmds/stagefright/stagefright.cpp#600 . Can you please post a different ques, if this doesn't answer your current query? – Ganesh Feb 16 '14 at 13:08
  • Currently I have Android open source code for a multimedia product, and the project uses its own customised media player, far better than stagefright; that player selection is done in MediaPlayerService.cpp (Android also does the same thing), but instead of STAGEFRIGHT_PLAYER it creates its own player. If I remove that customisation then an m3u8 file (Android uses NuPlayer for playing m3u8 files) does not play, because of a crash in the OMX software decoder. Now how can I change this omx software decoder to some other decoder which can play the m3u8 file? – Mayank Agarwal Feb 17 '14 at 08:16
  • @MayankAgarwal.. In your `media_codecs.xml` file, ensure that your preferred decoder is the first one to be present in the file for a given `mime` type. This way, there will not be fallback to the software decoder. Usually, the flow falls back to SW decoder only if it couldn't find any other codec type. Which codec are you having an issue with?? – Ganesh Feb 17 '14 at 08:27
  • Is there any way i can select the decoder component at application level and give it to omxcodec.cpp through jni – Mayank Agarwal Feb 17 '14 at 08:40
  • @MayankAgarwal.. Are you using `MediaCodec` or `MediaPlayer`? If you are using `MediaPlayer`, you can't configure the component i.e. `OMXCodec` below. By standard `AOSP` methodology, if your codec is the first one in the list of codecs, it should get selected. If you want to really customize, then you can try one experiment. In your codec's entry in `media_codecs.xml`, add a new custom mime-type as another role and while creating `MediaCodec`, provide this mime type. This will help to ensure that your preferred codec is only always created. – Ganesh Feb 18 '14 at 03:58