
In this example (answer), How to get MFCC with TarsosDSP?, they show how to use MFCC in an Android @Test from a float array, roughly along the lines of the sketch below.
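
For reference, here is a rough, self-contained version of that float-array approach (my sketch, not code from the linked answer): it assumes the desktop/JVM build of TarsosDSP, where `AudioDispatcherFactory.fromFloatArray(...)` is available (the Android build ships a different factory class), the class name is arbitrary, and a dummy buffer of silence stands in for real test data.

    // Rough sketch of the float-array ("test style") usage from the linked answer.
    // Assumption: desktop/JVM build of TarsosDSP, where
    // be.tarsos.dsp.io.jvm.AudioDispatcherFactory.fromFloatArray(...) exists;
    // the Android build exposes a different factory class.
    import be.tarsos.dsp.AudioDispatcher;
    import be.tarsos.dsp.AudioEvent;
    import be.tarsos.dsp.AudioProcessor;
    import be.tarsos.dsp.io.jvm.AudioDispatcherFactory;
    import be.tarsos.dsp.mfcc.MFCC;

    import java.util.Arrays;

    public class MfccFromFloatArray {
        public static void main(String[] args) throws Exception {
            int sampleRate = 44100;
            int bufferSize = 8192;
            int bufferOverlap = 128;

            // Dummy input: one second of silence stands in for real test data.
            float[] audioSamples = new float[sampleRate];

            AudioDispatcher dispatcher = AudioDispatcherFactory.fromFloatArray(
                    audioSamples, sampleRate, bufferSize, bufferOverlap);
            final MFCC mfcc = new MFCC(bufferSize, sampleRate, 40, 50, 300, 3000);

            // The MFCC processor runs first on every frame...
            dispatcher.addAudioProcessor(mfcc);
            // ...so the next processor can read the coefficients it just computed.
            dispatcher.addAudioProcessor(new AudioProcessor() {
                @Override
                public boolean process(AudioEvent audioEvent) {
                    System.out.println(Arrays.toString(mfcc.getMFCC()));
                    return true;
                }

                @Override
                public void processingFinished() { }
            });

            // Blocking is fine here: there is no UI thread in a plain test/JVM context.
            dispatcher.run();
        }
    }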

I'm trying to use it with data from the microphone instead:

    int sampleRate = 44100;
    int bufferSize = 8192;
    int bufferOverlap = 128;
    final AudioDispatcher dispatcher = AudioDispatcherFactory.fromDefaultMicrophone(sampleRate, bufferSize, bufferOverlap);
    final MFCC mfcc = new MFCC(bufferSize, sampleRate, 40, 50, 300, 3000);
    dispatcher.addAudioProcessor(mfcc);
    dispatcher.addAudioProcessor(new AudioProcessor() {

        @Override
        public void processingFinished() {
        }

        @Override
        public boolean process(AudioEvent audioEvent) {
            float[] audioBuffer = audioEvent.getFloatBuffer();
            textView.setText(Arrays.toString(audioBuffer));
            return true;
        }
    });
    dispatcher.run();

I want to print the output, but nothing gets printed and I can't debug it. I need help with this.
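
Side note on debugging (my sketch, not part of the original attempt): logging from the audio thread instead of calling `textView.setText` there makes it visible in Logcat whether frames arrive at all, and running the dispatcher on a worker thread keeps the UI responsive. The `MFCC_DEBUG` tag and the thread name are arbitrary; the `dispatcher` and `mfcc` objects are the ones from the snippet above.

    // Debugging sketch (not from the original code): log from the audio thread instead
    // of touching the TextView, so output shows up in Logcat even if the UI never updates.
    // Needs: import android.util.Log;
    dispatcher.addAudioProcessor(new AudioProcessor() {

        @Override
        public boolean process(AudioEvent audioEvent) {
            float[] audioBuffer = audioEvent.getFloatBuffer();
            Log.d("MFCC_DEBUG", "frame received: " + audioBuffer.length + " samples");
            // mfcc was added to the dispatcher before this processor, so its
            // coefficients for the current frame are already available here.
            Log.d("MFCC_DEBUG", "coefficients: " + Arrays.toString(mfcc.getMFCC()));
            return true;
        }

        @Override
        public void processingFinished() {
            Log.d("MFCC_DEBUG", "processing finished");
        }
    });

    // dispatcher.run() blocks the thread it is called on; calling it directly on the
    // UI thread freezes the activity. A worker thread (as in the edits below) avoids that.
    new Thread(dispatcher, "Audio Dispatcher").start();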

EDIT:

After some struggle, I have changed the code to the following (but it is still not working):

    int sampleRate = 44100;
    int bufferSize = 8192;
    int bufferOverlap = 128;
    final AudioDispatcher dispatcher = AudioDispatcherFactory.fromDefaultMicrophone(sampleRate, bufferSize, bufferOverlap);
    final MFCC mfcc = new MFCC(bufferSize, sampleRate, 40, 50, 300, 3000);
    dispatcher.addAudioProcessor(mfcc);
    dispatcher.addAudioProcessor(new AudioProcessor() {

        @Override
        public void processingFinished() {
            float audio_float[] = mfcc.getMFCC();
            textView.setText(Arrays.toString(audio_float));

        }

        @Override
        public boolean process(AudioEvent audioEvent) {
            mfcc.process(audioEvent);
            final float audio_float[] = mfcc.getMFCC();
            //textView.setText(Arrays.toString(audio_float));
            textView.setText("TESTING");
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    textView.setText(Arrays.toString(audio_float));

                }
            });
            return true;
        }
    });
    new Thread(dispatcher,"Audio MFCC").start();
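
One way to see why a version like this stays silent (a debugging sketch, not part of my original attempts): attach an uncaught-exception handler to the dispatcher thread before starting it, so that if a processor throws (for example because the TextView is touched from the audio thread) the reason ends up in Logcat under a recognizable tag. It reuses the same `dispatcher` as above.

    // Debugging sketch: surface exceptions from the dispatcher thread under a
    // recognizable Logcat tag (needs: import android.util.Log;).
    Thread audioThread = new Thread(dispatcher, "Audio MFCC");
    audioThread.setUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
        @Override
        public void uncaughtException(Thread t, Throwable e) {
            Log.e("MFCC_DEBUG", "audio thread died", e);
        }
    });
    audioThread.start();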

EDIT: Solved. The version below works; the difference from the previous attempt is that the TextView is no longer touched directly from the audio thread, and all updates go through `runOnUiThread`:

    dispatcher.addAudioProcessor(new AudioProcessor() {

        @Override
        public void processingFinished() {
            textView.setText("Finish");
        }

        @Override
        public boolean process(AudioEvent audioEvent) {

            mfcc.process(audioEvent);
            final float audio_float[] = mfcc.getMFCC();
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    textView.setText(Arrays.toString(audio_float));
                    textView.invalidate();
                }
            });
            return true;
        }
    });
    new Thread(dispatcher,"Audio MFCC").start();
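One more thing worth checking (an assumption on my side, not something covered in the snippets above): `fromDefaultMicrophone` records from the device microphone, so the app must declare `android.permission.RECORD_AUDIO` in the manifest and, on Android 6.0 and later, also request it at runtime; without the grant, recording fails or the buffers stay silent. A minimal sketch, where `startMfcc()` is a hypothetical helper wrapping the dispatcher/MFCC setup above and `REQUEST_RECORD_AUDIO` is an arbitrary request code:

    // Runtime permission check before starting the dispatcher.
    // Manifest also needs: <uses-permission android:name="android.permission.RECORD_AUDIO" />
    // Needs: android.Manifest, android.content.pm.PackageManager,
    //        androidx.core.app.ActivityCompat, androidx.core.content.ContextCompat
    private static final int REQUEST_RECORD_AUDIO = 1;

    private void startMfccIfPermitted() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
                == PackageManager.PERMISSION_GRANTED) {
            startMfcc(); // hypothetical helper containing the dispatcher/MFCC setup above
        } else {
            ActivityCompat.requestPermissions(this,
                    new String[]{Manifest.permission.RECORD_AUDIO}, REQUEST_RECORD_AUDIO);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions,
                                           int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode == REQUEST_RECORD_AUDIO
                && grantResults.length > 0
                && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            startMfcc();
        }
    }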
I think this link covers every answer related to audio processing, whether it is *pre-processing* or *post-processing*: [Android_Audio_Processing_Using_WebRTC](https://github.com/mail2chromium/Android-Audio-Processing-Using-WebRTC). You can also visit this reference: https://stackoverflow.com/a/58546599/10413749 – Muhammad Usman Bashir, Apr 07 '20 at 08:47
