6

My database contains a lot of short audio clips stored as Base64 strings. I want to play a clip when the button is clicked. I wrote the code below, but it doesn't work. (If possible, the file should not be written to storage, because that step adds a delay.)

playButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {

        try{
            String url = "data:audio/mp3;base64,"+base64FormattedString;
            MediaPlayer mediaPlayer = new MediaPlayer();
            mediaPlayer.setDataSource(url);
            mediaPlayer.prepare();
            mediaPlayer.start();
        }
        catch(Exception e){
            e.printStackTrace();
        }

    }
});

And the stacktrace is here: https://gist.github.com/AliAtes/aa46261aba3d755fbbd1eba300356a5f

ATES
  • what is `"data:audio/mp3;base64,"`? – pskink Mar 17 '17 at 09:45
  • It is the prefix for a Base64-encoded audio/mp3 file; base64FormattedString doesn't normally include it. – ATES Mar 17 '17 at 09:51
  • i mean it is not a valid data source from the point of view of `MediaPlayer` - `"path String: the path of the file, or the http/rtsp URL of the stream you want to play"` – pskink Mar 17 '17 at 09:57
  • Well, what is the right string? – ATES Mar 17 '17 at 10:52
  • "the path of the file, or the http/rtsp URL of the stream you want to play" – pskink Mar 17 '17 at 10:57
  • possible duplicate of http://stackoverflow.com/questions/1972027/android-playing-mp3-from-byte – user1516873 Mar 19 '17 at 12:16
  • there won't be any delay if you create a file with audio beforehand, for example when the user opens your app for the first time, instead of creating the file the moment before playing the sound – nandsito Mar 21 '17 at 13:24
  • the data:... prefix before base64 stream requires a player implementation that is aware of so-called "data url" syntax. Did you verify this? Did you also verify that the base64 stream can correctly be decoded to a .mp3 file? Is the audio codec supported by your player? – Christoph Bimminger Mar 25 '17 at 12:35
  • The data is already verified, because I am the one converting the mp3 to Base64. – ATES Mar 25 '17 at 13:33

2 Answers

7

Given your API-level constraint, you can use AudioTrack, and it's pretty easy: with AudioTrack you can play byte[] audio data since API level 3.

You just have to initialize your AudioTrack object with the characteristics of your audio samples (sample rate, channel configuration, encoding):

AudioTrack audioTrack = new AudioTrack(...);

and call:

audioTrack.play();

Then, decode your Base64 data:

byte[] data = Base64.decode(yourBase64AudioSample, Base64.DEFAULT);

and write it to the audioTrack for playback:

int iRes = audioTrack.write(data, 0, data.length);

and TADAAA :)

... Don't forget to release after playing:

audioTrack.release();
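
For reference, here is a minimal end-to-end sketch of the steps above (not the exact code from this answer). It assumes the decoded bytes are raw 16-bit PCM, since, as noted in the comments below, AudioTrack does not decode MP3 itself; the sample rate and channel configuration are placeholders you would replace with your clip's actual format.

// Uses android.media.AudioTrack, android.media.AudioFormat,
// android.media.AudioManager and android.util.Base64.

// Placeholder format values: replace with your clip's actual format.
int sampleRate = 44100;
int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
int encoding = AudioFormat.ENCODING_PCM_16BIT;

int bufferSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, encoding);

AudioTrack audioTrack = new AudioTrack(
        AudioManager.STREAM_MUSIC,   // stream type
        sampleRate,
        channelConfig,
        encoding,
        bufferSize,
        AudioTrack.MODE_STREAM);     // write() after play(), as described above

audioTrack.play();

// Decode the Base64 payload and hand the raw PCM bytes to the track.
byte[] data = Base64.decode(yourBase64AudioSample, Base64.DEFAULT);
int iRes = audioTrack.write(data, 0, data.length);   // blocks while the data is queued

// Release once playback has finished (called immediately here only for brevity).
audioTrack.release();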
N0un
  • Your code is perfect and it is working, but I couldn't adapt my mp3 to `AudioTrack()` correctly. I need one final bit of help :) (Here is all the information about the mp3: https://gist.github.com/AliAtes/be72817aa0ebea45acd6ef6840c279b1) – ATES Mar 24 '17 at 18:54
  • I'll see that tomorrow. You probably have to decode your MP3 file before playing! – N0un Mar 26 '17 at 20:13
  • After a bit of research, it appears that `AudioTrack` only supports PCM (an uncompressed audio format), while mp3 is a compressed audio format. The solution is to decode it before playing. This link has an example using `JLayer`: http://mindtherobot.com/blog/624/android-audio-play-an-mp3-file-on-an-audiotrack/ – N0un Mar 27 '17 at 07:15
  • More than 3 months later: do you still consider this answer as a wrong answer? – N0un Jul 04 '17 at 16:05
4

You should be able to achieve this by extending MediaDataSource and calling setDataSource with it as the parameter.

Maybe try something like this:

// first decode the data to bytes
final byte[] data = Base64.decode(base64FormattedString, Base64.DEFAULT);
mediaPlayer.setDataSource(new MediaDataSource() {
    @Override
    public long getSize() {
        return data.length;
    }

    @Override
    public int readAt(long position, byte[] buffer, int offset, int size) {
        long length = getSize();
        if (position >= length) return -1; // EOF
        if (position + size > length) // requested more than available
            size = (int) (length - position); // clamp to the bytes remaining
                                              // at the given position

        System.arraycopy(data, (int) position, buffer, offset, size);
        return size;
    }

    @Override
    public synchronized void close() throws IOException {
        // nothing to clean up; the byte[] simply becomes unreachable
    }
});

For more info on implementing a MediaDataSource, you can check out this article.
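
For completeness, a sketch of how the player could be driven after setting the data source, assuming the rest of the click handler from the question stays the same; prepareAsync() is used so the UI thread is not blocked:

mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();   // start playback once the player is ready
    }
});
mediaPlayer.prepareAsync();   // asynchronous; avoids blocking the UI thread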

Edit: This approach requires API level 23. Pre-23, it seems to be nearly impossible to provide the data for playback from a byte[] (or from anything but a URL or a file). The only thing I found was this answer, which provides the same code as shown in the question.
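
If you do need to stay on API level 19, the usual fallback (also suggested in the comments under the question) is to decode the Base64 data to a file once, e.g. into the app's cache directory, and point MediaPlayer at that path. A rough sketch, assuming a `context` reference is available; the file name is just illustrative:

// Uses java.io.File, java.io.FileOutputStream and android.util.Base64.
try {
    // Decode once (for example when the app starts) and reuse the cached file later.
    File cached = new File(context.getCacheDir(), "clip.mp3");   // illustrative name
    if (!cached.exists()) {
        byte[] data = Base64.decode(base64FormattedString, Base64.DEFAULT);
        FileOutputStream out = new FileOutputStream(cached);
        try {
            out.write(data);
        } finally {
            out.close();
        }
    }

    MediaPlayer mediaPlayer = new MediaPlayer();
    mediaPlayer.setDataSource(cached.getAbsolutePath());
    mediaPlayer.prepare();
    mediaPlayer.start();
} catch (IOException e) {
    e.printStackTrace();
}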

Edit 2: N0un has pointed out in his comment and answer that it is possible since API level 3 by using AudioTrack instead of MediaPlayer.

Leon
  • Thanks for your code, but `MediaDataSource` requires API level 23 and my project's API level is 19. Would it be possible to rework your code without `MediaDataSource`? – ATES Mar 19 '17 at 13:21
  • I did some more research and it seems to be impossible to provide data in the form of a `byte[]` without a `MediaDataSource`. The only thing that I found was [this answer](http://stackoverflow.com/a/39023706/5189673), but as it provides the same code you are using, it apparently does not work for you. If you want to stick with 19 as your API level, you could go with the approach of creating a file (maybe create the files early, e.g. when the app is first started, and check if that helps with the delay). The alternative would be to switch to 23 as your API level. – Leon Mar 19 '17 at 21:16
  • Yes, it is possible "pre-23 API". Indeed, it's possible since API 3 :) See my answer: this is the way I used in my app (it plays `byte[]` data received over the network using this method, so I confirm it works). – N0un Mar 24 '17 at 13:36