
I'm trying to rewrite some (very simple) Android code I found, written in Java, as a static HTML5 app (I don't need a server to do anything, and I'd like to keep it that way). I have an extensive background in web development, but only a basic understanding of Java, and even less knowledge of Android development.

The only function of the app is to take some numbers and convert them into an audio chirp from bytes. I have absolutely no problem translating the mathematical logic into JS. Where I'm having trouble is when it gets to actually producing the sound. These are the relevant parts of the original code:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// later in the code:

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, minBufferSize, AudioTrack.MODE_STATIC);

// some math, and then:

track.write(sound, 0, sound.length); // sound is an array of bytes

How do I do this in JS? I can use a data URI to produce the sound from the bytes, but does that allow me to control the other information here (the sample rate, etc.)? In other words: what's the simplest, most accurate way to do this in JS?

Update

I have been trying to replicate what I found in this answer (stackoverflow.com/questions/24151121/how-to-play-wav-audio-byte-array-via-javascript-html5). This is the relevant part of my code:

window.onload = init;
var context;    // Audio context
var buf;        // Audio buffer

function init() {
    if (!window.AudioContext) {
        if (!window.webkitAudioContext) {
            alert("Your browser does not support any AudioContext and cannot play back this audio.");
            return;
        }
        window.AudioContext = window.webkitAudioContext;
    }

    context = new AudioContext();
}

function playByteArray( bytes ) {
    var buffer = new Uint8Array( bytes.length );
    buffer.set( new Uint8Array(bytes), 0 );

    context.decodeAudioData(buffer.buffer, play);
}

function play( audioBuffer ) {
    var source    = context.createBufferSource();
    source.buffer = audioBuffer;
    source.connect( context.destination );
    source.start(0);
}

However, when I run this I get this error:

Uncaught (in promise) DOMException: Unable to decode audio data

Which I find quite extraordinary, as it's such a general error that it manages to beautifully tell me exactly squat about what is wrong. Even more surprising: when I debugged this step by step, even though the chain of errors starts (expectedly) at the line context.decodeAudioData(buffer.buffer, play);, it actually runs through a few more lines within the jQuery file (3.2.1, uncompressed), going through lines 5208, 5195, 5191, 5219, 5223 and lastly 5015 before erroring out. I have no clue why jQuery has anything to do with it, and the error gives me no idea what to try. Any ideas?

yuvi
  • update: I tried working in this solution http://stackoverflow.com/questions/24151121/how-to-play-wav-audio-byte-array-via-javascript-html5 but I get `Uncaught (in promise) DOMException: Unable to decode audio data`, and I really don't understand the basics of bytes translating into sounds to know why – yuvi Mar 30 '17 at 20:12
  • I think you may be looking for the web audio api [AudioBuffer](https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer). I don't have enough experience with this to provide a complete answer, but I figured I'd comment just in case. – aberkow Mar 31 '17 at 14:06
  • Should `var buffer = new Uint8Array( bytes.length )` be `var buffer = new ArrayBuffer( bytes.length )`? Why is `Uint8Array` necessary? What is `bytes`? – guest271314 Apr 09 '17 at 04:54

2 Answers


If bytes is an ArrayBuffer, it is not necessary to create a Uint8Array from it. You can pass the ArrayBuffer bytes directly as the parameter to AudioContext.decodeAudioData(), which returns a Promise; chain .then() to the .decodeAudioData() call, with the play function as its parameter.

In the JavaScript at the stack snippet below, an <input type="file"> element is used to accept an upload of an audio file, FileReader.prototype.readAsArrayBuffer() creates an ArrayBuffer from the File object, and that ArrayBuffer is passed to playByteArray.

window.onload = init;
var context; // Audio context
var buf; // Audio buffer
var reader = new FileReader(); // to create `ArrayBuffer` from `File`

function init() {
  if (!window.AudioContext) {
    if (!window.webkitAudioContext) {
      alert("Your browser does not support any AudioContext and cannot play back this audio.");
      return;
    }
    window.AudioContext = window.webkitAudioContext;
  }

  context = new AudioContext();
}

function handleFile(file) {
  console.log(file);
  reader.onload = function() {
    console.log(reader.result instanceof ArrayBuffer);
    playByteArray(reader.result); // pass `ArrayBuffer` to `playByteArray`
  }
  reader.readAsArrayBuffer(file);
};

function playByteArray(bytes) {
  context.decodeAudioData(bytes)
  .then(play)
  .catch(function(err) {
    console.error(err);
  });
}

function play(audioBuffer) {
  var source = context.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start(0);
}
<input type="file" accepts="audio/*" onchange="handleFile(this.files[0])" />
guest271314
  • I'm not loading any audio file, I'm creating a byte sequence out of a calculation. Anyway, I solved it myself. First I managed to produce sound once I figured out that since it's 16-bit sound, I actually needed to use a Float32Array (2 16-bit channels). I had to do an additional step after that, since the sound was distorted, and with the help of a knowledgeable friend we figured out an issue with the calculation itself (but that's beside the point of actually producing the sound). – yuvi Apr 14 '17 at 15:08
  • Anyway, I appreciate the effort you put in, so +1 your answer – yuvi Apr 14 '17 at 15:10
  • @yuvi Can you post your solution as an Answer? See http://stackoverflow.com/help/self-answer – guest271314 Apr 14 '17 at 15:14
  • Yeah, I plan on doing that this weekend, when I find the time – yuvi Apr 14 '17 at 15:15

I solved it myself. I read more of the MDN docs explaining AudioBuffer and realized two important things:

  1. I didn't need decodeAudioData (since I'm creating the data myself, there's nothing to decode). I had actually taken that bit from the answer I was replicating, and in retrospect it was entirely needless.
  2. Since I'm working with 16-bit PCM stereo (2 channels, each 16 bits), and Web Audio buffers hold 32-bit float samples, I needed to convert my data into a Float32Array.

Granted, I still had a problem with some of my calculations that resulted in a distorted sound, but as far as producing the sound itself goes, I ended up with this really simple solution:

function playBytes(bytes) {
    // Convert the 16-bit PCM samples into 32-bit floats in [-1, 1]
    var floats = new Float32Array(bytes.length);

    bytes.forEach(function( sample, i ) {
        floats[i] = sample / 32767;
    });

    // A single mono channel at 48kHz
    var buffer = context.createBuffer(1, floats.length, 48000),
        source = context.createBufferSource();

    buffer.getChannelData(0).set(floats);
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(0);
}

I can probably optimize it a bit further: the division by 32767 should happen earlier, in the part where I'm calculating the data, for example (see the sketch below). Also, I'm creating a Float32Array with two channels and then outputting only one of them, since I really don't need both. I couldn't figure out whether there's a way to create a one-channel mono buffer from an Int16Array, or whether that's even necessary/better.
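
Here is a minimal sketch of that first optimization, assuming the samples are produced in a loop; generateSample is a hypothetical stand-in for the actual chirp math and is assumed to return values already normalized to [-1, 1]:

function playGenerated(context, numSamples, sampleRate) {
    var buffer = context.createBuffer(1, numSamples, sampleRate), // mono buffer
        data   = buffer.getChannelData(0),                        // its Float32Array
        source = context.createBufferSource();

    // Write normalized floats directly: no Int16 intermediate, no division pass
    for (var i = 0; i < numSamples; i++) {
        data[i] = generateSample(i / sampleRate); // placeholder for the chirp math
    }

    source.buffer = buffer;
    source.connect(context.destination);
    source.start(0);
}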

Anyway, that's essentially it. It's really just the most basic solution, with some minimal understanding on my part of how to handle my data correctly. Hope this helps anyone out there.

yuvi