I need to generate audio on the fly (playing back pre-made wav or mp3 files is not an option). Luckily the new Web Audio API (Chrome 13; Firefox 4 offers the similar Audio Data API) provides this functionality.
I have some Java code I'm porting to JavaScript that looks like this:
byte[] buffer = new byte[]{ 56, -27, 88, -29, 88, -29, 88, -29 ............ };
AudioFormat af = new AudioFormat(44100, 16, 1, true, false); // 44.1 kHz, 16-bit, mono, signed, little-endian
SourceDataLine sdl = AudioSystem.getSourceDataLine(af);
sdl.open(af, 1470 * 4); //create 4 frame audio buffer
sdl.start();
sdl.write(buffer, 0, buffer.length);
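To make the data layout explicit: the Java byte[] holds signed 8-bit values, and per the AudioFormat above each pair of bytes forms one 16-bit little-endian signed sample. Here is how I understand that layout in JS terms (a sketch using the first few bytes of the buffer above):

```javascript
// The Java byte[] holds signed 8-bit values; per the AudioFormat,
// each pair of bytes forms one 16-bit little-endian signed sample.
var bytes = new Int8Array([56, -27, 88, -29]); // first bytes of the Java buffer
var view = new DataView(bytes.buffer);
var firstSample = view.getInt16(0, true); // true = little-endian read
```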
I'm trying to get this working with the Web Audio API, but the output is extremely distorted. Here is the code I'm using in JS:
var buffer = [ 56, -27, 88, -29, 88, -29, 88, -29 ............ ];
var ctx = new webkitAudioContext();
var src = ctx.createBufferSource();
src.buffer = ctx.createBuffer(1, buffer.length, 44100);
src.buffer.getChannelData(0).set(buffer);
src.connect(ctx.destination);
src.looping = false;
src.noteOn(0);
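One difference I suspect matters: getChannelData() returns a Float32Array whose samples are expected to lie in [-1, 1], while my buffer holds raw 16-bit little-endian byte pairs, so set(buffer) writes values with entirely the wrong meaning and range. A conversion sketch of what I think is needed (assuming my reading of the two formats is right):

```javascript
// Decode 16-bit little-endian signed PCM stored as signed bytes into
// Float32 samples normalized to [-1, 1], which Web Audio expects.
function bytesToFloat32(bytes) {
  var samples = new Float32Array(bytes.length / 2);
  for (var i = 0; i < samples.length; i++) {
    var lo = bytes[2 * i] & 0xff;      // low byte as unsigned
    var hi = bytes[2 * i + 1] & 0xff;  // high byte as unsigned
    var val = (hi << 8) | lo;          // unsigned 16-bit value
    if (val >= 0x8000) val -= 0x10000; // reinterpret as signed
    samples[i] = val / 32768;          // normalize to [-1, 1]
  }
  return samples;
}
```

So I would call src.buffer.getChannelData(0).set(bytesToFloat32(buffer)) instead of passing the bytes straight through — is that the right interpretation?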
Here is a .java file that I'm testing with: http://gwt-nes-port.googlecode.com/svn/wiki/webAudioTests/Main.java
And here is the .js file that I'm testing with: http://gwt-nes-port.googlecode.com/svn/wiki/webAudioTests/Main.js
Any tips on the differences between how javax.sound.sampled and the Web Audio API work, and what might be causing the distortion in my JS code?