
I am trying to write two data buffers into one audio file as separate channels using the Java Sound API. I have found ways to output mono (single-channel) audio, but that's not what I'm looking for. I also have no idea which audio format I should use (WAV, MP3, etc.). My two data buffers are byte arrays with values ranging from -127 to +127.
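For example, my buffers might look something like this (simplified; my real data comes from elsewhere):

byte[] left = new byte[44100];
byte[] right = new byte[44100];
for (int i = 0; i < left.length; i++) {
    // e.g. a 440 Hz tone on the left, 660 Hz on the right, at 44.1 kHz
    left[i]  = (byte) (127 * Math.sin(2 * Math.PI * 440 * i / 44100.0));
    right[i] = (byte) (127 * Math.sin(2 * Math.PI * 660 * i / 44100.0));
}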

Astavie
  • To clarify, are you playing back the sound via the Java app, or exporting/writing sound files? When you say separate channels, do you wish to put the data from buffer one in stereo left and buffer two into stereo right? Or are you talking about something else? – Phil Freihofner Nov 30 '16 at 21:21

1 Answer


Here's some sample code that shows you how to create a WAV file. MP3 is not really supported out of the box by Java, though there are libraries for that.

import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;

public class StereoOutput {

    public static void main(final String[] args) throws IOException {

        // left samples - typically more than 4!!
        final byte[] left = new byte[] {1, 2, 3, 4};
        // right samples
        final byte[] right = new byte[] {1, 2, 3, 4};

        final ByteArrayInputStream interleavedStream = createInterleavedStream(left, right);

        // audio format of the stream we created
        final AudioFormat audioFormat = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                44100f, // sample rate - you didn't specify, 44.1k is typical
                8,      // how many bits per sample, i.e. per value in your byte array
                2,      // you want two channels (stereo)
                2,      // number of bytes per frame (frame == a sample for each channel)
                44100f, // frame rate
                true);  // big-endian byte order (irrelevant for 8-bit samples)
        final int numberOfFrames = left.length; // one frame contains both a left and a right sample
        // wrap stream into AudioInputStream (data + format)
        final AudioInputStream audioStream = new AudioInputStream(interleavedStream, audioFormat, numberOfFrames);
        // write to WAV file
        AudioSystem.write(audioStream, AudioFileFormat.Type.WAVE, new File("out.wav"));
    }

    /**
     * Typically in PCM audio, left and right samples are interleaved.
     * I.e.: LR LR LR LR.
     * One LR is also called a "frame".
     *
     * @param left array with left samples
     * @param right array with right samples
     * @return stream that contains all samples in LR LR interleaved order
     */
    private static ByteArrayInputStream createInterleavedStream(final byte[] left, final byte[] right) {
        final byte[] interleaved = new byte[left.length + right.length];
        for (int i=0; i<left.length; i++) {
            interleaved[2*i] = left[i];
            interleaved[2*i+1] = right[i];
        }
        return new ByteArrayInputStream(interleaved);
    }
}
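Since the comments asked about playback: if you want to hear the data rather than (or in addition to) writing a file, you can send the same interleaved bytes to a SourceDataLine. Here's a minimal sketch, assuming the audioFormat and interleaved array from the example above (error handling omitted):

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

/**
 * Plays interleaved PCM bytes through the default output device.
 */
private static void play(final AudioFormat audioFormat, final byte[] interleaved)
        throws LineUnavailableException {
    final SourceDataLine line = AudioSystem.getSourceDataLine(audioFormat);
    line.open(audioFormat);
    line.start();
    line.write(interleaved, 0, interleaved.length); // blocks until all bytes are buffered
    line.drain(); // wait until everything has actually been played
    line.close();
}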

I recommend working your way through the Java Sound trail in the official Java tutorials and also checking out the API docs for classes like AudioFormat. If you're unfamiliar with PCM, read up on that too; it's essential to understanding digital sampled audio.
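As a quick sanity check, you can also read the file back and print the format and frame count that were actually written:

import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.File;

try (AudioInputStream in = AudioSystem.getAudioInputStream(new File("out.wav"))) {
    System.out.println(in.getFormat());      // e.g. PCM_SIGNED 44100.0 Hz, 8 bit, stereo, ...
    System.out.println(in.getFrameLength()); // should match left.length
}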

Hendrik