This question is probably not even going to be asked correctly, but I promise you I'm doing my best. Here's the scenario: I wrote a little Java app that receives an audio stream from a server. When I redirect the binary stream to a file and pipe that file to mplayer, the audio plays correctly. What I want now, though, is to play the audio from my own app. Here's what I've got so far:
The codec mplayer uses to play the stream:
AUDIO: 22050 Hz, 2 ch, s16le, 0.0 kbit/0.00% (ratio: 0->88200)
Selected audio codec: [ffaac] afm: ffmpeg (FFmpeg AAC (MPEG-2/MPEG-4 Audio))
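For context, the dump-to-file step I mentioned is done along these lines (a simplified sketch; readPacket() and the file name are just placeholders for how my app actually receives the packets):

import java.io.FileOutputStream;
import java.io.IOException;

public class StreamDumper {

    // Writes every received packet to a file that I can later pipe to mplayer.
    public static void dumpToFile(String path) throws IOException {
        try (FileOutputStream out = new FileOutputStream(path)) {
            byte[] packet;
            // readPacket() is a placeholder for the real network code;
            // it returns null when the stream ends.
            while ((packet = readPacket()) != null) {
                out.write(packet);
            }
        }
    }

    // Placeholder: returns the next packet from the server, or null at end of stream.
    private static byte[] readPacket() {
        return null;
    }
}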
And here's the player class I've coded so far:
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

public class StreamPlayer {

    public final AudioFormat audioFormat;
    public final DataLine.Info info;
    public final SourceDataLine soundLine;

    public StreamPlayer() throws LineUnavailableException {
        // 22050 Hz, 16-bit samples, 2 channels, signed, big-endian
        audioFormat = new AudioFormat(22050, 16, 2, true, true);
        info = new DataLine.Info(SourceDataLine.class, audioFormat, 1500);
        soundLine = (SourceDataLine) AudioSystem.getLine(info);
    }

    public void startSoundLine() throws LineUnavailableException {
        soundLine.open(audioFormat);
        soundLine.start();
    }

    // Writes the raw bytes of one packet to the audio line.
    public void playStream(byte[] buffer) {
        soundLine.write(buffer, 0, buffer.length);
    }
}
I call playStream for every received packet. No errors are reported, and no sound is heard. Am I even close to doing this, or off by a long shot?
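For completeness, the calling side currently looks roughly like this (simplified; receiveAudioPacket() is just a stand-in for the real network code that hands me each packet):

import javax.sound.sampled.LineUnavailableException;

public class Receiver {

    public static void main(String[] args) throws LineUnavailableException {
        StreamPlayer player = new StreamPlayer();
        player.startSoundLine();

        byte[] packet;
        // receiveAudioPacket() is a placeholder for the real network code
        while ((packet = receiveAudioPacket()) != null) {
            player.playStream(packet);   // hand each packet straight to the source line
        }
    }

    // Placeholder: returns the next packet from the server, or null at end of stream.
    private static byte[] receiveAudioPacket() {
        return null;
    }
}

So the whole path is: open and start the line once, then write every packet to it as it arrives.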
P.S. I found some third-party libraries on Google, but I'd really like to keep them as a last resort.
Thank you!