I have a problem creating an AudioInputStream from a Socket. Here are the important parts:
public class SoundStream extends Thread {

    private int port;
    private String IP;
    private Socket socket;
    private SoundObject soundObject;
    private OpenAL openAL;
    private Source source;
    private boolean run = true;

    public SoundStream(int port, String IP, SoundObject soundObject) {
        this.soundObject = soundObject;
        this.port = port;
        this.IP = IP;
    }

    @Override
    public void run() {
        try {
            this.socket = new Socket(this.IP, this.port);
            this.openAL = new OpenAL();
        } catch (Exception e) {
            e.printStackTrace();
        }
        this.mainCycleMethod();
    }

    private void mainCycleMethod() {
        while (run) {
            this.soundObject.blockAndWait();
            switch (this.soundObject.getAndResetEvent()) {
                case 0: // stop the thread and release the source, OpenAL and the socket
                    this.run = false;
                    this.close();
                    break;
                case 1: // apply the pitch currently stored in soundObject
                    this.setPitch();
                    break;
                case 2: // drop the current source and start playing from the socket again
                    this.closeSource();
                    this.play();
                    break;
                case 3: // pause playback
                    this.pause(true);
                    break;
                case 4: // resume playback
                    this.pause(false);
                    break;
            }
        }
    }

    private BufferedInputStream getInputStream() throws Exception {
        return new BufferedInputStream(socket.getInputStream());
    }

    private void setPitch() {
        if (this.source != null) {
            try {
                this.source.setPitch(this.soundObject.getPitch());
            } catch (ALException e) {
                e.printStackTrace();
            }
        }
    }

    private void play() {
        try {
            AudioInputStream audioInputStream = new AudioInputStream(this.getInputStream(), this.soundObject.getAudioFormat(), AudioSystem.NOT_SPECIFIED);
            // AudioInputStream audioInputStream_tmp = AudioSystem.getAudioInputStream(this.getInputStream());
            // AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(this.soundObject.getAudioFormat(), audioInputStream_tmp);
            this.source = openAL.createSource(audioInputStream);
            this.source.setGain(1f);
            this.source.play();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }

    private void close() {
        this.closeSource();
        this.openAL.close();
        try {
            this.socket.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void closeSource() {
        if (this.source != null) {
            this.source.close();
        }
    }

    private void pause(boolean pause) {
        if (this.source != null) {
            try {
                if (pause) {
                    this.source.pause();
                } else {
                    this.source.play();
                }
            } catch (ALException ex) {
                ex.printStackTrace();
            }
        }
    }
}
public class SoundObject extends AbstractEventObject {

    public AudioFormat getAudioFormat() {
        boolean signed = false;    // true, false
        boolean bigEndian = false; // true, false
        return new AudioFormat(this.frequency, this.bits, this.channels, signed, bigEndian);
    }

    ...
}
The commented-out code throws an UnsupportedAudioFileException at this line:
AudioInputStream audioInputStream_tmp = AudioSystem.getAudioInputStream(this.getInputStream());
However, when I use this code:
AudioInputStream audioInputStream = new AudioInputStream(this.getInputStream(), this.soundObject.getAudioFormat(), 100000);
it plays the sound, but only after it has loaded those 100000 sample frames into the AudioInputStream. Once it has played all 100000 frames, it finishes.
I guess I could solve this issue if I were able to pass the AudioFormat directly as a parameter during the first AudioInputStream initialization, but that doesn't seem to be possible. I'm receiving the audio format specifications from the server.
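Just to illustrate what I mean by receiving the specifications, it is something along these lines; the three-int header here is only a placeholder for the actual protocol, and it reuses the socket from the code above:

DataInputStream header = new DataInputStream(socket.getInputStream());
float frequency = header.readInt(); // e.g. 44100 (placeholder wire format, not necessarily what the server sends)
int bits = header.readInt();        // e.g. 16
int channels = header.readInt();    // e.g. 1
// signed and bigEndian hard-coded to false, as in getAudioFormat() above
AudioFormat format = new AudioFormat(frequency, bits, channels, false, false);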
I think one possible solution would be to create a data line that I can pass as a parameter to the AudioInputStream constructor. However, I'm not sure how to get the data from the socket directly into the data line. I know of a solution that uses an infinite loop which reads the data and writes it to the data line, but that seems wasteful. Is there a more direct approach?
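By the loop solution I mean roughly the following pump, sketched here with a plain javax.sound.sampled SourceDataLine and the socket stream from the code above (exception handling omitted):

AudioFormat format = this.soundObject.getAudioFormat();
SourceDataLine line = AudioSystem.getSourceDataLine(format);
line.open(format);
line.start();

byte[] buffer = new byte[4096];
int read;
InputStream in = this.getInputStream();
// Block on the socket and copy every chunk into the data line by hand.
while ((read = in.read(buffer)) != -1) {
    line.write(buffer, 0, read);
}
line.drain();
line.close();

That would mean copying every buffer myself, and it also bypasses openAL entirely, so I would lose the pitch control it gives me.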
I hope it's possible to solve this using the java-openAL library, because I need to change the playback speed and I hope I won't have to implement that myself.
Thanks