I am trying to write a simple application that plays a sound and can change its volume at any point during playback. My approach is to convert each pair of bytes in the sound's byte array into an int (one 16-bit sample), multiply that int by the volume factor, and then write the result back as two bytes. However, this produces extreme distortion in the sound. Is it possible that I have got the bit shifting wrong? My sound format is:
.wav, 44100.0 Hz, 16-bit, little-endian
At the moment the byte array that I pass to the adjustVolume method represents a tenth of a second of audio data, i.e. sampleRate / 10.
Is there something I am missing that causes the distortion and keeps the volume from scaling properly? Have I got the conversion of the bytes back and forth wrong?
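In case the buffer sizing matters, this is how I work out the chunk length (assuming mono 16-bit PCM, so two bytes per sample; the variable names are just for illustration):

int samplesPerChunk = 44100 / 10;          // 4410 samples in a tenth of a second
int bytesPerChunk   = samplesPerChunk * 2; // 8820 bytes, always an even count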
private byte[] adjustVolume(byte[] audioSamples, double volume) {
    byte[] array = new byte[audioSamples.length];
    for (int i = 0; i < array.length; i += 2) {
        // convert byte pair to int
        int audioSample = (int) (((audioSamples[i + 1] & 0xff) << 8) | (audioSamples[i] & 0xff));
        audioSample = (int) (audioSample * volume);
        // convert back
        array[i] = (byte) audioSample;
        array[i + 1] = (byte) (audioSample >> 16);
    }
    return array;
}
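For reference, here is an alternative version I have been experimenting with: it treats each byte pair as a signed 16-bit value (via a short cast instead of masking the high byte), writes the high byte back with a shift of 8 rather than 16, and clamps the scaled value to the 16-bit range. This is only a sketch of what I think the conversion should look like, not code I know to be correct:

private byte[] adjustVolumeSigned(byte[] audioSamples, double volume) {
    byte[] out = new byte[audioSamples.length];
    for (int i = 0; i < out.length; i += 2) {
        // Reassemble as a signed 16-bit sample (little-endian): mask the low byte,
        // let the high byte keep its sign through the short cast.
        short sample = (short) ((audioSamples[i] & 0xff) | (audioSamples[i + 1] << 8));
        int scaled = (int) (sample * volume);
        // Clamp so values outside the 16-bit range do not wrap around and distort.
        if (scaled > Short.MAX_VALUE) scaled = Short.MAX_VALUE;
        if (scaled < Short.MIN_VALUE) scaled = Short.MIN_VALUE;
        // Write back low byte first, then the high byte shifted by 8 (not 16).
        out[i] = (byte) (scaled & 0xff);
        out[i + 1] = (byte) ((scaled >> 8) & 0xff);
    }
    return out;
}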
This code is based on Audio: Change Volume of samples in byte array, in which the asker is trying to do the same thing. However, having used the code from his question (which I think was not updated after he got his answer), I can't get it to work, and I am not exactly sure what it is doing.