
I have developed a system in which a C# program receives sound buffers (byte arrays) from another subsystem and is supposed to play them continuously. I searched the web and decided to use SoundPlayer. It works perfectly in offline mode (playing the buffers after they have all been received), but I have a problem in real-time mode.

In real-time mode, the program first waits to receive and accumulate a number of buffer arrays (for example 200), then adds a WAV header and plays the result. After that, however, for each subsequent group of 200 arrays it just plays the first buffer again.

I have read the following pages:

Play wav/mp3 from memory

https://social.msdn.microsoft.com/Forums/vstudio/en-US/8ac2847c-3e2f-458c-b8ff-533728e267e0/c-problems-with-mediasoundplayer?forum=netfxbcl

and, following their suggestions, I implemented my code as follows:

public class MediaPlayer
{
    System.Media.SoundPlayer soundPlayer;

    public MediaPlayer(byte[] buffer)
    {
        byte[] headerPlusBuffer = AddWaveHeader(buffer, false, 1, 16, 8000, buffer.Length / 2); // add a WAV header to the first buffer
        MemoryStream memoryStream = new MemoryStream(headerPlusBuffer, true);
        soundPlayer = new System.Media.SoundPlayer(memoryStream);
    }

    public void Play()
    {
        soundPlayer.PlaySync();
    }

    public void Play(byte[] buffer)
    {
        soundPlayer.Stream.Seek(0, SeekOrigin.Begin);
        soundPlayer.Stream.Write(buffer, 0, buffer.Length);
        soundPlayer.PlaySync();
    }
}
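
AddWaveHeader is my own helper that prepends a standard 44-byte PCM WAV header to the raw samples. A minimal sketch of such a helper (not my exact implementation; it assumes plain PCM and matches the parameters used in the constructor call above) looks like this:

public static byte[] AddWaveHeader(byte[] data, bool isFloat, short channels,
                                   short bitsPerSample, int sampleRate, int totalSamples)
{
    int byteRate = sampleRate * channels * bitsPerSample / 8;
    short blockAlign = (short)(channels * bitsPerSample / 8);

    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        // RIFF chunk descriptor
        writer.Write(System.Text.Encoding.ASCII.GetBytes("RIFF"));
        writer.Write(36 + data.Length);                       // remaining chunk size
        writer.Write(System.Text.Encoding.ASCII.GetBytes("WAVE"));

        // "fmt " sub-chunk
        writer.Write(System.Text.Encoding.ASCII.GetBytes("fmt "));
        writer.Write(16);                                     // sub-chunk size for PCM
        writer.Write((short)(isFloat ? 3 : 1));               // 1 = PCM, 3 = IEEE float
        writer.Write(channels);
        writer.Write(sampleRate);
        writer.Write(byteRate);
        writer.Write(blockAlign);
        writer.Write(bitsPerSample);

        // "data" sub-chunk (totalSamples is implied by data.Length here)
        writer.Write(System.Text.Encoding.ASCII.GetBytes("data"));
        writer.Write(data.Length);
        writer.Write(data);

        return ms.ToArray();
    }
}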

I use it like this:

MediaPlayer _mediaPlayer;
if (firstBuffer)
{
    _mediaPlayer = new MediaPlayer(dataRaw);
    _mediaPlayer.Play();
}
else
{
    _mediaPlayer.Play(dataRaw);
}

Each time _mediaPlayer.Play(dataRaw) is called, the first buffer is played again, even though dataRaw has been updated.

I appreciate your help.

  • When you have an API that's not designed for streaming playback, there's no good way to add streaming in a wrapper. Passing it a sequence of blocks will not achieve the correct time between blocks. You need an API designed for streaming. – Ben Voigt Jan 20 '17 at 17:05
  • Thank you Ben. So what streaming playback API do you recommend? – Yasser Mohseni Jan 20 '17 at 17:12
  • The one I've used is Media Foundation, but that's far too complex for your needs. Can you use a library? Try nAudio, you'll need the `BufferedWaveProvider` and `WaveOut` classes. – Ben Voigt Jan 20 '17 at 17:23
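
To illustrate the suggestion in the last comment, here is a minimal sketch of continuous playback built on NAudio's BufferedWaveProvider. It assumes the same format as the question (raw 16-bit mono 8 kHz PCM) and uses WaveOutEvent rather than WaveOut so it does not depend on a GUI message loop; treat it as a starting point, not a drop-in solution.

using System;
using NAudio.Wave;

public class StreamingPlayer : IDisposable
{
    private readonly BufferedWaveProvider _bufferProvider;
    private readonly WaveOutEvent _waveOut;

    public StreamingPlayer()
    {
        // Format assumed from the question: 8 kHz, 16-bit, mono PCM.
        _bufferProvider = new BufferedWaveProvider(new WaveFormat(8000, 16, 1))
        {
            BufferDuration = TimeSpan.FromSeconds(5),   // room for a few seconds of queued audio
            DiscardOnBufferOverflow = true              // drop samples instead of throwing if the queue fills up
        };

        _waveOut = new WaveOutEvent();
        _waveOut.Init(_bufferProvider);
        _waveOut.Play();   // outputs silence until samples arrive, then plays whatever is queued
    }

    // Call this for every incoming raw PCM buffer; playback stays continuous
    // as long as buffers arrive at least as fast as they are consumed.
    public void AddBuffer(byte[] dataRaw)
    {
        _bufferProvider.AddSamples(dataRaw, 0, dataRaw.Length);
    }

    public void Dispose()
    {
        _waveOut.Dispose();
    }
}

With this approach there is no need to prepend a WAV header or to treat the first buffer specially; every incoming dataRaw chunk is simply queued with AddBuffer and the device plays the queue back at the correct rate.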
