I have a UWP project in which I want to use the Windows.Media.Audio API to play a file. Instead of using an AudioFileInputNode, I want to stream the file myself so I can precisely control various timing properties.
I found the MediaStreamSource API and wrote the following code in an attempt to decode a 16-bit PCM, 2-channel .wav file:
public async Task&lt;Windows.Storage.Streams.Buffer&gt; GetBuffer()
{
    // Check whether the requested byte offset is still within the file
    if (byteOffset + BufferSize <= mssStream.Size)
    {
        inputStream = mssStream.GetInputStreamAt(byteOffset);

        // Create the MediaStreamSample; it could also be created
        // with MediaStreamSample.CreateFromBuffer(...)
        MediaStreamSample sample = await MediaStreamSample.CreateFromStreamAsync(inputStream, BufferSize, timeOffset);
        sample.Duration = sampleDuration;
        sample.KeyFrame = true;

        // Advance the byte and time offsets for the next sample
        byteOffset += BufferSize;
        timeOffset = timeOffset.Add(sampleDuration);

        return sample.Buffer;
    }
    else
    {
        return null;
    }
}
Instead of using the event system, I made a method that is called whenever my AudioFrameInputNode needs a new AudioFrame.
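For context, the consuming side currently looks roughly like this; SupplyNextFrameAsync, frameInputNode and CopyBufferIntoFrame are placeholder names for my actual plumbing:

// Called whenever the AudioFrameInputNode needs another frame
private async Task SupplyNextFrameAsync()
{
    Windows.Storage.Streams.Buffer buffer = await GetBuffer();
    if (buffer == null)
    {
        return; // reached the end of the file
    }

    // Hypothetical helper that copies the raw bytes into an AudioFrame
    AudioFrame frame = CopyBufferIntoFrame(buffer);
    frameInputNode.AddFrame(frame);
}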
Now it seems that the resulting byte array in the MediaStreamSample is exactly the same as what I get when I simply read out my StorageFile with a DataReader.
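The check I did is roughly the following (requestedByteOffset and sampleBuffer are placeholders for the offset and buffer of one sample; ToArray comes from System.Runtime.InteropServices.WindowsRuntime, SequenceEqual from System.Linq):

// 'file' is the StorageFile that mssStream was opened from
IBuffer raw = await FileIO.ReadBufferAsync(file);
byte[] fileBytes = raw.ToArray(requestedByteOffset, (int)BufferSize);
byte[] sampleBytes = sampleBuffer.ToArray(); // the Buffer returned by GetBuffer()
bool identical = fileBytes.SequenceEqual(sampleBytes); // this comes out true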
Does MediaStreamSample.CreateFromStreamAsync actually decode the audio file into a byte array of float samples, or is that only done by the MediaElement when it plays back the sample?
And if so, how can I decode an audio file myself so I can supply the resulting AudioBuffer back into my AudioFrameInputNode?
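For completeness, this is the kind of frame-filling code I expect to end up with once I have decoded float samples. It follows the unsafe IMemoryBufferByteAccess pattern from the AudioGraph documentation; CreateFrameFromFloats and samples are illustrative, and the project needs unsafe code enabled:

using System.Runtime.InteropServices;
using Windows.Foundation;
using Windows.Media;

[ComImport]
[Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
unsafe interface IMemoryBufferByteAccess
{
    void GetBuffer(out byte* buffer, out uint capacity);
}

private unsafe AudioFrame CreateFrameFromFloats(float[] samples)
{
    // 32-bit float PCM: 4 bytes per sample value
    uint bufferSizeInBytes = (uint)(samples.Length * sizeof(float));
    AudioFrame frame = new AudioFrame(bufferSizeInBytes);

    using (AudioBuffer audioBuffer = frame.LockBuffer(AudioBufferAccessMode.Write))
    using (IMemoryBufferReference reference = audioBuffer.CreateReference())
    {
        // Get a raw pointer to the frame's memory and write the floats into it
        ((IMemoryBufferByteAccess)reference).GetBuffer(out byte* dataInBytes, out uint capacityInBytes);
        float* dataInFloat = (float*)dataInBytes;
        for (int i = 0; i < samples.Length; i++)
        {
            dataInFloat[i] = samples[i];
        }
    }

    return frame;
}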