Here is the scenario:
I am recording an audio file using Cordova's Media plugin and saving the path of the recorded file as myRecordingPath in the app's Documents directory. The recorded file has the following path:
*/var/mobile/Applications/B2EA8890-E5AA-4273-83C4-EB4CA045EA/Documents/2192014125156.wav*
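For context, the recording is set up roughly like this (a minimal sketch; the file name and callbacks are placeholders, not my exact code):

    // sketch of the recording step (file name is a placeholder)
    var fileName = 'myRecording.wav';
    var mediaRec = new Media(fileName,
        function () { console.log('recording finished'); },
        function (err) { console.log('recording error: ' + err.code); });

    mediaRec.startRecord();

    // ...later, when the user stops recording:
    mediaRec.stopRecord();
    mediaRec.release();

    // myRecordingPath is then set to the full path under Documents shown above
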
Now I am loading the same recorded file in a different view and playing it back, again using Cordova's Media plugin. It plays fine.
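The playback in that view is essentially this (again a sketch, assuming the stored full path is passed straight to Media):

    // sketch of the playback step in the other view
    var mediaPlay = new Media(myRecordingPath,
        function () { console.log('playback finished'); },
        function (err) { console.log('playback error: ' + err.code); });

    mediaPlay.play();
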
I also want to display the waveform of the same file, so I am using Chris Wilson's audio-buffer-draw library. I have tested the same code out in Chrome, but in my application it's not working.
I am trying to pass the recorded file as follows:

    var audioSource = 'file://' + myRecordingPath;
    // single AudioContext for the page (webkit prefix for older iOS WebViews)
    var audioContext = new (window.AudioContext || window.webkitAudioContext)();

    function initAudio() {
        var audioRequest = new XMLHttpRequest();
        // audioSource.crossOrigin = 'anonymous';  // no effect: audioSource is a plain string, not a media element
        audioRequest.open("GET", audioSource, true);
        audioRequest.responseType = "arraybuffer";

        audioRequest.onload = function () {
            audioContext.decodeAudioData(audioRequest.response, function (buffer) {
                // drawBuffer() comes from Chris Wilson's audio-buffer-draw
                var canvas = document.getElementById("wave");
                drawBuffer(canvas.width, canvas.height, canvas.getContext('2d'), buffer);
            });
        };

        audioRequest.send();
    }

The audio plays fine, but the waveform is not drawn on the canvas.
Does this have something to do with the relative path?
I have also tried WaveSurfer.js and Shore.js, but both use a similar approach and yield the same result: the waveform is not created.
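For reference, the WaveSurfer.js attempt looked roughly like this (a sketch; the container id and options are placeholders, and I am not certain this matches the exact library version I used):

    // sketch of the WaveSurfer.js attempt
    var wavesurfer = WaveSurfer.create({
        container: '#waveform',
        waveColor: 'violet',
        progressColor: 'purple'
    });

    // same file:// URL as in the XHR code above
    wavesurfer.load('file://' + myRecordingPath);

    wavesurfer.on('error', function (e) {
        console.log('WaveSurfer error: ' + e);
    });
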
Please guide.