3

I have raw 44.1 kHz audio data from a song as a JavaScript array and I'd like to create a zoomable timeline out of it.

Example timeline from Audacity:

[Sample waveform from Audacity]

Since there are millions of timepoints, I suspect normal JavaScript graphing libraries won't cut it: they will probably choke on that many data points. Do libraries for this kind of visualization already exist for JS? Canvas, WebGL and SVG are all acceptable solutions.

A solution preferably with zoom and pan.

Note that this happens strictly on the client side, so server-side solutions are not acceptable.

Oskar Eriksson
Mikko Ohtamaa
  • You can't solve this problem simply by throwing your amplitude values into a graphics library and hoping it will deal with it. You need to create "overviews" or "previews" of your data zoomed out. – Bjorn Roche Jul 26 '12 at 20:35
  • Thus, I am asking whether such a solution already exists. I am well aware that current graphics libraries can't deal with it. – Mikko Ohtamaa Jul 27 '12 at 07:46
  • Creating overviews is not a difficult task. Such a library may exist, but most of the trouble would lie in getting your data in and out of the library, not actually creating the overviews. – Bjorn Roche Jul 27 '12 at 14:42
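
A minimal sketch of the "overview" idea from the comments above, assuming the raw samples are available as a Float32Array; buildOverviews and its factor/levels parameters are illustrative names, not part of any existing library. Each level stores [min, max] pairs, so a zoomed-out view only has to scan a few thousand bins instead of millions of samples.

let buildOverviews = function(samples, factor, levels) {
    let overviews = [];

    // Level 0: reduce the raw samples into [min, max] bins of `factor` samples.
    let current = [];
    for (let i = 0; i < samples.length; i += factor) {
        let min = Infinity, max = -Infinity;
        let end = Math.min(i + factor, samples.length);
        for (let j = i; j < end; j++) {
            if (samples[j] < min) min = samples[j];
            if (samples[j] > max) max = samples[j];
        }
        current.push([min, max]);
    }
    overviews.push(current);

    // Coarser levels: merge the [min, max] bins of the previous level.
    for (let level = 1; level < levels; level++) {
        let next = [];
        for (let i = 0; i < current.length; i += factor) {
            let min = Infinity, max = -Infinity;
            let end = Math.min(i + factor, current.length);
            for (let j = i; j < end; j++) {
                if (current[j][0] < min) min = current[j][0];
                if (current[j][1] > max) max = current[j][1];
            }
            next.push([min, max]);
        }
        overviews.push(next);
        current = next;
    }

    // overviews[k] has one [min, max] pair per factor^(k+1) raw samples.
    return overviews;
}

When rendering, you would pick the level whose bin size is closest to the current samples-per-pixel and draw one vertical min/max line per pixel.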

3 Answers

3

I've looked into this same problem pretty extensively. To the best of my knowledge, the only existing project that does close to what you want is wavesurfer.js. I haven't used it, but the screenshots and the description sound promising.

See also this question.

Best of luck.
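
For reference, a minimal sketch of how wavesurfer.js is typically set up today; the option and method names used below (WaveSurfer.create, minPxPerSec, zoom) come from the current documentation and may differ from the version that existed when this answer was written:

import WaveSurfer from 'wavesurfer.js';

// Render a waveform into an existing <div id="waveform"> element.
const wavesurfer = WaveSurfer.create({
    container: '#waveform',
    waveColor: '#999',
    progressColor: '#555',
    minPxPerSec: 50       // initial zoom level in pixels per second
});

wavesurfer.load('song.mp3');

// Zooming is a matter of changing the pixels-per-second value,
// e.g. from a range input acting as a zoom slider.
document.querySelector('#zoom').addEventListener('input', (event) => {
    wavesurfer.zoom(Number(event.target.value));
});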

dB'
3

You cannot simply take the waveform data and render every data point; that is terribly inefficient.

Variable explanation:

  • width: Draw area width in pixels; the maximum is the screen width
  • height: Draw area height in pixels
  • spp: Samples per pixel; this is your zoom level
  • resolution: Number of samples to inspect per pixel's sample range; tweak for performance vs. accuracy
  • scroll: Scroll position in pixels; you will need virtual scrolling for performance
  • data: The raw audio data array, probably several million samples long
  • drawData: The reduced audio data used for drawing

You are going to have to take only the samples that are in the viewport and reduce those. Commonly this results in a data set of 2 * width values (one min/max pair per pixel), which you then use to render the image. To zoom out, increase spp; to zoom in, decrease it. Changing the scroll value pans the view.

The following code has O(RN) complexity, where N is width and R is resolution. Maximum accuracy is reached when spp <= resolution, since every sample in the range is then inspected.

The code will look something like this; it takes the peak values, but you could use RMS or average as well.

let reduceAudioPeak = function(data, spp, scroll, width, resolution) {
    let drawData = new Array(width);
    let startSample = scroll * spp; 
    let skip = Math.ceil(spp / resolution);

    // For each pixel in draw area
    for (let i = 0; i < width; i++) {
        let min = 0; // minimum value in sample range
        let max = 0; // maximum value in sample range
        let pixelStartSample = startSample + (i * spp);

        // Iterate over the sample range for this pixel (spp) 
        // and find the min and max values. 
        for(let j = 0; j < spp; j += skip) {
           const index = pixelStartSample + j;
           if(index < data.length) {
               let val = data[index];
               if (val > max) {
                  max = val;
               } else if (val < min) {
                  min = val;
               }
           }
        }

        drawData[i] = [min, max];
    }
    return drawData;
}

With this data you can draw it like this; you could use lines, SVG, etc. instead:

let drawWaveform = function(canvas, drawData, width, height) {
   let ctx = canvas.getContext('2d');
   let drawHeight = height / 2;

   // clear the canvas in case something is already drawn
   ctx.clearRect(0, 0, width, height);
   for(let i = 0; i < width; i++) {
      // transform data points to pixel height and move to centre
      let minPixel = drawData[i][0] * drawHeight + drawHeight;
      let maxPixel = drawData[i][1] * drawHeight + drawHeight;
      let pixelHeight = maxPixel - minPixel;

      ctx.fillRect(i, minPixel, 1, pixelHeight);
   }
} 
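
For context, here is one way the two functions above could be wired together, assuming the song is decoded with the Web Audio API and the page contains a <canvas id="waveform">; the spp, scroll and resolution values are just illustrative starting points:

let renderWaveform = async function(url) {
    // Decode the song into raw samples (Float32Array) with the Web Audio API.
    let audioCtx = new AudioContext();
    let response = await fetch(url);
    let audioBuffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
    let data = audioBuffer.getChannelData(0);

    let canvas = document.getElementById('waveform');
    let width = canvas.width;
    let height = canvas.height;

    // Start fully zoomed out: every pixel covers data.length / width samples.
    // At 44.1 kHz, spp = 441 would instead show 10 ms per pixel.
    let spp = Math.ceil(data.length / width);
    let scroll = 0;        // pan position in pixels
    let resolution = 16;   // samples inspected per pixel

    let drawData = reduceAudioPeak(data, spp, scroll, width, resolution);
    drawWaveform(canvas, drawData, width, height);
}

renderWaveform('song.mp3');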
David Sherman
  • Can you provide more details how you use this? – Toniq Aug 05 '20 at 22:51
  • @Toniq What details would you like? I made a library for rendering audio segments over multiple tracks; you can find it at https://github.com/Idicious/waveshaper if you're interested in how you would implement it. – David Sherman Sep 22 '20 at 14:35
0

I have used RaphaelJS for SVG rendering in the browser and it has performed very well. It is what I would go for. Hopefully SVG will be up to the task.
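
For what it's worth, a minimal sketch of how a pre-reduced [min, max] data set (for example the output of reduceAudioPeak from the answer above) could be drawn with RaphaelJS; the #timeline container id is assumed to exist, and building a single path string keeps the number of SVG nodes low:

// Draw a reduced waveform as one SVG path with RaphaelJS.
// drawData is an array of [min, max] pairs, one per pixel column.
let drawWaveformSvg = function(drawData, width, height) {
    let paper = Raphael('timeline', width, height);
    let mid = height / 2;
    let path = '';

    // One vertical segment per pixel column, from the min to the max value.
    for (let x = 0; x < drawData.length; x++) {
        let min = drawData[x][0];
        let max = drawData[x][1];
        path += 'M' + x + ' ' + (mid + min * mid) +
                'L' + x + ' ' + (mid + max * mid);
    }

    paper.path(path).attr({ stroke: '#3f7faf', 'stroke-width': 1 });
}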