
I want to combine audio clips, layered on top of each other so that they play synchronously, and save the result to a new audio file. Any help would be much appreciated. I've done some digging online, but couldn't find a definitive answer as to whether any of the available JavaScript audio editing libraries (Mix.js, for example) are capable of this.

Ethan Stepanian
  • Is it just a one-time thing? Does it need to be done programmatically? Why preferably JS? – dgo Nov 13 '16 at 03:35
  • Also - if the answers to my above questions are 'yes', 'no', ... Then use a [DAW](https://en.m.wikipedia.org/wiki/Digital_audio_workstation). Plenty of freeware and open source options available. For what you want to do, pretty easy, low learning curve. – dgo Nov 13 '16 at 16:06
  • After you answer questions above .... is this from a browser or nodejs ? – Scott Stensland Nov 13 '16 at 20:30
  • The dev I was planning on delegating this to was most comfortable using javascript, but that’s no longer a priority. This needs to be done programmatically. If it can be done on the front-end, that would be best, but I understand the solution might need to be on the backend. – Ethan Stepanian Nov 15 '16 at 15:49

1 Answer


Yes, it is possible using OfflineAudioContext(), or AudioContext.createChannelMerger() and creating a MediaStream. See Phonegap mixing audio files and Web Audio API.

You can use fetch() or XMLHttpRequest() to retrieve each audio resource as an ArrayBuffer, and AudioContext.decodeAudioData() to decode each response into an AudioBuffer; OfflineAudioContext() to render the merged audio; and AudioContext, AudioContext.createBufferSource() and AudioContext.createMediaStreamDestination() together with MediaRecorder() to record the resulting stream. Promise.all(), the Promise() constructor and .then() handle the asynchronous fetch() and AudioContext.decodeAudioData() calls, and pass along the mixed audio Blob produced at the stop event of MediaRecorder.

Connect an AudioBufferSourceNode for each decoded buffer to OfflineAudioContext.destination and call .start() on each node, then call OfflineAudioContext.startRendering(). When rendering completes, create a new AudioBufferSourceNode on the regular AudioContext, assign it the renderedBuffer, and call .createMediaStreamDestination() on the AudioContext to create a MediaStream from the merged audio; pass its .stream to MediaRecorder(). At the stop event of MediaRecorder, create a Blob URL of the recorded audio mix with URL.createObjectURL(), which can be downloaded using an &lt;a&gt; element with the download attribute and href set to the Blob URL.
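In outline, the pipeline reads like this (a condensed sketch using async/await; the track URLs are whatever you pass in, and the recording step via MediaRecorder proceeds exactly as in the full snippet that follows):

// Sketch only: fetch, decode, then render the layered mix offline
async function mixTracks(urls) {
  const audio = new AudioContext();
  // fetch and decode every track to an AudioBuffer
  const buffers = await Promise.all(urls.map(async (url) => {
    const response = await fetch(url);
    return audio.decodeAudioData(await response.arrayBuffer());
  }));
  // size the offline render to the longest track, in sample frames
  const frames = Math.max(...buffers.map((b) => b.length));
  const offline = new OfflineAudioContext(2, frames, audio.sampleRate);
  buffers.forEach((buffer) => {
    const source = offline.createBufferSource();
    source.buffer = buffer;
    source.connect(offline.destination); // overlapping sources are summed
    source.start();
  });
  return offline.startRendering(); // resolves with the mixed AudioBuffer
}

The complete, runnable example: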

var sources = ["https://upload.wikimedia.org/wikipedia/commons/b/be/"
               + "Hidden_Tribe_-_Didgeridoo_1_Live.ogg"
               , "https://upload.wikimedia.org/wikipedia/commons/6/6e/" 
               + "Micronesia_National_Anthem.ogg"];

var description = "HiddenTribeAnthem";
var context;
var recorder;
var div = document.querySelector("div");
var duration = 60000;
var chunks = [];
var audio = new AudioContext();
var mixedAudio = audio.createMediaStreamDestination();
var player = new Audio();
player.controls = "controls";

function get(src) {
  return fetch(src)
    .then(function(response) {
      return response.arrayBuffer()
    })
}

function stopMix(duration, ...media) {
  setTimeout(function(media) {
    media.forEach(function(node) {
      node.stop()
    })
  }, duration, media)
}

Promise.all(sources.map(get)).then(function(data) {
    // decode every compressed track to an AudioBuffer first
    return Promise.all(data.map(function(buffer) {
        return audio.decodeAudioData(buffer)
      }))
      .then(function(buffers) {
        // size the offline render to the longest decoded track, measured
        // in sample frames; the compressed byteLength would be wrong here
        var len = Math.max.apply(Math, buffers.map(function(buffer) {
          return buffer.length
        }));
        context = new OfflineAudioContext(2, len, audio.sampleRate);
        buffers.forEach(function(buffer) {
          var source = context.createBufferSource();
          source.buffer = buffer;
          // sources connected to the same destination play summed together
          source.connect(context.destination);
          source.start()
        });
        return context.startRendering()
      })
      .then(function(renderedBuffer) {
        return new Promise(function(resolve) {
          var mix = audio.createBufferSource();
          mix.buffer = renderedBuffer;
          mix.connect(audio.destination);
          mix.connect(mixedAudio);              
          recorder = new MediaRecorder(mixedAudio.stream);
          recorder.start(0);
          mix.start(0);
          div.innerHTML = "playing and recording tracks..";
          // stop playback and recorder in 60 seconds
          stopMix(duration, mix, recorder)

          recorder.ondataavailable = function(event) {
            chunks.push(event.data);
          };

          recorder.onstop = function(event) {
            var blob = new Blob(chunks,  {
              "type": "audio/ogg; codecs=opus"
            });
            console.log("recording complete");
            resolve(blob)
          };
        })
      })
      .then(function(blob) {
        console.log(blob);
        div.innerHTML = "mixed audio tracks ready for download..";
        var audioDownload = URL.createObjectURL(blob);
        var a = document.createElement("a");
        a.download = description + "." + blob.type.replace(/.+\/|;.+/g, "");
        a.href = audioDownload;
        a.innerHTML = a.download;
        document.body.appendChild(a);
        a.insertAdjacentHTML("afterend", "<br>");
        player.src = audioDownload;
        document.body.appendChild(player);
      })
  })
  .catch(function(e) {
    console.log(e)
  });
<!DOCTYPE html>
<html>

<head>
</head>

<body>
  <div>loading audio tracks.. please wait</div>
</body>

</html>

Alternatively, you can use AudioContext.createChannelMerger() and AudioContext.createChannelSplitter().
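The relevant call is splitter.connect(merger, outputIndex, inputIndex), which routes one output channel of the splitter to one input of the merger. A minimal sketch of that wiring (the source node here is illustrative):

var merger = audio.createChannelMerger(2);     // two mono inputs -> one stereo output
var splitter = audio.createChannelSplitter(2); // splits its input into mono channels
source.connect(splitter);                      // feed a decoded track into the splitter
splitter.connect(merger, 0, 1);                // splitter output 0 -> merger input 1

The full snippet: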

var sources = ["/path/to/audoi1", "/path/to/audio2"];    
var description = "mix";
var chunks = [];
var channels = [[0, 1], [1, 0]]; // each entry is [splitter output, merger input]
var audio = new AudioContext();
var player = new Audio();
var merger = audio.createChannelMerger(2);
var splitter = audio.createChannelSplitter(2);
var mixedAudio = audio.createMediaStreamDestination();
var duration = 60000;
var context;
var recorder;
var audioDownload;

player.controls = "controls";

function get(src) {
  return fetch(src)
    .then(function(response) {
      return response.arrayBuffer()
    })
}

function stopMix(duration, ...media) {
  setTimeout(function(media) {
    media.forEach(function(node) {
      node.stop()
    })
  }, duration, media)
}

Promise.all(sources.map(get)).then(function(data) {
    return Promise.all(data.map(function(buffer, index) {
        return audio.decodeAudioData(buffer)
          .then(function(bufferSource) {
            var channel = channels[index];
            var source = audio.createBufferSource();
            source.buffer = bufferSource;
            source.connect(splitter);
            splitter.connect(merger, channel[0], channel[1]);
            return source
          })
      }))
      .then(function(audionodes) {
        merger.connect(mixedAudio);
        merger.connect(audio.destination);
        recorder = new MediaRecorder(mixedAudio.stream);
        recorder.start(0);
        audionodes.forEach(function(node) {
          node.start(0)
        });

        stopMix(duration, ...audionodes, recorder);

        recorder.ondataavailable = function(event) {
          chunks.push(event.data);
        };

        recorder.onstop = function(event) {
          var blob = new Blob(chunks, {
            "type": "audio/ogg; codecs=opus"
          });
          audioDownload = URL.createObjectURL(blob);
          var a = document.createElement("a");
          a.download = description + "." + blob.type.replace(/.+\/|;.+/g, "");
          a.href = audioDownload;
          a.innerHTML = a.download;
          player.src = audioDownload;
          document.body.appendChild(a);
          document.body.appendChild(player);
        };
      })
  })
  .catch(function(e) {
    console.log(e)
  });
guest271314
  • Hi, I have an Angular project with TypeScript and it says: Property 'createMediaStreamDestination' does not exist on type 'AudioContext'. – oihi08 Jan 26 '18 at 08:38
  • @oihi08 At which browsers did you try the code and get the error? – guest271314 Jan 26 '18 at 08:41
  • The error appears when I tried to compile code, but, magically, now works... thanks! – oihi08 Jan 26 '18 at 08:51
  • Sorry, it appears again. It appeared when I try to compile code: ERROR in C:/Users/sm133/Documents/PROYECTOS/panelDeGestion/panelDeMedios/src/app/pages/login/login.component.ts (58,30): Property 'createMediaStreamDestination' does not exist on type 'AudioContext'. – oihi08 Jan 26 '18 at 08:53
  • IE does not support `.createMediaStreamDestination()` https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamDestination#Compatibility_table – guest271314 Jan 26 '18 at 08:57
  • I would like to do this (overlay two audio files) in NodeJS, but if it isn't possible, I am trying to do this on the front end – oihi08 Jan 26 '18 at 08:57
  • I am using Chrome, but the errors appeared in the console when I run ng serve. I am using Angular 4 – oihi08 Jan 26 '18 at 08:58
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/163930/discussion-between-oihi08-and-guest271314). – oihi08 Jan 26 '18 at 09:00
  • Hi, I have a problem with this. In Safari it doesn't play – oihi08 Feb 13 '18 at 15:26
  • In Safari it needs to be activated by a user interaction like a button press – ayunami2000 Jul 03 '18 at 17:11
  • Also use the webkit prefix, e.g. webkitAudioContext – ayunami2000 Jul 03 '18 at 17:12
  • I have a js game and I want to record my voice using the microphone and also record any audio from the game itself. Is it possible to do that? In the end I want a video stream with the game + audio of microphone + audios from game. – savram Dec 07 '21 at 23:45