Yesterday, I had a question about the noteOn method of the AudioContext object.
I've gotten myself all turned around now on this AudioContext object.
Here's what I've tried and their associated error messages in Safari on my desktop:
var ctx
//…
I'm receiving raw float32 audio through WebSockets and would like to play it back in the browser. From my understanding I would need to use the MediaStream API for this. However, I cannot find a way to create a MediaStream which I can append data…
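One alternative that avoids MediaStream entirely (a sketch, assuming a mono stream; `ctx` and `sampleRate` are hypothetical parameters, not from the question) is to copy each incoming Float32Array into an AudioBuffer and schedule the buffers back-to-back:

```javascript
// Sketch: play raw float32 chunks by scheduling AudioBuffers seamlessly.
// Assumes mono audio; `ctx` is an AudioContext, `sampleRate` the stream's rate.
function makePcmPlayer(ctx, sampleRate) {
  let nextStartTime = 0; // when the next chunk should begin
  return function playChunk(float32Samples) {
    const buffer = ctx.createBuffer(1, float32Samples.length, sampleRate);
    buffer.copyToChannel(float32Samples, 0);
    const src = ctx.createBufferSource();
    src.buffer = buffer;
    src.connect(ctx.destination);
    // Never schedule in the past; otherwise queue right after the last chunk.
    nextStartTime = Math.max(nextStartTime, ctx.currentTime);
    src.start(nextStartTime);
    nextStartTime += buffer.duration;
  };
}
```

A WebSocket `onmessage` handler would then call the returned `playChunk` with a `Float32Array` view over the received bytes.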
I want to do live sound analysis on the iPhone. Therefore I use
the webkitAudioContext Analyser.
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var audioGoodmorning = new Audio('assets/sounds/greeting.m4a');
var audioSrc =…
I'm trying to use AudioContext in my typescript file for an Angular 5 app. It works great on Chrome, doesn't work on Safari. Everything I see by googling says to use window.webkitAudioContext but that immediately blows up when the typescript…
I've been struggling with an elusive audio distortion bug using webkitAudioContext in HTML5 under iOS 6. It can happen in other circumstances, but the only way I can get 100% repro is on the first visit to my page after power cycling the device. …
I am attempting to downsample the sample rate I am getting from audioContext. I believe it is coming in at 44100, and I want it to be 11025. I thought I could just average every 3 samples and it plays back at the correct rate, but the pitch is…
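One likely culprit in the question above: 44100 / 11025 = 4, so each output sample should average four inputs, not three. A minimal sketch of that decimation (averaging is only a crude low-pass; a proper decimator would filter before dropping samples to avoid aliasing):

```javascript
// Sketch: decimate 44100 Hz float32 audio to 11025 Hz.
// The ratio is exactly 4, so each output sample averages 4 inputs;
// averaging every 3 changes the effective rate and shifts the pitch.
function downsampleBy4(input) {
  const output = new Float32Array(Math.floor(input.length / 4));
  for (let i = 0; i < output.length; i++) {
    const j = i * 4;
    output[i] = (input[j] + input[j + 1] + input[j + 2] + input[j + 3]) / 4;
  }
  return output;
}
```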
Is there a way to record the audio data that's being sent to webkitAudioContext.destination?
The data that the nodes are sending there is being played by the browser, so there should be some way to store that data into a (.wav) file.
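One common approach (a sketch, not the only way) is to route the graph through a MediaStreamAudioDestinationNode and record its stream with MediaRecorder. Note MediaRecorder typically produces webm/ogg rather than .wav; a true WAV file would need the PCM captured and encoded manually (e.g. via an AudioWorklet). `sourceNode` and `onBlob` are hypothetical parameters:

```javascript
// Sketch: record what a node graph plays while keeping it audible.
function recordGraph(ctx, sourceNode, onBlob) {
  const streamDest = ctx.createMediaStreamDestination();
  sourceNode.connect(streamDest);      // branch that gets recorded
  sourceNode.connect(ctx.destination); // branch that stays audible
  const recorder = new MediaRecorder(streamDest.stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => onBlob(new Blob(chunks, { type: recorder.mimeType }));
  recorder.start();
  return recorder; // call recorder.stop() when done
}
```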
I'm trying to use the RecorderJS library (https://github.com/mattdiamond/Recorderjs) which requires me to have an AudioContext. However, when declaring the AudioContext at the very beginning of my script, I am getting an error in the console on page…
I'm having trouble creating an AudioContext with Safari (desktop and mobile). It seems that even when it is created upon user interaction, it is still suspended.
My code:
const test = () => {
…
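A pattern that often helps here (a sketch, assuming the handler really runs inside a user gesture) is to create the context once, then explicitly call `resume()` and wait for the returned promise, since Safari can leave a context `'suspended'` even when it was constructed in a click or touch handler:

```javascript
// Sketch: lazily create one shared AudioContext and resume it
// from inside a user-gesture handler (click/touchend).
function unlockAudio() {
  const Ctor = window.AudioContext || window.webkitAudioContext;
  if (!unlockAudio.ctx) unlockAudio.ctx = new Ctor();
  const ctx = unlockAudio.ctx;
  if (ctx.state === 'suspended') {
    return ctx.resume().then(() => ctx);
  }
  return Promise.resolve(ctx);
}
// e.g. document.addEventListener('touchend', unlockAudio, { once: true });
```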
I'm currently working on an application that uses the AudioContext api to control audio for both video clips and background audio. We would like to use the AudioContext (and therefore MediaElementAudioSourceNodes) so we can make adjustments to the…
I want to use WebAudio (AudioContext) in NodeJS. However, NodeJS does not support WebAudio. There is an npm package, web-audio-api, but it is still in alpha stage and is incomplete.
So how can I use WebAudio (AudioContext) in NodeJS?
Can I instantiate a…
It seems that AudioContext.createMediaStreamDestination() defaults to mono output. This default is being changed, but is there a way to set the number of desired output channels manually?
Or is there any other way to get the stream from WebAudio…
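One thing worth trying (a sketch, not a confirmed fix; whether a given browser honors it on this node is exactly the open question) is setting the standard AudioNode channel properties on the destination node explicitly:

```javascript
// Sketch: ask createMediaStreamDestination() for stereo explicitly.
// channelCount / channelCountMode / channelInterpretation are standard
// AudioNode properties; browser support on this node may vary.
function makeStereoStreamDestination(ctx) {
  const dest = ctx.createMediaStreamDestination();
  dest.channelCount = 2;
  dest.channelCountMode = 'explicit';
  dest.channelInterpretation = 'speakers';
  return dest; // dest.stream is the MediaStream to consume
}
```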
This appears to be a common question - Javascript Web Audio API AnalyserNode Not Working - But I can't be sure if I've found an edge case with my implementation.
I create an audio source using createMediaElementSource(), but instead of using an…
I am making a website that plays mp3 audio and then fades out after X seconds. I had this working using a regular audio tag implementation by manipulating the volume at an interval, but this solution doesn't work on iOS because volume is a readonly…
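The usual workaround for the read-only volume on iOS (a sketch; `audioEl` and `seconds` are hypothetical parameters) is to route the media element through a GainNode and ramp the gain instead of the element's volume:

```javascript
// Sketch: fade out an <audio> element via a GainNode — media element
// volume is read-only on iOS, but AudioParam automation is not.
function fadeOutViaGain(ctx, audioEl, seconds) {
  const source = ctx.createMediaElementSource(audioEl);
  const gain = ctx.createGain();
  source.connect(gain);
  gain.connect(ctx.destination);
  const now = ctx.currentTime;
  gain.gain.setValueAtTime(1, now);
  // Sample-accurate linear fade, no setInterval needed.
  gain.gain.linearRampToValueAtTime(0, now + seconds);
  return gain;
}
```

Note that `createMediaElementSource` may only be called once per element, so the gain node should be created up front and reused for later fades.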
I'm currently attempting to integrate a custom audio filter on a video player handling both HLS and raw MP4 files. I'm having little to no issue integrating it on Chrome and Firefox - Safari on the other hand is not behaving accordingly. I've…