I'm currently spiking out a music application with HTML5/JS and am trying to achieve the lowest latency I can with the MediaStream Recording API. The app lets a user record music with a camera and microphone. While the camera and microphone are on, the user can see and hear themselves (i.e. the stream is monitored live).
At the moment I have:
const stream = await navigator.mediaDevices.getUserMedia({
  video: true,
  audio: {
    latency: { exact: 0.003 },
  },
});

// monitor video and audio (i.e. show it to the user)
this.video.srcObject = stream;
this.video.play();
If I go any lower on the latency constraint, I get an OverconstrainedError. The latency is okay (better than the default), but still not great for hearing yourself while you're recording: there is a slight but perceptible lag between strumming the guitar and hearing it in your headphones.
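In case it's useful, here's a rough sketch of how I can inspect what the browser actually applied versus what the device claims to support, using getSettings() and getCapabilities() on the audio track (getCapabilities() isn't available in every browser):

const [audioTrack] = stream.getAudioTracks();
// What the browser actually applied (latency, sampleRate, echoCancellation, ...)
console.log('settings:', audioTrack.getSettings());
// The ranges the device reports as supported (not implemented in all browsers)
if (audioTrack.getCapabilities) {
  console.log('capabilities:', audioTrack.getCapabilities());
}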
Are there other optimizations I can make here to get better results? I don't care as much about the quality of the video and audio, so maybe lowering the resolution, sample rate, etc. could help (see the sketch below)?
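For concreteness, this is the kind of constraint tweaking I have in mind. It's an untested sketch; the specific keys and values are just illustrative, and I'm not sure which of them (if any) would actually reduce the monitoring lag:

const stream = await navigator.mediaDevices.getUserMedia({
  // Lower the video resolution/frame rate since quality isn't a priority
  video: {
    width: { ideal: 320 },
    height: { ideal: 240 },
    frameRate: { ideal: 15 },
  },
  audio: {
    latency: { ideal: 0 },        // ask for the lowest latency the device allows
    channelCount: { ideal: 1 },   // mono
    sampleRate: { ideal: 44100 },
    // Processing steps like these are sometimes said to add delay,
    // so maybe turning them off helps?
    echoCancellation: false,
    noiseSuppression: false,
    autoGainControl: false,
  },
});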