I would like to build a service that allows a user to listen to a call live from their browser.
I have some experience with Asterisk, and it seems flexible enough to do what I have described.
Node.js sounds good because it is purported to handle concurrency well, and I like JavaScript.
In the browser, I figure the HTML5 audio tag would be fine for playing the sound, since it handles playback from a streaming source.
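To make that concrete, here is a minimal sketch of the playback side, assuming a Node.js server with a hypothetical /live endpoint and an MP3-encoded feed arriving on stdin (both are placeholders, not a finished design):

```javascript
// Minimal sketch: serve a live audio feed over chunked HTTP so that an
// HTML5 <audio> element can play it. Assumes an MP3 byte stream arrives
// on stdin (e.g. piped in from an encoder); the /live path is made up.
var http = require('http');

http.createServer(function (req, res) {
  if (req.url === '/live') {
    // No Content-Length, so Node uses chunked transfer encoding,
    // which the audio tag can play as a stream.
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
    process.stdin.pipe(res);
  } else {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<audio src="/live" autoplay controls></audio>');
  }
}).listen(8080);
```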
A colleague of mine put together a demo of this concept using Icecast, but was not able to finish it. There were also significant latency issues.
My question is this:
How should I go about getting started on this?
Any help is appreciated!
Update:
I found a presentation on implementing SIP over WebSockets via a SIP proxy on the backend:
http://sip-on-the-web.aliax.net/
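If I understand the presentation correctly, the browser side would use a SIP-over-WebSocket library. Assuming that library is JsSIP (which that site appears to be connected to), registering a browser endpoint might look roughly like this; the proxy address and credentials are placeholders:

```javascript
// Rough sketch of a browser SIP endpoint over WebSockets, assuming the
// JsSIP library is loaded and a WebSocket-capable SIP proxy is running
// at wss://sip.example.com (both are assumptions, not from the article).
var socket = new JsSIP.WebSocketInterface('wss://sip.example.com');
var ua = new JsSIP.UA({
  sockets: [socket],
  uri: 'sip:listener@example.com',
  password: 'secret'
});

ua.on('registered', function () {
  console.log('Registered with the SIP proxy over WebSocket');
});

ua.start();
```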
Once I have this up and running, the next step would be implementing the streaming. It seems like I should be able to proxy the audio output that would normally go to the SIP client through a secondary server, which then streams it to the browser. I wonder why this couldn't all be done in memory; then there would be no need to write and read a file as the call proceeds.
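To sketch the all-in-memory idea: a Node.js process could accept the call audio on one socket and fan the buffers out to every connected browser, never touching disk. The ports, the MP3 format, and having the audio source connect over TCP are all assumptions here:

```javascript
// Sketch of an in-memory fan-out: call audio arrives on a TCP socket and
// each buffer is forwarded to every connected HTTP client. Nothing is
// written to disk; ports and audio format are placeholders.
var http = require('http');
var net = require('net');

var listeners = []; // responses for currently connected browsers

// The audio source (e.g. an encoder fed by the call leg) connects here
// and writes an MP3 byte stream.
net.createServer(function (source) {
  source.on('data', function (chunk) {
    listeners.forEach(function (res) {
      res.write(chunk); // stays in memory, no intermediate file
    });
  });
}).listen(9000);

// Browsers point an <audio> element at http://host:8080/ to listen in.
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  listeners.push(res);
  req.on('close', function () {
    listeners.splice(listeners.indexOf(res), 1);
  });
}).listen(8080);
```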