The question (and any implementation that answers it) really consists of two parts:
- How to send an audio stream from the browser in JavaScript
- How to receive that audio stream on the server in C/C++
Splitting it this way works because sending data over the network only loosely couples the client and the server: as long as both sides speak the same protocol, the implementation language does not matter. You could write a server in C++ and then write two different clients that communicate with it, one in JavaScript and another as a desktop app written in Java.
JavaScript on the Client Side
For the client side, sending audio from the browser in JavaScript is normally done with the WebRTC APIs built into modern browsers; the WebRTC samples site has some useful material on this, including streaming examples (https://webrtc.github.io/samples/).
Some links on that page that may be of interest:
- Audio-only getUserMedia() output to local audio element
- Stream from a video element to a video element
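As a starting point, the audio-only capture mentioned above can be sketched roughly as follows. This is a minimal, hypothetical sketch assuming a browser environment with `getUserMedia` support and an `<audio id="local-audio">` element on the page; it is not taken from the linked sample verbatim.

```javascript
// Sketch: capture microphone audio and play it back locally.
// Assumes a browser environment and an <audio id="local-audio"> element
// (both are assumptions for this example, not part of the original answer).
async function captureLocalAudio() {
  // Request an audio-only stream; the browser will prompt for permission.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: false,
  });

  // Route the captured stream into a local <audio> element for monitoring.
  const audioElement = document.getElementById('local-audio');
  audioElement.srcObject = stream;
  await audioElement.play();

  return stream;
}
```

The returned `MediaStream` is what you would later hand to an `RTCPeerConnection` to send to a remote peer or server.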
There are already some Stack Overflow answers about WebRTC and audio in JavaScript; here are a couple. Questions (and libraries) on the JavaScript side will be more plentiful than on the C++ side:
- Sending video and audio stream to server
- Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia
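The common thread in those answers is attaching the captured stream to an `RTCPeerConnection` and exchanging an SDP offer/answer with the server over some signaling channel of your choosing (WebRTC deliberately does not define one). A hedged sketch, where `sendToSignalingServer` is a placeholder for your own signaling transport (e.g. a WebSocket) and the STUN server URL is just a commonly used example:

```javascript
// Sketch: attach a captured audio stream to an RTCPeerConnection and
// produce an SDP offer to send to the server. sendToSignalingServer is
// a hypothetical callback standing in for your signaling channel.
async function streamAudioToServer(sendToSignalingServer) {
  // Capture audio only, as in the earlier getUserMedia example.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  const pc = new RTCPeerConnection({
    // Example public STUN server; substitute your own ICE servers.
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  });

  // Add each audio track so it is included in the negotiated session.
  for (const track of stream.getAudioTracks()) {
    pc.addTrack(track, stream);
  }

  // Relay ICE candidates to the server as they are discovered.
  pc.onicecandidate = (event) => {
    if (event.candidate) {
      sendToSignalingServer({ type: 'candidate', candidate: event.candidate });
    }
  };

  // Create and send the offer; the server must reply with an answer,
  // which you then pass to pc.setRemoteDescription().
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToSignalingServer({ type: 'offer', sdp: pc.localDescription });

  return pc;
}
```

Whatever server you write in C++ has to implement the other half of this exchange: receive the offer, reply with an answer, and trade ICE candidates over the same signaling channel.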
C++ on the Server Side
The WebRTC site documents the Native APIs for the libraries here (https://webrtc.org/native-code/native-apis/), and an excellent, simple example of a WebRTC peer connection server built on them is here (https://webrtc.googlesource.com/src/+/master/examples/peerconnection). The same directory also contains a C++ client implementation, which may help for testing the server to get it working first, or for seeing the general principles.