
For my integrated test I'm working on an application that needs to provide a live stream to a locally hosted website. I've already built a working site that runs on NanoHTTPD. The application also performs some special image processing, for which I use JavaCV. The library works perfectly and all the C++ bindings are working too.

My question: how do I set up a live stream that can be played directly in a static site hosted by NanoHTTPD? Am I on the right track?

My code:

init:

private void initLiveStream() throws FrameRecorder.Exception {
    /* ~~~ https://github.com/bytedeco/javacv/issues/598 ~~~ */
    frameRecorder = new FFmpegFrameRecorder("http://localhost:9090", imageWidth, imageHeight, 0);
    frameRecorder.setVideoOption("preset", "ultrafast");
    frameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    frameRecorder.setAudioCodec(0);
    frameRecorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    frameRecorder.setFormat("webm");
    frameRecorder.setGopSize(10);
    frameRecorder.setFrameRate(frameRate);
    frameRecorder.setVideoBitrate(5000);
    frameRecorder.setOption("content_type","video/webm");
    frameRecorder.setOption("listen", "1");
    frameRecorder.start();
}

In my CameraView:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    Frame frame = new AndroidFrameConverter().convert(data, size.width, size.height);
    try {
         if(frameRecorder!=null){
             frameRecorder.record(frame);
         }
     } catch (FrameRecorder.Exception e) {
         e.printStackTrace();
     }
 }

Here is one of the stack traces that showed up frequently while I was searching for a solution:

org.bytedeco.javacv.FrameRecorder$Exception: avio_open error() error -111: Could not open 'http://localhost:9090'

I couldn't find any other thread addressing this specific issue.
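For what it's worth, error -111 looks like the Linux errno 111 (ECONNREFUSED), i.e. FFmpeg opened the URL as a *client* and nothing was accepting connections on localhost:9090. A plain-Java probe (my own diagnostic sketch, not part of JavaCV) can confirm whether anything is actually listening on that port:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Returns true if something is accepting TCP connections on host:port.
    static boolean isListening(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            // A refused connection here is the same condition FFmpeg
            // reports as avio_open error -111.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isListening("localhost", 9090, 500));
    }
}
```

If this prints false while the recorder is supposedly serving, the "listen" option never took effect and FFmpeg is connecting out instead of serving.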

Thanks in advance

EDIT

Thanks to Chester Cobus, here is the code I ended up using:

Websocket:

//Constructor
AsyncHttpServer serverStream = new AsyncHttpServer();
List<WebSocket> sockets = new ArrayList<>();

//http://stackoverflow.com/a/33021907/5500092
//I'm planning to use more sockets; this is the only catch-all pattern I found.
serverStream.websocket("/((?:[^/]*/)*)(.*)", new AsyncHttpServer.WebSocketRequestCallback() {
     @Override
     public void onConnected(final WebSocket webSocket, AsyncHttpServerRequest request) {
         String uri = request.getPath();
         if (uri.equals("/live")) {
             sockets.add(webSocket);

             //Use this to clean up any references to your websocket
             webSocket.setClosedCallback(new CompletedCallback() {
                 @Override
                 public void onCompleted(Exception ex) {
                     try {
                         if (ex != null)
                             Log.e("WebSocket", "Error");
                     } finally {
                         sockets.remove(webSocket);
                     }
                 }
             });
         }
     }
});

//Updater (Observer pattern)
@Override
public void updated(byte[] data) {
    for (WebSocket socket : sockets) {
         socket.write(new ByteBufferList(data));
    }
}

Record Activity:

private long start_time = System.currentTimeMillis();

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    long now_time = System.currentTimeMillis();
    if ((now_time - start_time) > 250) {
        start_time = now_time;
        //https://forums.xamarin.com/discussion/40991/onpreviewframe-issue-converting-preview-byte-to-android-graphics-bitmap
        Camera.Size size = camera.getParameters().getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        image.compressToJpeg(new Rect(0, 0, size.width, size.height), 60, byteArrayOutputStream);
        MainActivity.getWebStreamer().updated(byteArrayOutputStream.toByteArray());
    }
}
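The 250 ms check above caps the preview stream at roughly four frames per second so the JPEG encoder and the sockets are not flooded. The same gate can be factored into a small plain-Java helper (the class name is my own, for illustration):

```java
// Simple rate limiter: lets a frame through at most once per interval.
public class FrameThrottle {
    private final long intervalMillis;
    private long lastEmit = 0;

    public FrameThrottle(long intervalMillis) {
        this.intervalMillis = intervalMillis;
    }

    // Returns true if enough time has passed since the last accepted frame.
    public boolean shouldEmit(long nowMillis) {
        if (nowMillis - lastEmit > intervalMillis) {
            lastEmit = nowMillis;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        FrameThrottle throttle = new FrameThrottle(250);
        System.out.println(throttle.shouldEmit(1000)); // true: first frame passes
        System.out.println(throttle.shouldEmit(1100)); // false: only 100 ms elapsed
        System.out.println(throttle.shouldEmit(1300)); // true: 300 ms elapsed
    }
}
```

In onPreviewFrame this would be `if (throttle.shouldEmit(System.currentTimeMillis())) { ... }`, with the interval tuned to taste.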

JavaScript

var socket;
var imageElement;

/**
 * path - String.Format("ws://{0}:8090/live", Window.Location.HostName)
 * image - HTMLImageElement
 */
function imageStreamer(path, image) {
    imageElement = image;
    socket = new WebSocket(path);

    socket.onmessage = function(msg) {
        var blob = msg.data; //binary WebSocket frames arrive as a Blob by default
        var reader = new FileReader();
        reader.onload = function(e) {
            imageElement.src = e.target.result;
        };
        reader.readAsDataURL(blob);
    };
}
Thomas Devoogdt
  • Did you add the Internet permission to your AndroidManifest.xml? – Chester Cobus Mar 24 '17 at 20:46
  • Yes, for my NanoHTTPD server. I included these: CAMERA, ACCESS_WIFI_STATE, INTERNET, RECORD_AUDIO, READ_EXTERNAL_STORAGE, WRITE_EXTERNAL_STORAGE. – Thomas Devoogdt Mar 24 '17 at 21:26
  • I worked with JavaCV before, and processing images is slow and overworks the device. I know this is rework, but I would use web sockets in Android and the web site and just push the bytes of the preview to the web site (web sockets allow you to do this). This way you do not have a dependency on JavaCV, and it will stream much faster because you're not creating a video to stream with. Cool app though. https://github.com/koush/android-websockets and https://www.html5rocks.com/en/tutorials/websockets/basics/ – Chester Cobus Mar 24 '17 at 21:47
  • Rather use this link for the server web socket on the Android side: https://github.com/koush/AndroidAsync – Chester Cobus Mar 24 '17 at 21:58
  • To answer your question: I am assuming that your site is on a separate machine and your app is on a mobile device, so you cannot connect through http://localhost:9090. You must use the IP address that's been assigned by Wifi. I do not think accessing it over the Internet is possible when the mobile device is the server. – Chester Cobus Mar 24 '17 at 22:49
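As the last comment notes, localhost only resolves to the device itself; a browser on another machine has to use the phone's LAN address. On Android this is typically read from WifiManager, but a portable java.net approach (a plain-Java illustration, not Android-specific) can also locate a site-local IPv4 address to build the ws:// URL from:

```java
import java.net.Inet4Address;
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class LanAddress {
    // Returns the first non-loopback, site-local IPv4 address, or null if none is found.
    static String findSiteLocalIPv4() throws Exception {
        for (NetworkInterface nic : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            if (!nic.isUp() || nic.isLoopback()) continue;
            for (InetAddress addr : Collections.list(nic.getInetAddresses())) {
                if (addr instanceof Inet4Address && addr.isSiteLocalAddress()) {
                    return addr.getHostAddress();
                }
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        String ip = findSiteLocalIPv4();
        System.out.println(ip == null ? "no LAN address found" : "ws://" + ip + ":5000/live");
    }
}
```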

1 Answer


Here's an example of what the web socket implementation will look like:

//This code must run just before the Camera is opened.

AsyncHttpServer server = new AsyncHttpServer();

server.websocket("/live", new AsyncHttpServer.WebSocketRequestCallback() {
    @Override
    public void onConnected(final WebSocket webSocket, AsyncHttpServerRequest request) {
        //keep the socket in an instance variable of the outer class
        RecordActivity.this.webSocket = webSocket;
    }
});

//listen on port 5000
server.listen(5000);
//browse to ws://{IP address assigned by wifi}:5000/live

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    if (webSocket != null) {
        webSocket.send(data);
    }
}

Use Gradle to pull in the library used above:

dependencies {
    compile 'com.koushikdutta.async:androidasync:2.+'
}

Here's client code for your site:

  var socket = new WebSocket('ws://{IP address assigned by wifi}:5000/live');

  socket.onmessage = function(msg) {
      var blob = msg.data; //binary frames arrive as a Blob by default
      var image = document.getElementById('image'); //<img id="image" /> in HTML

      var reader = new FileReader();
      reader.onload = function(e) {
           image.src = e.target.result;
      };
      reader.readAsDataURL(blob);
  };
Chester Cobus