
I'm trying to capture my built-in webcam with OpenCV in C++ and do some processing. This is working so far.

Now I want to stream the webcam to the browser. How can I achieve that?

  • Should I create a WebSocket? Or use a UDP socket?
  • How can I display that content in the browser? Is that possible with HTML5 and JS?

Thank you.

dab0bby
  • In the browser, it is possible to use WebSockets and connect to a live stream. The biggest question, I guess, would be how to run the actual streaming server and decide on a compatible video format. I have never tried it myself, but this question got me really interested and I found this interesting discussion: http://stackoverflow.com/questions/21921790/best-approach-to-real-time-http-streaming-to-html5-video-client – user151496 Feb 18 '16 at 10:51
  • @user151496 It's a pretty interesting topic. I've been able to stream from `webcam(html5) > server > video(html5)` over WebSockets; audio was a major issue, as capturing the stream has very limited support and is in general hard to work with. OP should take a look at **MPEG-DASH**; it was much easier to get up and running than WebSockets. – 8eecf0d2 Feb 18 '16 at 11:06

2 Answers


I may be a little late, but as I didn't find a completely up-to-date solution for C++ and MJPEG on Stack Overflow, I thought I'd write a new answer.

There are now some good and simple C++ libraries for this task (MJPEG streaming to HTML):

https://github.com/nadjieb/cpp-mjpeg-streamer

https://github.com/jacksonliam/mjpg-streamer

https://github.com/codewithpassion/mjpg-streamer/tree/master/mjpg-streamer

I found the first one to be very simple. You need CMake and make installed on the system.

git clone https://github.com/nadjieb/cpp-mjpeg-streamer.git;
cd cpp-mjpeg-streamer;
mkdir build && cd build;
cmake ../;
make;
sudo make install;
  • Make sure you have the correct version of OpenCV installed.

Now, write the streamer:

mjpeg_server.cc

#include <iostream>
#include <string>
#include <vector>

#include <opencv2/opencv.hpp>

#include <nadjieb/mjpeg_streamer.hpp>

// for convenience
using MJPEGStreamer = nadjieb::MJPEGStreamer;

int main()
{
    cv::VideoCapture cap;
    cap.open("demo.mp4"); // use cap.open(0) to capture the default webcam instead of a file
    if (!cap.isOpened())
    {
        std::cerr << "VideoCapture not opened\n";
        exit(EXIT_FAILURE);
    }

    std::vector<int> params = {cv::IMWRITE_JPEG_QUALITY, 90};

    MJPEGStreamer streamer;

    // By default 1 worker is used for streaming
    // if you want to use 4 workers:
    //      streamer.start(8000, 4);
    streamer.start(8000);

    // Visit /shutdown or another defined target to stop the loop and shut down gracefully
    while (streamer.isAlive())
    {
        cv::Mat frame;
        cap >> frame;
        if (frame.empty())
        {
            std::cerr << "frame not grabbed\n";
            //continue;
            exit(EXIT_FAILURE);
        }

        // http://localhost:8000/bgr
        std::vector<uchar> buff_bgr;
        cv::imencode(".jpg", frame, buff_bgr, params);
        streamer.publish("/bgr", std::string(buff_bgr.begin(), buff_bgr.end()));

        cv::Mat hsv;
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

        // http://localhost:8000/hsv
        std::vector<uchar> buff_hsv;
        cv::imencode(".jpg", hsv, buff_hsv, params);
        streamer.publish("/hsv", std::string(buff_hsv.begin(), buff_hsv.end()));

        // std::cout<< "published" << std::endl;
    }

    streamer.stop();
}

Write the CMakeLists.txt

cmake_minimum_required(VERSION 3.1)

project(mjpeg_streamer CXX)

find_package(OpenCV 4.2 REQUIRED)
find_package(nadjieb_mjpeg_streamer REQUIRED)

include_directories(${OpenCV_INCLUDE_DIRS})

add_executable(stream_test
  "mjpeg_server.cc")
target_compile_features(stream_test PRIVATE cxx_std_11)
target_link_libraries(stream_test PRIVATE nadjieb_mjpeg_streamer::nadjieb_mjpeg_streamer
                     ${OpenCV_LIBS})


The directory layout should look like this (demo.mp4 goes inside build/):

| --- mjpeg_server.cc
| --- CMakeLists.txt
| --- ...
| --- build
      | --- demo.mp4
      | --- ...

Now, we can build the streamer.

mkdir build && cd build;
cmake ../;
make;
./stream_test

Now, if you go to "http://ip_address:port/bgr" or "http://ip_address:port/hsv" you should be able to see the stream. In my case, ip = 192.168.1.7 / localhost and port = 8000.

If you want to embed the stream in a page served by another web server:

index.html

<html>
  <body>
    <img src="http://localhost:8000/bgr">
    <img src="http://localhost:8000/hsv">
  </body>
</html>

serve.py

import http.server
import socketserver

class MyHttpRequestHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/':
            self.path = 'index.html'
        return http.server.SimpleHTTPRequestHandler.do_GET(self)

# Create an object of the above class
handler_object = MyHttpRequestHandler

PORT = 8080
my_server = socketserver.TCPServer(("", PORT), handler_object)

# Start the server
my_server.serve_forever()

python3 serve.py

Finally, keep in mind that while this setup is extremely simple, it is not secure.

Zabir Al Nazi

So I found a solution myself. The concept is like this:

My server is a WebSocket server built with the POCO library.

Server:
Initialize the camera in the main thread (it has to be initialized there). After a WebSocket connection is established, the server captures a frame from cv::VideoCapture, converts the frame to JPEG, encodes the image to a Base64 string, and finally sends that string back to the client.

Browser:
In the browser, the received Base64 string can be interpreted as an image by the img tag.

<img id="image" src="" width="1280" height="720"/>  

ws.onmessage = function(evt)
{
  $("#image").attr('src',  'data:image/jpg;base64,'+ evt.data);
};

So if the server now sends 30 frames per second, there is a smooth live stream in the browser.

dab0bby