
I come from an Android development background, so I know very little about web apps. I am working on document detection functionality, which I have achieved on the mobile app side using the following library project.

https://github.com/ctodobom/OpenNoteScanner

In that code a custom camera API is used to provide the real-time camera frames, which are then passed to OpenCV. OpenCV detects the document edges and returns the points, which are used to draw the detected edges on a canvas.

The problem is that we need the same functionality in a web app, so please advise on the following:

  1. Can we create a custom camera using a camera API in web apps?
     If yes, then:

  2. How can we get the real-time camera frames in a web app? (See the sketch below this list for what I am considering.)
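From what I have read, the browser's `getUserMedia` API (part of WebRTC) should be able to act as the custom camera for (1). Below is a minimal sketch of how I think it would work; the constraint values (`facingMode`, resolution) are placeholders I picked for illustration, and note that `getUserMedia` only runs in a secure context (HTTPS or localhost).

```javascript
// Minimal sketch: open the device camera in the browser with getUserMedia.
// The constraint values below are placeholders, not tested settings.
async function openCamera() {
  const video = document.createElement('video');
  video.autoplay = true;
  video.playsInline = true; // keeps iOS Safari from forcing fullscreen playback
  document.body.appendChild(video);

  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment', width: { ideal: 1280 }, height: { ideal: 720 } },
    audio: false,
  });
  video.srcObject = stream;
  await video.play();
  return video;
}
```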

What I have found so far:

The following link points to the source code of a project that opens a camera in a mobile web browser. It uses WebRTC, but I am not able to find a way to grab the frames.

https://github.com/apal21/stream-user-video-from-device-webrtc
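As far as I can tell, the usual way to grab frames from such a WebRTC video element is to draw it onto a canvas and read the pixels back with `getImageData`. A rough sketch, continuing from the hypothetical `openCamera()` helper above; `processFrame` is just a placeholder for whatever per-frame detection code runs:

```javascript
// Sketch: copy each video frame onto a canvas and hand the pixels to a callback.
// processFrame is a placeholder for the document-detection step (e.g. OpenCV.js).
function grabFrames(video, processFrame) {
  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');

  function loop() {
    if (video.videoWidth > 0) {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      ctx.drawImage(video, 0, 0);
      const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
      processFrame(frame); // e.g. cv.matFromImageData(frame) with OpenCV.js
    }
    requestAnimationFrame(loop); // roughly one grab per display refresh
  }
  requestAnimationFrame(loop);
}

// Usage (assuming openCamera() from the sketch above):
// openCamera().then(video => grabFrames(video, frame => { /* detect edges */ }));
```

OpenCV.js exposes `cv.matFromImageData()`, so the `ImageData` from the canvas could be converted to a `cv.Mat` and run through the same edge-detection approach OpenNoteScanner uses natively, though I have not verified how fast that is in a mobile browser.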

  • Will you be streaming the frames to server for processing? – zindarod Sep 05 '17 at 12:29
  • Nope. For document detection I will be using OpenCV at the client end. Still need to figure that part out as well. – Nitesh Sep 05 '17 at 12:32
  • Well OpenCV can be built with `gstreamer` support. Gstreamer can open your camera and capture frames in real time. You can use it to stream frames locally or to the server. https://stackoverflow.com/questions/25810640/gstreamer-tcpserversink-v0-10-vs-1-0-and-html5-video-tag https://coaxion.net/blog/2013/10/streaming-gstreamer-pipelines-via-http/ https://gist.github.com/tetkuz/0c038321d05586841897 – zindarod Sep 05 '17 at 12:38
