
I am working on an HP tablet with Windows 8.1. We developed a web application, accessed from the tablet in the Chrome browser, that uses the getUserMedia API to access the tablet's webcam (the implementation is simple JavaScript, similar to this example: https://davidwalsh.name/demo/camera.php). Our application will be used to take photos of identity cards and then submit them to a servlet.
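
For reference, my capture path looks roughly like this (a simplified sketch; the element id and the servlet URL `/upload` are placeholders, not my actual names):

```javascript
// Simplified sketch of the capture path; the element id and "/upload"
// are placeholders.
const video = document.getElementById('video');

// Attach the webcam stream to a <video> element for live preview.
navigator.mediaDevices.getUserMedia({ video: true })
  .then(stream => { video.srcObject = stream; })
  .catch(err => console.error('getUserMedia failed:', err));

function snapshotAndUpload() {
  // Draw the current video frame onto a canvas...
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0);

  // ...then encode it as JPEG and POST it to the servlet.
  canvas.toBlob(blob => {
    const form = new FormData();
    form.append('photo', blob, 'id-card.jpg');
    fetch('/upload', { method: 'POST', body: form });
  }, 'image/jpeg', 0.95);
}
```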

The quality of the picture taken inside the browser using the getUserMedia API is quite poor, and the letters on the identity cards are sometimes hard to read in the image. If I use the "Camera" application from Windows 8.1 on the same tablet and take pictures of the same identity cards, under the same lighting conditions and from the same distance, the resulting images (JPEG) are very clear.

Why is there such a difference in quality? I have read everything I could find about the getUserMedia API and tried all the available parameters (constraints, width, height, JPEG quality), but I cannot obtain a good-quality image. Why does the same camera on the same tablet produce such different quality in the browser than in the Windows Camera application, and is there a way to obtain better quality in the browser (for example, by developing a custom plugin)?
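
For context, the constraints I experimented with look roughly like this (a sketch; the resolution values are just examples, and the camera may silently deliver a lower-resolution stream than requested):

```javascript
// Request a higher-resolution stream; the browser may still deliver
// less than requested, which caps the quality of any snapshot.
const constraints = {
  video: {
    width:  { ideal: 1920 },
    height: { ideal: 1080 },
    facingMode: 'environment'  // prefer the rear camera, if present
  }
};

navigator.mediaDevices.getUserMedia(constraints)
  .then(stream => {
    // Check what resolution the browser actually delivered.
    const settings = stream.getVideoTracks()[0].getSettings();
    console.log('actual resolution:', settings.width, 'x', settings.height);
  });

// The JPEG quality argument only affects how the captured frame is
// encoded; it cannot add detail the stream never had:
// canvas.toDataURL('image/jpeg', 0.95);
```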

cris23
  • I think one of the reasons is that, at the moment, the getUserMedia API takes raw frames, whereas the "Camera" application configures the camera lens and settings just before a photo is taken for optimal quality. There is also a draft for a new `takePhoto()` API; see https://www.w3.org/TR/image-capture/ – Walle Cyril Jun 06 '16 at 11:44
  • Thanks for the answer. I found the PhotoCam control from Primefaces (http://www.primefaces.org/showcase/ui/multimedia/photoCam.xhtml), which uses Flash instead of HTML5 (if you set the forceFlash flag to true). The quality I got with the Flash approach is much better than the one I got with the HTML5 approach. – cris23 Jun 08 '16 at 08:23

1 Answer


To answer your question "Why is this difference in quality?": in short, it is because the browser emulates the camera feed and does image transformation under the hood, so that it can send different streams to different clients. WebRTC's main focus is P2P media streaming, not taking high-quality photos.

You can use ImageCapture to access more camera properties (and to grab the frame as an ImageBitmap), but browser support right now is still very weak.
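
For illustration, the ImageCapture path looks roughly like this (a sketch assuming a browser that implements the draft spec; `takePhoto()` and `grabFrame()` are the draft's methods):

```javascript
// Sketch of the Image Capture draft API; it only works in browsers
// that implement the spec, and support is still very limited.
navigator.mediaDevices.getUserMedia({ video: true })
  .then(stream => {
    const track = stream.getVideoTracks()[0];
    const imageCapture = new ImageCapture(track);

    // takePhoto() asks the camera for a full-resolution still image,
    // instead of grabbing a frame from the (downscaled) video feed.
    // grabFrame() would instead return the current frame as an ImageBitmap.
    return imageCapture.takePhoto();
  })
  .then(blob => {
    // blob is the encoded photo (typically JPEG).
    document.querySelector('img').src = URL.createObjectURL(blob);
  })
  .catch(err => console.error('ImageCapture failed:', err));
```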

Please read my answers below, which go into more depth on MediaCapture and ImageCapture and their respective use cases:

Why the difference in native camera resolution -vs- getUserMedia on iPad / iOS?

Take photo when the camera is automatically focused

Marcus