
I am trying to create an application similar to the Google Nexus Camera Panorama/Photosphere feature. I want to display predefined 3D spherical points over the camera preview. I have used the Sensor Fusion Demo for displaying the points, but I am not able to achieve the stability that the Google Nexus Camera Panorama/Photosphere feature provides when the device is in motion.

Assuming a sphere around the device, I am trying to display points that lie on its equator. When I start the application the preview looks correct, but as I start to rotate the device, the equator of the sphere tilts at an angle if I use only gyroscope data (TYPE_GYROSCOPE in Android) for the rotation. When I use the rotation vector (TYPE_ROTATION_VECTOR in Android) instead, the points are not completely stable and the tilting problem occurs there as well.
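
For reference, this is roughly how I am feeding the rotation vector into the overlay. It is a simplified sketch, not my exact code: the class name and the `onRotationMatrix` callback are placeholders for my rendering code.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class RotationVectorSource implements SensorEventListener {

    private final float[] rotationMatrix = new float[16];

    public void start(SensorManager sensorManager) {
        Sensor rotationVector = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            // Convert the rotation vector into a 4x4 matrix that the GL
            // renderer can use as the view rotation for the equator points.
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            onRotationMatrix(rotationMatrix);
        }
    }

    private void onRotationMatrix(float[] matrix) {
        // Placeholder: pass the matrix to the renderer that draws the points.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not used.
    }
}
```

The gyroscope-only variant has the same structure, except that it integrates TYPE_GYROSCOPE angular rates over time instead of calling `getRotationMatrixFromVector`, and that is where the drift/tilt seems to creep in.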

I have tried every other sensor fusion approach that I could think of, for example a gyroscope + rotation-vector complementary filter along the lines sketched below. I cannot understand what I am missing or what else needs to be considered.
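
One of the fusion variants I tried looks roughly like this: integrate the gyroscope for short-term smoothness and pull the result toward TYPE_ROTATION_VECTOR to limit drift. This is only a sketch of the idea, not my exact code; the FILTER_COEFFICIENT value and the quaternion helpers are simplifications. Both TYPE_GYROSCOPE and TYPE_ROTATION_VECTOR are registered on this one listener.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class GyroRotationVectorFusion implements SensorEventListener {

    // Fraction of the gyro estimate kept at each step; the rest comes
    // from the rotation vector. 0.98 is only an example value.
    private static final float FILTER_COEFFICIENT = 0.98f;
    private static final float NS2S = 1.0f / 1_000_000_000.0f;

    private final float[] fusedQuat = {1f, 0f, 0f, 0f};    // (w, x, y, z)
    private final float[] rotVecQuat = {1f, 0f, 0f, 0f};
    private long lastGyroTimestamp;

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_GYROSCOPE:
                if (lastGyroTimestamp != 0) {
                    float dt = (event.timestamp - lastGyroTimestamp) * NS2S;
                    integrateGyro(fusedQuat, event.values, dt);
                    // Pull the gyro estimate slightly toward the rotation vector.
                    nlerp(fusedQuat, rotVecQuat, 1f - FILTER_COEFFICIENT);
                }
                lastGyroTimestamp = event.timestamp;
                break;
            case Sensor.TYPE_ROTATION_VECTOR:
                // Store the rotation vector as a (w, x, y, z) quaternion.
                SensorManager.getQuaternionFromVector(rotVecQuat, event.values);
                break;
            default:
                break;
        }
    }

    // Rotate q by the small rotation given by angular rate omega over time dt.
    private static void integrateGyro(float[] q, float[] omega, float dt) {
        float magnitude = (float) Math.sqrt(
                omega[0] * omega[0] + omega[1] * omega[1] + omega[2] * omega[2]);
        if (magnitude < 1e-9f) {
            return;
        }
        float thetaOverTwo = magnitude * dt / 2f;
        float dw = (float) Math.cos(thetaOverTwo);
        float s = (float) Math.sin(thetaOverTwo) / magnitude;
        float dx = omega[0] * s, dy = omega[1] * s, dz = omega[2] * s;
        // Hamilton product q = q * dq (gyro rates are in the body frame).
        float w = q[0] * dw - q[1] * dx - q[2] * dy - q[3] * dz;
        float x = q[0] * dx + q[1] * dw + q[2] * dz - q[3] * dy;
        float y = q[0] * dy - q[1] * dz + q[2] * dw + q[3] * dx;
        float z = q[0] * dz + q[1] * dy - q[2] * dx + q[3] * dw;
        q[0] = w; q[1] = x; q[2] = y; q[3] = z;
    }

    // Normalized linear interpolation of "from" toward "to" by factor t.
    private static void nlerp(float[] from, float[] to, float t) {
        float dot = from[0] * to[0] + from[1] * to[1] + from[2] * to[2] + from[3] * to[3];
        float sign = dot < 0 ? -1f : 1f;   // take the short way around
        float norm = 0f;
        for (int i = 0; i < 4; i++) {
            from[i] = (1f - t) * from[i] + t * sign * to[i];
            norm += from[i] * from[i];
        }
        norm = (float) Math.sqrt(norm);
        for (int i = 0; i < 4; i++) {
            from[i] /= norm;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not used.
    }
}
```

Even with this kind of filter the equator points still tilt, just more slowly than with the gyroscope alone.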

Can anyone help me decode/understand what the Google Nexus Camera Panorama/Photosphere feature does?

  • Without knowing too many details about the Google camera application and the problems involved, it's possible they also extract features from the live camera preview and use them for calibration, somewhat like the image-stitching post-processing for panorama images does. – harism Jul 28 '16 at 12:13
  • I think you are confusing the **Google camera application** with the **Google Nexus camera application**. Both provide a panorama feature, but the Nexus panorama is very different, and I think you are talking about the first one. – Pikanshu Kumar Jul 28 '16 at 12:34
  • @PikanshuKumar Have you found any solution? – Nominalista May 08 '19 at 08:21

0 Answers