I'm trying to use DPI and touch input in my Unity game to recognise gestures consistently across Android devices of different sizes.
For instance, I want my virtual thumbstick to be two inches from "full left" to "full right". Given the DPI of the screen, I can convert touches from pixels to inches like this: inchesFromStickCenter = pixelsFromStickCenter / dpi.
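In code, that conversion looks something like this (just a sketch; ThumbstickInput, the stick-center field, and the one-inch radius are placeholders for my setup):

```
using UnityEngine;

public class ThumbstickInput : MonoBehaviour
{
    public Vector2 stickCenterPixels;    // set from the on-screen stick's position
    public float stickRadiusInches = 1f; // one inch each way = two inches of travel

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        float dpi = Screen.dpi; // or a value read from DisplayMetrics, see below

        // Convert the pixel offset from the stick center into physical inches.
        Vector2 offsetInches = (touch.position - stickCenterPixels) / dpi;

        // Normalised stick value, clamped to the unit circle.
        Vector2 stickValue = Vector2.ClampMagnitude(offsetInches / stickRadiusInches, 1f);
        // ... feed stickValue into movement ...
    }
}
```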
To get DPI, Unity offers Screen.dpi, but I was getting inconsistent results across devices: sometimes the thumbstick was way too big, sometimes way too small. Instead I went straight to Android's DisplayMetrics, where I could get xdpi, ydpi, and densityDpi. From the question "Difference between the declared size in inches and the size obtained using DisplayMetrics" I see that densityDpi is a rounded value, and I should probably be using xdpi or ydpi.
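For reference, this is roughly how I'm pulling those values out of DisplayMetrics from Unity (a minimal sketch; the DisplayMetrics field names are the real Android ones, but the wrapper class and method are just mine):

```
using UnityEngine;

public static class AndroidDpi
{
    // Reads xdpi, ydpi and densityDpi from the current activity's
    // DisplayMetrics. Falls back to Screen.dpi off-device.
    public static void Read(out float xdpi, out float ydpi, out int densityDpi)
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var resources = activity.Call<AndroidJavaObject>("getResources"))
        using (var metrics = resources.Call<AndroidJavaObject>("getDisplayMetrics"))
        {
            xdpi = metrics.Get<float>("xdpi");
            ydpi = metrics.Get<float>("ydpi");
            densityDpi = metrics.Get<int>("densityDpi");
        }
#else
        xdpi = Screen.dpi;
        ydpi = Screen.dpi;
        densityDpi = (int)Screen.dpi;
#endif
    }
}
```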
I tested things by trying to compute the width of the screen in inches (mine is landscape; all these examples assume landscape). When I divide the screen pixel width (1280) by xdpi (195.4) on my 1st-gen Nexus 7, it overestimates the screen width by half an inch (6.55 inches, compared to just under 6 when measured with a ruler). When I divide by densityDpi (213), I get a much better answer. The Wikipedia page for the Nexus 7 says the DPI is 215, which would also give a good answer.
When I test on my Galaxy S2, xdpi (217) gives a good screen-size estimate, while densityDpi (240) underestimates it by a third of an inch.
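For what it's worth, the sanity check I'm logging on each device looks like this (a sketch, using the AndroidDpi helper from above):

```
// Compare the physical width each DPI source predicts against a ruler.
void LogEstimatedWidths()
{
    float xdpi, ydpi;
    int densityDpi;
    AndroidDpi.Read(out xdpi, out ydpi, out densityDpi);

    Debug.Log("width via Screen.dpi: " + (Screen.width / Screen.dpi) + " in");
    Debug.Log("width via xdpi:       " + (Screen.width / xdpi) + " in");
    Debug.Log("width via densityDpi: " + (Screen.width / (float)densityDpi) + " in");
}
```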
So I can't depend on either of these numbers! Why doesn't either N7 value match the Wikipedia figure? Is this a silly way of trying to convert pixels to real-world inches? What should I be doing instead?
Cheers!