9

Is there any body of evidence that we could reference to help determine whether a person is using a device (smartphone/tablet) with their left hand or right hand?

My hunch is that you may be able to use accelerometer data to detect a slight tilt, perhaps only while the user is manipulating some sort of on-screen input.

The answer I'm looking for would state something like, "research shows that 90% of right-handed users who utilize an input mechanism tilt their phone an average of 5° while inputting data, while 90% of left-handed users utilizing an input mechanism have their phone tilted an average of -5°".

Having this data, one could read accelerometer data and make informed decisions about the placement of on-screen items that might otherwise be in the way for left-handed or right-handed users.
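For what it's worth, here is roughly the kind of detection I'm imagining, as a minimal browser sketch using the DeviceOrientationEvent API. The ±5° threshold is purely a placeholder I made up, and whether positive tilt actually maps to right-handed use is exactly the evidence I'm asking for (on iOS this would also need a motion-permission prompt):

```javascript
// Illustrative sketch only: sample the left/right tilt (gamma, in degrees)
// while a text field is focused, then guess the input hand from the average.
// The 5° threshold is a made-up placeholder, not a researched figure.
const samples = [];

window.addEventListener('deviceorientation', (e) => {
  const el = document.activeElement;
  // Only record tilt while the user is actually typing into a field.
  if (el && (el.tagName === 'INPUT' || el.tagName === 'TEXTAREA')) {
    samples.push(e.gamma); // gamma: rotation around the device's front-to-back axis
  }
});

function guessInputHand() {
  if (samples.length === 0) return 'unknown';
  const avgTilt = samples.reduce((a, b) => a + b, 0) / samples.length;
  if (avgTilt > 5) return 'probably-right';  // assumption: tilt toward the right edge
  if (avgTilt < -5) return 'probably-left';  // assumption: tilt toward the left edge
  return 'unknown';
}
```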

Jason D.
  • Excellent question! The precise way in which one swipes across the screen might also give it away. You'd need to handle or consciously ignore lots of edge cases, e.g. using the phone in bed. You can probably even detect when someone's in bed: the device wants to be landscape, but is locked to portrait. – chiastic-security Dec 31 '14 at 11:08
  • I don't know how relevant this is but check these out. [link 1](http://ux.stackexchange.com/questions/36104/should-you-optimize-mobile-experiences-based-on-individual-handedness) [link 2](http://www.uxmatters.com/mt/archives/2013/02/how-do-users-really-hold-mobile-devices.php) – rakeshbs Dec 31 '14 at 11:11
  • Thank you for the links. I agree that the question shouldn't be about handedness per se, but it is about which hand is currently doing the input, and in that respect is sort of about proximate handedness. – Jason D. Dec 31 '14 at 11:20
  • Also, anyone else find it ironic that the ux matters site (link 2 above) took care to ensure the share button shows up in the fixed bottom right corner, but the site is not mobile friendly? – Jason D. Dec 31 '14 at 11:22

2 Answers

11

You can definitely do this, but if it were me, I'd try a less complicated approach. First, recognize that no specific approach will yield 100% accurate results: they will be guesses, but hopefully highly probable ones. With that said, I'd explore the simple-to-capture data points of basic touch events. You can leverage these data points and pull the x/y coordinates at the start and end of each touch:

`touchstart`: Triggers when the user makes contact with the touch surface and creates a touch point inside the element the event is bound to.

`touchend`: Triggers when the user removes a touch point from the surface.

Here's one way to do it. It could be reasoned that if a user is left-handed, they will use their left thumb to scroll up and down the page. Because of the way the thumb rotates, swiping up will naturally cause the arc of the swipe to bow outwards. In terms of touch events, if the X coordinate at `touchstart` is greater than the X coordinate at `touchend`, you could deduce they are left-handed. The opposite could be true for a right-handed person: for a swipe up, if the `touchstart` X is less than the `touchend` X, you could deduce they are right-handed. See here:

[Figures: swipe arcs traced by a left thumb and a right thumb]
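To make that comparison concrete, here's a rough sketch of the idea (my own illustration, not code from the tutorial linked below; the pixel thresholds and the "roughly vertical swipe" check are arbitrary assumptions):

```javascript
// Rough illustration of the start/end X comparison described above.
// The pixel thresholds are arbitrary assumptions, not tuned values.
let startTouch = null;

document.addEventListener('touchstart', (e) => {
  startTouch = { x: e.touches[0].clientX, y: e.touches[0].clientY };
});

document.addEventListener('touchend', (e) => {
  if (!startTouch) return;
  const end = { x: e.changedTouches[0].clientX, y: e.changedTouches[0].clientY };
  const dx = end.x - startTouch.x;
  const dy = end.y - startTouch.y;

  // Only consider roughly vertical, upward swipes (e.g. scrolling).
  if (dy < -50 && Math.abs(dy) > Math.abs(dx)) {
    if (dx < -20) {
      console.log('Arc drifts left: possibly a left thumb');
    } else if (dx > 20) {
      console.log('Arc drifts right: possibly a right thumb');
    }
  }
  startTouch = null;
});
```

In practice you'd want to accumulate many swipes and vote on the result rather than trust a single gesture.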

Here's one reference on getting started with touch events. Good luck!

http://www.javascriptkit.com/javatutors/touchevents.shtml

Collarbone
  • As pointed out, no technique will work 100% of the time, but I think this is an excellent answer. I realize I probably should not have suggested the domain of the answer within the question itself. I would suggest, as I swipe back through your answer while commenting, that the touch-start "x" value might serve as a secondary guess. Since I'm holding the phone with my left hand and swyping my comment with my index finger, I'm also swiping back through your answer using my index finger, which does not create the arc, but does start on the right side of the screen. – Jason D. May 14 '15 at 11:51
  • That's a fair point. Another approach you could take to be more sure is to measure the surface area of the touch object (in this case the finger) by tracking radiusX and radiusY. The thumb takes considerably more surface area than the pointer finger, both by actual size and by the amount of finger touching the screen. While you're right that the angle will be different when using a finger of the hand not holding the phone rather than the thumb of the hand holding it, the arc is naturally still there, just to a lesser degree (see the sketch after these comments). Let me know if that works out for you. – Collarbone May 14 '15 at 22:57
  • Testing to see what I do with my right-handed thumb, I move in a fairly straight vertical pattern, but I do so close to the right edge of the screen, because my phone is too large for me to even viably reach the left edge with my thumb. This seems like it would be true of all devices I'd use except very small ones... then it falls apart. Which is fine; you can just improve your detection by asking for the physical screen size first. http://stackoverflow.com/questions/21680629/getting-the-physical-screen-dimensions-dpi-pixel-density-in-chrome-on-androi – Chris Moschini Apr 06 '17 at 21:44
  • I honestly hoped there would be some built-in property in the browser accessible right off the first render, like `navigator.language` but for handedness lol. Naive guy – Jerry Green Nov 27 '20 at 10:33
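A rough sketch of the contact-area idea mentioned in the comments above, assuming the browser reports Touch.radiusX/radiusY (support varies) and using an arbitrary 15px cutoff:

```javascript
// Rough sketch: thumbs tend to produce a larger touch ellipse than index
// fingers. radiusX/radiusY support varies by browser, and the 15px cutoff
// is an arbitrary placeholder, not a tuned value.
document.addEventListener('touchstart', (e) => {
  const t = e.touches[0];
  if (typeof t.radiusX !== 'number') return; // contact size not reported here

  const contactRadius = Math.max(t.radiusX, t.radiusY);
  const likelyThumb = contactRadius > 15;
  console.log(likelyThumb ? 'Large contact: likely a thumb'
                          : 'Small contact: likely an index finger');
});
```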
4

There are multiple approaches and papers discussing this topic; however, most of them were written between 2012 and 2016. After doing some research myself, I came across a fairly new article that makes use of deep learning. What sparked my interest is the fact that they do not rely on swipe direction, speed, or position, but rather on the capacitive image each finger creates during a touch.

Highly recommend reading the full paper: http://huyle.de/wp-content/papercite-data/pdf/le2019investigating.pdf

What's even better, the data set, together with Python 3.6 scripts to preprocess the data and to train and test the model described in the paper, is released under the MIT license. They also provide the trained models and the software to run them on Android.

Git repo: https://github.com/interactionlab/CapFingerId

Niklas