8

I have an ImageView that you can use to do a one-finger pan or a two-finger scale. It works fine. I've extended it to handle rotation, and its behaviour is causing some confusion.

When there's a multi-touch event this method gets called, which should return an angle of rotation in degrees. It is not doing what I expect.

private int fingersAngle(MotionEvent event)
{
    // Vector from the second finger to the first
    float x = event.getX(0) - event.getX(1);
    float y = event.getY(0) - event.getY(1);
    // atan2 returns radians in (-pi, pi], so this gives roughly -180..180 degrees
    int degrees = (int) Math.toDegrees(Math.atan2(y, x));
    return degrees;
}

As I rotate the two fingers I'd expect outputs like...

158 166 168 169 174 176 179 181 etc

But what I actually get is more like this:

158 166 -179 179 -179 179 -179

The problem seems to be with signs. How does this method know whether it's 180 or -180, or 90 or -270? The image often rotates the wrong way, and then suddenly jumps and starts rotating the opposite way. Even worse, the direction it initially rotates is effectively random.

I'm testing the app using a Nexus One, but also see the same problem on an Advent Vega tablet. Both devices work ok with Google Maps in 3D to rotate the screen (if a bit jumpy sometimes) so the evidence doesn't suggest a hardware limitation.

A secondary problem is that when the two fingers are approximately vertically or horizontally aligned the angle simply doesn't change, so the rotation "sticks" for about 10-20 degrees top/bottom/left/right.

Currently I'm doing a check to see if the angle has suddenly changed a huge amount, and if so, to subtract it from 360. Ugly as hell, but it helps a little.
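
Roughly, it's this kind of guard (a rough sketch for illustration; the field name and the details are simplified, not my real code):

private int lastDegrees = 0;

private int unwrapAngle(int degrees)
{
    // If the new reading is more than half a turn away from the last one,
    // assume we crossed the +/-180 boundary and unwrap instead of jumping.
    while (degrees - lastDegrees > 180)
    {
        degrees -= 360;
    }
    while (degrees - lastDegrees < -180)
    {
        degrees += 360;
    }
    lastDegrees = degrees;
    return degrees;
}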

Hopefully one of you has seen this before and knows how to extract the angular rotation from a multi-touch gesture.

Some things in Android are so easy it's amazing, but stuff like this is seriously painful.

Ollie C
  • I don't know anything about this, but if all you have are 2 points how can you know anything about the rotation? The rotation has to be relative to some point, doesn't it? Don't you need to know the coordinates where the motion started (ACTION_DOWN according to the Android docs), then work out the centre point between them, then when you get motion events which describe subsequent movement (and before the fingers are lifted (ACTION_MOVE according to the Android docs)) you work out the rotation of those around that original centre point... or something along those lines... – Sam Holder Jun 15 '11 at 18:13
  • I'm measuring rotation relative to the angle between the horizontal axis and the start point of the drag. – Ollie C Jun 15 '11 at 18:25
  • @OllieC does [this question](http://stackoverflow.com/questions/4080070/help-to-calculate-atan2-properly) help – Sam Holder Jun 15 '11 at 18:31
  • my suggestion would be to write unit tests which you can feed with values you know and assert the results you expect, then you can ensure you get the function working properly before you use it with the touch events. – Sam Holder Jun 15 '11 at 18:35
  • Don't think so. AFAICS the ImageView rotations works fine with angles of -180 to 180. I tried adding 180 to it, and it made no difference. – Ollie C Jun 15 '11 at 18:37
  • Sam, I'm asking for help with how to approach writing the method. No point writing unit tests until I have a method that has a chance of working. – Ollie C Jun 15 '11 at 18:52
  • @Ollie C, what about [this question](http://stackoverflow.com/questions/1311049/how-to-map-atan2-to-degrees-0-360) – Sam Holder Jun 15 '11 at 19:16
  • the [wikipedia entry](http://en.wikipedia.org/wiki/Atan2) should give you some inputs with known outputs (atan2(0,1)=0, atan2(√3,1)=60 degrees, atan2(1,0)=90, atan2(0,-1)=180, etc.) – Sam Holder Jun 15 '11 at 19:26
  • Sam, appreciate the link, but I don't have time to learn extensive amounts of maths, just need to get the project done. I'm hoping someone has seen this before and can tell me the solution, which I expect is easy once you know how. I'm still surprised it's so damn hard to do something so simple. – Ollie C Jun 15 '11 at 19:45
  • @OllieC did you ever get this working? – Jubei Nov 14 '12 at 02:36

4 Answers

4

To make this work like users expect, you must keep some state.

An arc-tangent will only tell you the angle between two points—an instantaneous snapshot of the system. But you are performing an animation—something that happens over time.

So, I think that you are on the right track with your "ugly" solution that tries to detect large changes in the angle. What you really should be tracking over time is the position of each finger.

Suppose a user is using their thumb and index finger to gesture. You need to make sure that their digits are consistently represented, e.g. the thumb is always represented by (x0, y0), and the index finger by (x1, y1).

It's likely that the device has some rule how each coordinate is reported. For example, maybe it will always report the upper-left coordinate in the "0" slot of the event, and the lower-right coordinate in the "1" slot. But if the user moves their thumb from lower-right to upper-left, you must make sure that you are still treating it as (x1, y1) in your angle formula, even though the device is now reporting it as (x0, y0) in the event. Otherwise, the image will appear to quickly flip 180°, back and forth.

If the device doesn't track fingers for you, you'll have to come up with a heuristic for guessing. A simple "which of the previous coordinates is closer to the current coordinate" check would be a good place to start, but I could see the need for something fancier if the resolution of events is poor.
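
On Android specifically, MotionEvent does track fingers for you through pointer IDs, which stay stable even when the pointer indices get reshuffled. A minimal sketch of wiring that into the angle calculation (exactly where you capture the IDs, e.g. on ACTION_POINTER_DOWN, is up to you):

private int firstPointerId = -1;
private int secondPointerId = -1;

// Call this when the second finger goes down (ACTION_POINTER_DOWN).
private void captureFingers(MotionEvent event)
{
    // Indices 0 and 1 are only meaningful right now; store the IDs,
    // which remain stable for the rest of the gesture.
    firstPointerId = event.getPointerId(0);
    secondPointerId = event.getPointerId(1);
}

private int fingersAngle(MotionEvent event)
{
    int i0 = event.findPointerIndex(firstPointerId);
    int i1 = event.findPointerIndex(secondPointerId);
    if (i0 < 0 || i1 < 0)
    {
        return 0; // one of the fingers has been lifted
    }
    float x = event.getX(i0) - event.getX(i1);
    float y = event.getY(i0) - event.getY(i1);
    return (int) Math.toDegrees(Math.atan2(y, x));
}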

erickson
1

I've never run into this problem, but my guess is that atan2 is forcing your angles into the -180 to +180 range.

But the rotation you want to apply to your image should be from 0 to 360.

Easiest solution would probably be to add 180 to your degrees variable before returning it.
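
For illustration, that's a one-line change (the method name is just for the example, and whether a 0 to 360 range actually changes the jumping behaviour is another matter):

private int fingersAngle0to360(MotionEvent event)
{
    float x = event.getX(0) - event.getX(1);
    float y = event.getY(0) - event.getY(1);
    int degrees = (int) Math.toDegrees(Math.atan2(y, x)); // -180..180
    return degrees + 180;                                 // 0..360
    // or: ((degrees % 360) + 360) % 360 to keep 0 pointing the same way
}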

EDIT: Since you added in the comments that there is randomness involved, I think that depending on the device you are working with you could be encountering the infamous HTC (and other early devices) multitouch bug: http://www.youtube.com/watch?v=Ds5qZ_3XRzI That video shows it on the Nexus One, but the Hero and some Motorola devices had the same problems.

Yahel
0

atan2() is returning a value in the range (-180..+180).

Given this, does it now seem to be working correctly?

(if you are at +179 degrees, rotating a further three degrees would give +182, but that is represented as -178)
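
One common way to make the rotation continuous across that boundary is to accumulate frame-to-frame deltas and normalise each delta back into the -180 to +180 range; a rough sketch (the field and method names are just for the example):

private float lastAngle;
private float totalRotation;

// Call with the latest fingersAngle() result on each ACTION_MOVE.
private void accumulateRotation(float currentAngle)
{
    float delta = currentAngle - lastAngle;
    // Bring the delta back into (-180, 180] so the shortest direction wins
    // and crossing the +/-180 boundary doesn't cause a jump.
    while (delta > 180)   delta -= 360;
    while (delta <= -180) delta += 360;
    totalRotation += delta;
    lastAngle = currentAngle;
}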

Daniel Chisholm
  • I'm not sure that makes any difference. The issue isn't so much from what range the returned value comes from, but the fact that it keeps changing (and "sticks" at the four axes). – Ollie C Jun 15 '11 at 18:14
  • It's hard to assess what it's doing because the behaviour has a random element both when the drag starts (which way) and it then sticks and jumps during the drag. It's very hard to see what's happening. – Ollie C Jun 15 '11 at 18:15
0

Check whether you can read 2 points simultaneously and fully independently. I'm not sure if the Nexus One does. Check this page:

http://developer.android.com/reference/android/content/pm/PackageManager.html

and find out if your target device supports just FEATURE_TOUCHSCREEN_MULTITOUCH or better (i.e. FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT). Probably easiest to just install a test app - I think I used 'multitouch visible test'.
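
Checking it programmatically is also easy; something like this from an Activity (the log tag is arbitrary):

// Ask the package manager whether fully independent touch points are supported.
boolean distinct = getPackageManager()
        .hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT);
Log.d("TouchCheck", "distinct multitouch supported: " + distinct);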

If your device doesn't, then there's no fix or workaround. It's a driver limitation, generally caused by a crappy choice of touchscreen on the part of penny-pinching manufacturers.

  • As I said above, I can do a two-finger rotate in Google Maps on the Nexus One, so the device does support it. It is jumpy at times, but it does work. Likewise for the tablet, which although cheap works better than the Nexus One. But my app fails to work properly at all on either. – Ollie C Jun 15 '11 at 19:44