Android surely has some reliable way to determine what the user meant: a click or a scroll/move. Are there any simple rules that would be enough to build the same behaviour?
Here is my problem.
I have a control that should either be clicked, running its own click handler, or dragged (tap and move), making the whole view scroll (I have a custom view that scrolls in both directions, here is the idea if that matters). So I set a simple touch listener that remembers the point in onTouch on MotionEvent.ACTION_DOWN and calls the click handler on MotionEvent.ACTION_UP if the point hasn't changed.
That worked fine in the emulator because mouse clicks are precise, so I didn't notice any danger, and I experienced no problems on my Samsung Galaxy S Plus either; perhaps I'm just accurate when tapping the screen.
However, my first beta testers complained that the buttons don't work: not always, but most of the time, so they had to tap a button several times before it handled the click. Some of them also told me that the buttons work slightly better when they touch the screen with the very tip of a finger.
After some thought, I assumed that they might move the finger slightly while clicking, so the points on DOWN and UP differed and the handler didn't detect a click. To verify this, I added a simple improvement that allows a small drift, and it helped! The testers confirmed that the trick makes things much better. Here is the final code of the handler:
// This interface is for a real click handler that may differ depending on the control.
interface Clicker {
    boolean click(View v);
}

class ButtonTouchListener implements OnTouchListener {
    private boolean _trackingButtonClick = false;
    private Clicker _clicker;
    // These two integers are the improvement.
    private int downX = 0;
    private int downY = 0;

    public ButtonTouchListener(Clicker clicker) {
        _clicker = clicker;
    }

    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                downX = (int) event.getRawX();
                downY = (int) event.getRawY();
                _trackingButtonClick = true;
                // Calling the onTouchEvent of the main view that handles the global scrolling.
                return onTouchEvent(event);
            case MotionEvent.ACTION_MOVE:
            case MotionEvent.ACTION_OUTSIDE:
                int newX = (int) event.getRawX();
                int newY = (int) event.getRawY();
                int xOffset = downX - newX;
                int yOffset = downY - newY;
                // These two lines are the improvement as well.
                // 10 is a maximum move distance that I've simply guessed.
                if (Math.abs(xOffset) >= 10 || Math.abs(yOffset) >= 10)
                    _trackingButtonClick = false;
                // Calling the onTouchEvent of the main view that handles the global scrolling.
                return onTouchEvent(event);
            case MotionEvent.ACTION_UP:
                if (_trackingButtonClick) {
                    _trackingButtonClick = false;
                    // Calling the click handler of the control.
                    return _clicker.click((View) v.getTag());
                }
                _trackingButtonClick = false;
                return true;
            default:
                return onTouchEvent(event);
        }
    }
}
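For reference, the click-vs-drag decision itself can be isolated from the Android plumbing. Below is a minimal, plain-Java sketch of that logic; the class name `ClickDetector` is mine, and the threshold is still a guessed constant, though I suspect the platform exposes a proper per-device value (something like `ViewConfiguration.get(context).getScaledTouchSlop()`) that could replace it:

```java
// Hypothetical helper isolating the "did the finger stay put?" decision.
// On Android, slopPx would presumably come from
// ViewConfiguration.get(context).getScaledTouchSlop() instead of a guess.
final class ClickDetector {
    private final int slopPx;
    private int downX, downY;
    private boolean tracking;

    ClickDetector(int slopPx) {
        this.slopPx = slopPx;
    }

    // Call on ACTION_DOWN: remember the starting point.
    void onDown(int x, int y) {
        downX = x;
        downY = y;
        tracking = true;
    }

    // Call on ACTION_MOVE: give up on the click once the drift exceeds the slop.
    void onMove(int x, int y) {
        if (Math.abs(x - downX) >= slopPx || Math.abs(y - downY) >= slopPx)
            tracking = false;
    }

    // Call on ACTION_UP: returns true if the gesture still qualifies as a click.
    boolean onUp(int x, int y) {
        onMove(x, y);
        boolean wasClick = tracking;
        tracking = false;
        return wasClick;
    }
}
```

With a slop of 10, a tap that drifts a few pixels still counts as a click, while a 50-pixel move does not.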
So it looks like I made the right guess, and Android likely has something similar inside. But does it? Does anybody know? I'm not familiar with the platform sources and hope not to have to dig through them, so I'd appreciate it if someone shared their experience.