
I have a C# Xamarin Android app that hosts a ReactJS app in a WebView.

When using this app on a touch-screen Android device, it appears that taps on the screen are occasionally ignored.

What appears to be going on is that the tap is interpreted as a tiny drag event, because there was some small directional movement during the tap.

Looking at the Android logs for failed taps, I noticed output like the following:

adb -d logcat -s CustomFrequencyManagerService

06-19 13:35:49.225  2945  9989 D CustomFrequencyManagerService: acquireDVFSLockLocked : type : DVFS_MIN_LIMIT  frequency : 839000  uid : 1000  pid : 2945  pkgName : GESTURE_DETECTED@CPU_MIN@49
06-19 13:35:49.781  2945  2945 D CustomFrequencyManagerService: releaseDVFSLockLocked : Getting Lock type frm List : DVFS_MIN_LIMIT  frequency : 839000  uid : 1000  pid : 2945  tag : GESTURE_DETECTED@CPU_MIN@49

Note the GESTURE_DETECTED part of the log entry.

However, for successful taps, CustomFrequencyManagerService produces no output in the log.

Looking at this from the ReactJS app's perspective:

I noticed that the failed taps emit the following events:

touchstart
touchend

While the events for a normal, successful tap are:

touchstart
touchend
mousedown
blur
mouseup
click

I could potentially change the ReactJS app to respond directly to touch events instead of click events, but I was wondering whether there is a way (hopefully programmatically, from the Android app) to alter the sensitivity with regard to what is interpreted as a drag as opposed to a click.

By installing an IOnTouchListener on the Android.Webkit.WebView

_webView.SetOnTouchListener(new GestureIgnoreTouchListener());

I was able to see at what movement threshold a click turned into a drag.

    using System;
    using Android.Views;

    public class GestureIgnoreTouchListener : Java.Lang.Object, View.IOnTouchListener
    {
        float _x;
        float _y;

        public bool OnTouch(View v, MotionEvent e)
        {
            if (e.Action == MotionEventActions.Down)
            {
                // Remember where the touch started.
                _x = e.RawX;
                _y = e.RawY;

                return false;
            }

            if (e.Action == MotionEventActions.Up)
            {
                // Measure how far the finger travelled between down and up.
                var diffX = e.RawX - _x;
                var diffY = e.RawY - _y;

                var distance = Math.Sqrt(Math.Pow(diffX, 2) + Math.Pow(diffY, 2));
                // Observed:
                // if the distance is 10 or less, the tap is interpreted as a click;
                // if the distance is 12 or greater, no click is emitted.
                Console.WriteLine(distance);

                return false;
            }

            return false;
        }
    }

Ideally, if the distance is between 10 and 50, I would like this to be considered a click rather than a drag. I could possibly create a synthetic click event in this case, but I'm hoping I can somehow influence whatever Android code is responsible for interpreting this as a drag.
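In case it helps, here is a rough, untested sketch of what that synthetic-click fallback might look like. The class name is illustrative, the 10–50 window comes from the measurements above, and it assumes the view-relative coordinates can be converted to CSS pixels by dividing by the display density (i.e. the page is not zoomed); the near-miss tap is forwarded to the element under the finger via `EvaluateJavascript`.

    using System;
    using Android.Views;
    using Android.Webkit;

    // Sketch: if the finger moved more than the click threshold but less than
    // 50 px, forward a synthetic click to the element under the touch point.
    public class ClickRescueTouchListener : Java.Lang.Object, View.IOnTouchListener
    {
        readonly WebView _webView;
        float _downX;
        float _downY;

        public ClickRescueTouchListener(WebView webView)
        {
            _webView = webView;
        }

        public bool OnTouch(View v, MotionEvent e)
        {
            if (e.Action == MotionEventActions.Down)
            {
                _downX = e.RawX;
                _downY = e.RawY;
            }
            else if (e.Action == MotionEventActions.Up)
            {
                var distance = Math.Sqrt(Math.Pow(e.RawX - _downX, 2) + Math.Pow(e.RawY - _downY, 2));
                if (distance > 10 && distance <= 50)
                {
                    // Convert view-relative pixels to CSS pixels. This assumes the
                    // page is not zoomed; otherwise the WebView scale would also
                    // need to be taken into account.
                    var density = v.Resources.DisplayMetrics.Density;
                    var cssX = (int)(e.GetX() / density);
                    var cssY = (int)(e.GetY() / density);

                    var js = $"(function() {{ var el = document.elementFromPoint({cssX}, {cssY}); if (el) el.click(); }})();";
                    _webView.EvaluateJavascript(js, null);
                }
            }

            // Always let the WebView carry on with its normal handling.
            return false;
        }
    }

Wiring it up would look the same as before: `_webView.SetOnTouchListener(new ClickRescueTouchListener(_webView));`.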

Tom
  • You could set an `OnDragListener` and, if it was not a drag, pass the event to an `OnClickListener` – M D P Jul 06 '20 at 19:05
  • (thanks for the suggestion) Only the TouchListener appears to work with webview. The Drag + Click ones never appear to be invoked. (I see other SO posts that suggest the same thing - https://stackoverflow.com/questions/3600017/setonclicklistener-not-response-on-android-webview) – Tom Jul 06 '20 at 19:46
  • Have you tried creating the drag/swipe gesture recognizer and assigning a threshold value? This might make anything less than the assigned value be interpreted as a tap, hopefully. – Alex Pappas Jan 19 '21 at 02:28

1 Answer


There are two approaches I've seen people use for this situation. The first you already mentioned: tell React, when you're in a touch-screen environment, to use tap events instead of click events.

The second is to take into account what Android refers to as "touch slop":

From https://developer.android.com/training/gestures/movement.html:

Because finger-based touch isn't always the most precise form of interaction, detecting touch events is often based more on movement than on simple contact. To help apps distinguish between movement-based gestures (such as a swipe) and non-movement gestures (such as a single tap), Android includes the notion of "touch slop". Touch slop refers to the distance in pixels a user's touch can wander before the gesture is interpreted as a movement-based gesture. For more discussion of this topic, see Managing Touch Events in a ViewGroup.

Google provides an example of one way to deal with it in a different context here: https://developer.android.com/training/gestures/viewgroup#vc, which you could probably adapt to your situation.
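As a diagnostic aid (not part of the linked example), the platform's slop value can be read from `ViewConfiguration` and logged next to the distance measured in your touch listener. Note this is only informative: the WebView's own gesture detection runs inside the Chromium renderer and may apply its own threshold, so reading this value does not by itself change what the WebView treats as a drag. A minimal sketch, with an illustrative helper name:

    using Android.Content;
    using Android.Views;

    public static class TouchSlopHelper
    {
        // Distance in pixels a touch can wander before the framework
        // considers the gesture a scroll/drag rather than a tap.
        public static int GetScaledTouchSlop(Context context)
        {
            return ViewConfiguration.Get(context).ScaledTouchSlop;
        }
    }

Inside GestureIgnoreTouchListener.OnTouch you could then log both values side by side, e.g. `Console.WriteLine($"distance={distance}, touch slop={TouchSlopHelper.GetScaledTouchSlop(v.Context)}");`.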

lfalin