
Background

I work on an app that shows a floating UI over other apps (using the SAW permission, AKA "display over other apps").

It also handles touch events that can come from the sides of the screen.

The problem

It seems that on some devices, when gesture navigation is enabled, touches near the screen edges might be prioritized by the system navigation instead of reaching the app.

What I've tried

I want to change the UI so that, at the very least, it's larger and easier to touch, but only when the OS is currently configured for gesture navigation.

Sadly, all the gesture-related questions I've found assume you are already inside an Activity and offer ways to handle the gesture-exclusion areas. Nothing I've found is about simple detection, let alone detection without a View/Activity/Window.

The question

Given only the bare basic classes available outside of an Activity/View, how can I detect whether the device is using gesture navigation, including a callback for when it changes?
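
For reference, one approach that seems feasible is reading the undocumented `navigation_mode` value of `Settings.Secure`, which on Android 10+ reportedly holds 0 for 3-button, 1 for 2-button, and 2 for fully gestural navigation. It's a hidden key rather than a public API, so the following is just a minimal sketch under the assumption that the key exists and keeps those values:

```kotlin
import android.content.Context
import android.database.ContentObserver
import android.net.Uri
import android.os.Handler
import android.os.Looper
import android.provider.Settings

// "navigation_mode" is a hidden Settings.Secure key (not a public API);
// its presence and values are an assumption based on AOSP sources:
// 0 = 3-button, 1 = 2-button, 2 = fully gestural navigation.
private const val NAVIGATION_MODE_KEY = "navigation_mode"
private const val NAV_MODE_GESTURAL = 2

/** One-shot check; needs only a Context, no Activity/View/Window. */
fun isGestureNavigationEnabled(context: Context): Boolean =
    Settings.Secure.getInt(context.contentResolver, NAVIGATION_MODE_KEY, 0) == NAV_MODE_GESTURAL

/**
 * Change callback: observe the setting's Uri with a ContentObserver.
 * Returns the observer so the caller can later unregister it via
 * contentResolver.unregisterContentObserver(observer).
 */
fun observeNavigationMode(context: Context, onChanged: (Boolean) -> Unit): ContentObserver {
    val observer = object : ContentObserver(Handler(Looper.getMainLooper())) {
        override fun onChange(selfChange: Boolean, uri: Uri?) {
            onChanged(isGestureNavigationEnabled(context))
        }
    }
    context.contentResolver.registerContentObserver(
        Settings.Secure.getUriFor(NAVIGATION_MODE_KEY),
        /* notifyForDescendants = */ false,
        observer
    )
    return observer
}
```

An alternative one-shot check I've seen mentioned is the internal framework resource `config_navBarInteractionMode` (looked up via `Resources.getIdentifier`), but that offers no change callback, which is why the `ContentObserver` route looks more promising. Is either of these reliable across devices, or is there a proper way?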

android developer
  • Placing controls too close to the edge is just bad UX and a never-ending stream of problems. There are manufacturers' edge menus like Samsung's and LG's, and there is touch rejection on curved screens. No matter how many cases you handle, bugs will keep coming. – Agent_L Oct 19 '21 at 11:36
  • You can drag it wherever you wish (including to the other side). And this is not my design; I'm a developer who was given this project to work on. It's one of its main features, and it started way before edge menus and curved screens. I don't think it's going away anytime soon. – android developer Oct 20 '21 at 07:46
