
I am writing an accessibility service for Android which aims at providing different alternatives for people with physical disabilities to control a device, for instance, using switches, scanning, head tracking, and others.

Currently, to perform the actual actions on the application's interface we use the accessibility API, basically the AccessibilityNodeInfo.performAction() method (a minimal sketch of this approach appears after the list below). This works fine most of the time, but we found some important restrictions:

  • Most keyboards (IMEs) simply do not work. We only had success with the Google keyboard on Lollipop (API 22) and had to use AccessibilityService.getWindows(). For lower API versions we had to develop a special keyboard (undoubtedly not the optimal solution).
  • Most games are not accessible. Period. They do not export an AccessibilityNodeInfo tree.
  • Web navigation is not practical (no scrolling, among other issues).
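
For reference, this is roughly how we drive other applications today: a minimal sketch of an accessibility service that looks up a clickable node and performs ACTION_CLICK on it (the "OK" text lookup is only an illustrative placeholder, not our real selection logic):

 import android.accessibilityservice.AccessibilityService;
 import android.view.accessibility.AccessibilityEvent;
 import android.view.accessibility.AccessibilityNodeInfo;

 import java.util.List;

 public class ExampleService extends AccessibilityService {

     @Override
     public void onAccessibilityEvent(AccessibilityEvent event) {
         AccessibilityNodeInfo root = getRootInActiveWindow();
         if (root == null) return;

         // Illustrative lookup: find a clickable node by its visible text.
         List<AccessibilityNodeInfo> nodes =
                 root.findAccessibilityNodeInfosByText("OK");
         for (AccessibilityNodeInfo node : nodes) {
             if (node.isClickable()) {
                 node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
                 break;
             }
         }
     }

     @Override
     public void onInterrupt() { }
 }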

A solution would be to use a different API to perform the actions, and it seems that android.app.UiAutomation would fit the purpose. According to the documentation, "It also allows injecting of arbitrary raw input events simulating user interaction with keyboards and touch devices", which is what I am looking for; although I understand that UiAutomation is intended for testing purposes (and, perhaps, not ready for production quality code) and that, perhaps, it might not behave the same on different devices. I also understand that such an API might be a security hole if any application could use it. But it seems reasonable to allow an accessibility service to use UiAutomation, given that AccessibilityNodeInfo.performAction() provides similar "powers".

So, I tried the following inside my accessibility service:

 Instrumentation i = new Instrumentation();
 UiAutomation automation = i.getUiAutomation();

But getUiAutomation() always returns null.
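
For contrast, the same call does return an instance when made from an instrumentation test, presumably because there the Instrumentation object is created and started by the system instead of being constructed by hand (sketch in the question's style; InstrumentationRegistry is the test support library class):

 // Inside an instrumentation test the system-started Instrumentation is used:
 UiAutomation automation =
         InstrumentationRegistry.getInstrumentation().getUiAutomation();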

Is there any option to use UiAutomation (or similar API) inside an accessibility service?

BTW: rooting the device is not an option for us, so we cannot inject events through the screen driver.

Cesar Mauri
    "which aims at providing different alternatives for people with physical disabilities to control a device, for instance, using switches, scanning, head tracking, and others" -- this sounds like it should be a custom ROM, perhaps on custom hardware, rather than trying to do this at the app level. – CommonsWare Jun 19 '15 at 16:31
  • The point is to make regular hardware accessible to those with disabilities. Given that we have been able to attach mechanical switches (using the USB port) and implement head tracking, the remaining question is being able to perform the actions, which, to our understanding, is a software issue. – Cesar Mauri Jun 19 '15 at 16:48
  • Hopefully "the point" is to provide *reliable* "different alternatives for people with physical disabilities to control a device, for instance, using switches, scanning, head tracking, and others". As you have already discovered, Android devices vary. The more complex the interactions you want to support, the more those variances will affect the reliability of the solution. Worse, you would no longer control the updates, and so reliability can change overnight with no control. This even presumes that what you want is possible (AFAIK, it is not). – CommonsWare Jun 19 '15 at 16:53
  • AFAIK, UiAutomation is supposed to be a public API and thus updates will keep backwards compatibility. Am I wrong? Just being able to inject mouse events would solve many of our problems (of course, not all of them). Could you please explain where is the lack of reliability on this specific case? Just to clarify, the user will be able to see where he or she is pointing and then perform a "click" (or another action). – Cesar Mauri Jun 19 '15 at 17:23
  • "Am I wrong?" -- no, as far as the API goes. "Could you please explain where is the lack of reliability on this specific case?" -- you are assuming that `UiAutomation` behaves the same on all devices and for all apps. Beyond that, you are assuming that `UiAutomation` is designed for production use, as opposed to its stated use ("developing UI test automation tools and libraries"). AFAIK, this is not usable outside of instrumentation testing. – CommonsWare Jun 19 '15 at 17:50
  • OK. I edited my question to reflect that UiAutomation is intended for testing purposes and might not behave the same on different devices. Thanks. Anyway, the solution we are currently using has many limitations. Therefore, any improvement, though not perfect, that does not involve rooting the device or creating a custom ROM, is welcome. – Cesar Mauri Jun 19 '15 at 18:54
  • Did you find any solution? I am facing a similar requirement. – Zain Ali Sep 03 '15 at 12:49
  • Not yet, but a promising option would be to use the ADB interface to inject events. I tried 'Remote ADB Shell' and I managed to inject events from the very same device. You need to enable USB debugging, plug the device into the computer, enable ADB over TCP/IP (adb tcpip 5555), and then you can connect to localhost:5555 and inject the events. The AdbLib library allows an app to communicate over the adb interface. – Cesar Mauri Sep 03 '15 at 13:57
  • The procedure could be tedious for some users, but once configured you do not need to repeat it until the next reboot. AdbLib can be found here: https://github.com/cgutman/AdbLib (a rough sketch of this approach follows these comments). – Cesar Mauri Sep 03 '15 at 14:12
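
A rough sketch of the workaround outlined in the last two comments: opening an adb shell stream to the device's own adb daemon and injecting a tap via the input command. It assumes AdbLib's AdbBase64/AdbCrypto/AdbConnection/AdbStream classes as used in the library's sample project; the exact names and signatures are an assumption and may need adjusting against the library you ship.

 import com.cgutman.adblib.AdbBase64;
 import com.cgutman.adblib.AdbConnection;
 import com.cgutman.adblib.AdbCrypto;
 import com.cgutman.adblib.AdbStream;

 import java.net.Socket;

 public final class AdbTapInjector {

     // Injects a tap at (x, y) by sending "input tap" through an adb shell
     // stream. Requires USB debugging and a previous "adb tcpip 5555".
     public static void tap(int x, int y) throws Exception {
         AdbBase64 base64 = new AdbBase64() {
             @Override
             public String encodeToString(byte[] data) {
                 return android.util.Base64.encodeToString(
                         data, android.util.Base64.NO_WRAP);
             }
         };
         AdbCrypto crypto = AdbCrypto.generateAdbKeyPair(base64);

         Socket socket = new Socket("127.0.0.1", 5555);
         AdbConnection connection = AdbConnection.create(socket, crypto);
         connection.connect(); // the RSA key prompt must be accepted once

         AdbStream shell = connection.open("shell:");
         shell.write(("input tap " + x + " " + y + "\n").getBytes("UTF-8"));

         shell.close();
         connection.close();
     }
 }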

1 Answer


I am answering my own question.

Since Nougat (API 24), the way to programmatically execute actions on other applications is to use an accessibility service and the AccessibilityService#dispatchGesture method.
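
A minimal sketch of that approach (class and method names here are just illustrative; the service must additionally declare android:canPerformGestures="true" in its accessibility-service XML). It dispatches a short tap at the given screen coordinates:

 import android.accessibilityservice.AccessibilityService;
 import android.accessibilityservice.GestureDescription;
 import android.graphics.Path;
 import android.view.accessibility.AccessibilityEvent;

 public class GestureService extends AccessibilityService {

     // Dispatches a 50 ms tap gesture at (x, y) in screen coordinates.
     public void tap(float x, float y) {
         Path path = new Path();
         path.moveTo(x, y);

         GestureDescription gesture = new GestureDescription.Builder()
                 .addStroke(new GestureDescription.StrokeDescription(path, 0, 50))
                 .build();

         dispatchGesture(gesture, new GestureResultCallback() {
             @Override
             public void onCompleted(GestureDescription gestureDescription) {
                 // The gesture reached the screen.
             }

             @Override
             public void onCancelled(GestureDescription gestureDescription) {
                 // The gesture was interrupted, e.g. by a real touch.
             }
         }, null);
     }

     @Override
     public void onAccessibilityEvent(AccessibilityEvent event) { }

     @Override
     public void onInterrupt() { }
 }

Because the gesture is injected at the screen level, it does not depend on the target application exposing an AccessibilityNodeInfo tree, which covers the games and web-navigation cases described above.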

Cesar Mauri