
I need to inject touch events across application boundaries, without root. The reason is that I have an external touch screen that I'd like to use to "replace" the phone screen. Think of the touch screen as just a larger version of the phone screen. I am already casting the screen, and I have written a service to capture the touch events on the external screen. Now I'd like to send them to the phone, as if they had been generated on the phone itself. The solution cannot require a computer (the touch screen needs to attach via USB OTG). There is FRep, VNC (non-root), and AutoInput, all of which seem to handle at least some touch events across application boundaries without root. For other reasons, I have to use Jelly Bean or above, so UiAutomation might be my best solution.

I've seen this response (How to inject click event with Android UiAutomation.injectInputEvent) and it looks like it might be just what I need, but I can't get it to work. It just crashes on automation.injectInputEvent(motionDown, true);
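For reference, that approach looks roughly like the sketch below when written as an instrumentation test (the class name, coordinates, and timing values here are placeholders, not my actual code):

    import android.app.Instrumentation;
    import android.app.UiAutomation;
    import android.os.SystemClock;
    import android.test.InstrumentationTestCase;
    import android.view.InputDevice;
    import android.view.MotionEvent;

    // Placeholder test class; this only works when launched as an
    // instrumentation test (e.g. via adb/the SDK), not from inside an
    // ordinary app or service.
    public class TapInjectionTest extends InstrumentationTestCase {

        public void testInjectTap() {
            Instrumentation instrumentation = getInstrumentation();
            UiAutomation automation = instrumentation.getUiAutomation();

            // Placeholder coordinates; in my case these would come from
            // the external touch screen.
            float x = 540f;
            float y = 960f;

            long downTime = SystemClock.uptimeMillis();

            MotionEvent motionDown = MotionEvent.obtain(
                    downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0);
            motionDown.setSource(InputDevice.SOURCE_TOUCHSCREEN);
            automation.injectInputEvent(motionDown, true); // the call that crashes for me

            MotionEvent motionUp = MotionEvent.obtain(
                    downTime, SystemClock.uptimeMillis(), MotionEvent.ACTION_UP, x, y, 0);
            motionUp.setSource(InputDevice.SOURCE_TOUCHSCREEN);
            automation.injectInputEvent(motionUp, true);
        }
    }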


1 Answer


so UiAutomation might be my best solution

Only if your external touchscreen is actually a computer running Windows, OS X, or Linux and has the Android SDK installed.

There is FRep

Which requires a computer running Windows, OS X, or Linux, as it uses a bit of the Android SDK.

VNC (non-root)

Which requires a computer running Windows, OS X, or Linux, as it uses a bit of the Android SDK.

AutoInput

I think this is using the accessibility APIs, though I am not 100% certain.

I've seen this response

Which requires a computer running Windows, OS X, or Linux, as it is "using Robotium and Android JUnit", which in turn requires the Android SDK.

If you are looking to automate input as an ordinary Android app, your only option is to use the accessibility APIs.
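As a rough illustration of what that looks like, the sketch below is an accessibility service that finds a clickable node in the active window and asks it to perform a click. The service class name and the target text are made-up placeholders; the point is that the action is applied to a widget node, not to an arbitrary screen coordinate.

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityNodeInfo;

    import java.util.List;

    // Hypothetical service name; the target text "OK" is also just a placeholder.
    public class InputAutomationService extends AccessibilityService {

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            // "Click" a widget by finding its node, rather than by injecting
            // a raw touch at an x/y coordinate (which this API does not offer).
            AccessibilityNodeInfo root = getRootInActiveWindow();
            if (root == null) {
                return;
            }

            List<AccessibilityNodeInfo> targets =
                    root.findAccessibilityNodeInfosByText("OK");

            for (AccessibilityNodeInfo node : targets) {
                if (node.isClickable()) {
                    node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
                    break;
                }
            }
        }

        @Override
        public void onInterrupt() {
            // required override; nothing to do here
        }
    }

Such a service also has to be declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission and enabled by the user under Settings > Accessibility.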

CommonsWare
  • I'm learning here, so bear with me. The description for UiAutomation states "Class for interacting with the device's UI by simulation user actions... It also allows injecting of arbitrary raw input events simulating user interaction with keyboards and touch devices. One can think of a UiAutomation as a special type of AccessibilityService which does not provide hooks for the service life cycle and exposes other APIs that are useful for UI test automation." So how am I to know that I need "a computer running Win/OSX/Linux and has the Android SDK installed"? The API call (I think) I need is there. – David Anderson Aug 12 '15 at 20:37
  • @DavidAnderson: "So how am I to know that I need 'a computer running Win/OSX/Linux and has the Android SDK installed'?" -- from the JavaDocs? You wouldn't. Android's documentation is a bit hit-or-miss. "The API call (I think) I need is there" -- yes, except that an ordinary Android application cannot use it. – CommonsWare Aug 12 '15 at 21:04
  • OK. Been doing some research. It seems like I can capture touch events (I'm already capturing the external touch screen events), but I don't see how I can SEND them. What API is used to SEND an arbitrary touch event? What about SENDING a multi-touch event? – David Anderson Aug 12 '15 at 21:35
  • @DavidAnderson: "What API is used to SEND an arbitrary touch event?" -- as I wrote in my answer, your only option is to use the accessibility APIs. AFAIK, they do not support an arbitrary touch event. "What about SENDING a multi-touch event?" -- ditto. – CommonsWare Aug 12 '15 at 21:36