I'm developing an iOS application that shares its screen with another device over a socket connection. On the other device I handle the touch events and send the screen coordinates back to the iOS app, but I don't know how to inject them there. For example, if I press and hold my finger on the screen of the other device and send those coordinates, I want the iOS application to behave as if I were holding my finger on it directly. This doesn't need to work system-wide; it only needs to work inside my own application.
On Android I used `sendPointerSync` for this, but after searching other questions here I still can't tell whether the same thing is possible on iOS, and if it is, how to do it.
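To make the goal concrete, here is a rough sketch of what I imagine the receiving side would look like (the function and parameter names are my own invention, and the commented-out call is the part I don't know how to do, since `UITouch`/`UIEvent` have no public initializers):

```swift
import UIKit

// Hypothetical handler called when a coordinate pair arrives over the socket.
// I want the view under (x, y) to react as if it had really been touched.
func handleRemoteTouch(x: CGFloat, y: CGFloat, in window: UIWindow) {
    let point = CGPoint(x: x, y: y)
    // Hit-testing finds the view under the remote touch...
    if let target = window.hitTest(point, with: nil) {
        // ...but I can't construct a real UITouch/UIEvent to pass here:
        // target.touchesBegan(???, with: ???)
        print("Need to synthesize a touch on \(target) at \(point)")
    }
}
```

So essentially I'm asking what, if anything, can replace the commented-out part within a single app.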