
I have a really basic little command line app that grabs the mouse coordinates the next time the mouse is clicked.

#import <Foundation/Foundation.h>
#import <AppKit/AppKit.h>

CGEventRef myCGEventCallback(CGEventTapProxy proxy, CGEventType type, CGEventRef event, void *refcon) {
    CGFloat displayScale = 1.0f;
    if ([[NSScreen mainScreen] respondsToSelector:@selector(backingScaleFactor)])
    {
        displayScale = [NSScreen mainScreen].backingScaleFactor;
    }

    CGPoint loc = CGEventGetLocation(event);
    printf("%dx%d\n", (int)roundf(loc.x * displayScale), (int)roundf(loc.y * displayScale));
    exit(0);
    // Not reached; note the callback does not own `event`, so it must not CFRelease it
    return event;
}

int main(int argc, const char * argv[]) {
    @autoreleasepool {

        CFMachPortRef eventTap;
        CGEventMask eventMask;
        CFRunLoopSourceRef runLoopSource;
        eventMask = 1 << kCGEventLeftMouseDown;
        eventTap = CGEventTapCreate(kCGSessionEventTap, kCGHeadInsertEventTap,
                                    kCGEventTapOptionListenOnly, eventMask,
                                    myCGEventCallback, NULL);
        if (!eventTap) {
            fprintf(stderr, "failed to create event tap\n");
            return 1;
        }

        runLoopSource = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, eventTap, 0);
        CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource,
                           kCFRunLoopCommonModes);
        CGEventTapEnable(eventTap, true);
        CFRunLoopRun();
    }
    return 0;
}

I'm building it with cmake with the following file:

cmake_minimum_required(VERSION 3.0.0)


project (location)

set(CMAKE_C_FLAGS "-arch x86_64 -mmacosx-version-min=10.12 -std=gnu11 -fobjc-arc -fmodules")
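
(For completeness: the file above defines no target, so a build would also need something like the following. This is a sketch; `main.m` is a placeholder for the actual source file name, which isn't given in the question, and the framework list is my assumption based on the APIs used.)

```cmake
# Hypothetical target definition; substitute your actual .m file name
add_executable(location main.m)

# The code uses AppKit (NSScreen) and the CGEventTap APIs
target_link_libraries(location "-framework AppKit" "-framework ApplicationServices")
```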

This all worked fine until the upgrade to Mojave.

A bit of poking around shows this is down to the latest set of security changes. I've found some hints about setting values in Info.plist to allow the app to use the accessibility API (though in my case CGEventTapCreate() is not returning null), but I'm struggling to work out where to put them, as I just have a single .m file with the code.

Edit

  • This needs to run as a non-root user (company policy)
  • If making it ask for permission is the only way, then it can be extended to be a "GUI" app with a minimal UI

This app is just to grab the upper left-hand corner of a region of the screen to feed to a second app that streams that area of the screen to a second device. The code for the streamer is common across Win/Linux/macOS, so I'm trying to keep the screen-coordinate collection totally separate.

hardillb
  • Are the command line tool and/or the app which runs the tool added to the privacy settings of system prefs? – Willeke Jan 04 '19 at 12:07
  • I have tried manually adding it to the list and it still didn't work. – hardillb Jan 04 '19 at 12:11
  • I tried your code (build in an Xcode project) and it worked after adding Terminal and the tool to the accessibility privacy settings. – Willeke Jan 04 '19 at 13:28
  • Hmm, OK, but that's not a great solution. I want the app to deal with this itself, even if it has to ask for permission on first run. Also not liking having to add the terminal to the list – hardillb Jan 04 '19 at 13:33
  • I'm assuming this needs to work globally, and not just while your app has focus. – Brad Allred Jan 04 '19 at 23:48
  • It's a one shot deal, it just grabs the next mouse click location and exits – hardillb Jan 05 '19 at 09:53

2 Answers


As you surmise, event taps won't work on Mojave without having accessibility access. From the documentation:

Event taps receive key up and key down events if one of the following conditions is true: The current process is running as the root user. Access for assistive devices is enabled. In OS X v10.4, you can enable this feature using System Preferences, Universal Access panel, Keyboard view.

A GUI app will prompt the user to enable accessibility the first time it's needed, but it looks like a CLI app doesn't do that (which makes sense).

There is no way to enable this programmatically or through a script; the user must do it themselves.
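
(A sketch, not from the original answer: a process can at least check its trust status, and ask the system to show the permission prompt, with AXIsProcessTrustedWithOptions(). Whether the prompt actually appears for an unbundled CLI tool is not guaranteed.)

```objc
// Sketch: check accessibility trust and request the system prompt if
// access has not yet been granted. May not prompt for a plain CLI tool.
#import <ApplicationServices/ApplicationServices.h>
#import <Foundation/Foundation.h>

static bool ensureAccessibilityAccess(void) {
    // kAXTrustedCheckOptionPrompt asks the system to display the
    // "wants to control this computer" dialog when not yet trusted
    NSDictionary *options =
        @{ (__bridge NSString *)kAXTrustedCheckOptionPrompt : @YES };
    return AXIsProcessTrustedWithOptions((__bridge CFDictionaryRef)options);
}
```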

Running your tool as root should work - can you enforce that?

Otherwise, you can direct the user to the correct place in System Preferences:

tell application "System Preferences"
    reveal anchor "Privacy_Accessibility" of pane id "com.apple.preference.security"
    activate
end tell

It may be possible using Carbon, if your app isn't sandboxed.

Finally, a quick test shows this is at least possible using IOHID. I shamelessly borrowed the KeyboardWatcher class from this answer. Then, modified the device type:

[self watchDevicesOfType:kHIDUsage_GD_Keyboard];

into:

[self watchDevicesOfType:kHIDUsage_GD_Mouse];

Finally, my callback looks like this:

static void Handle_DeviceEventCallback (void *inContext, IOReturn inResult, void *inSender, IOHIDValueRef value)
{
    IOHIDElementRef element = IOHIDValueGetElement(value);
    IOHIDElementType elemType = IOHIDElementGetType(element);

    if (elemType == kIOHIDElementTypeInput_Button)
    {
        int elementValue = (int) IOHIDValueGetIntegerValue(value);
        // 1 == down, 0 == up
        if (elementValue == 1)
        {
            // Query the current cursor position via a throwaway event
            CGEventRef ourEvent = CGEventCreate(NULL);
            CGPoint point = CGEventGetLocation(ourEvent);
            CFRelease(ourEvent); // CGEventCreate follows the Create rule, so release it
            printf("Mouse Position: x = %.2f, y = %.2f\n", (float) point.x, (float) point.y);
        }
    }
}

That is really a quick hack job, but it demonstrates this is possible and hopefully you can refine it to your needs.

TheNextman
  • Additional comment to add: if you did get this working with `NSEvent` and the "allow accessibility access" popup; I for one would find it really strange for a CLI tool to request control of my machine in such a way (unless it runs as root, in which case it doesn't need to ask!) – TheNextman Jan 05 '19 at 05:02
  • You mean "cannot" run as root, right? From your edit: convert to a Cocoa.app (`NSApplication`) and the user should get prompted. Note that accepting the prompt does not grant accessibility access - it simply directs the user to the proper location in System Preferences. The user has to manually add the app to the list. Once you have an `NSApplication`, you can instead try the `addGlobalMonitorForEventsMatchingMask` that @BradAllred proposed, otherwise consider trying the IOKit approach above. – TheNextman Jan 05 '19 at 20:30
  • Yeah, I meant can not, but if it just redirects to the place in the settings then I'm hosed as most of the users don't have admin access to their own machines. – hardillb Jan 05 '19 at 20:52

I've found the CGEventTap documentation is out of date beginning with Mojave. Running as root used to act as a bypass for certain entitlements, but in Mojave this was tightened down. One bizarre side effect, as you noticed, is that root can still acquire the mach port for the tap; it's just that no events can be read from it. If you try your application without running as root, you should get the expected popup asking for permission.

If you do not get the popup, or need to run as root for other purposes, you can manually add your application to the trusted TCC database via System Preferences -> Security & Privacy -> Privacy -> Accessibility.

settings some values in Info.plist to allow the app to use the accessibility API

I believe you mean adding entitlements (which are also a plist). The entitlement that allows an application to use the Accessibility API is the com.apple.private.tcc.allow entitlement (with a value of kTCCServiceAccessibility). As you can probably guess from the name it is only allowed on Apple signed binaries.

You can add these entitlements to your own app if you disable System Integrity Protection (SIP) and boot the kernel with the option amfi_get_out_of_my_way=1, but I wouldn't recommend it (and certainly any customers of yours wouldn't want to). With just SIP disabled you could manually add an entry to the TCC database to grant privileges, but I still wouldn't recommend it.

Possible Alternative

You can use an event monitor:

NSEventMask mask = (NSLeftMouseDownMask | NSRightMouseDownMask | NSOtherMouseDownMask);
mouseEventMonitor = [NSEvent addGlobalMonitorForEventsMatchingMask: mask
                     handler:^(NSEvent *event){
                         // get the current coordinates with this
                         NSPoint coords = [NSEvent mouseLocation];
                         // event coordinates would be event.absoluteX and event.absoluteY
                         // ... do stuff with coords
                     }];

The documentation does mention:

Key-related events may only be monitored if accessibility is enabled or if your application is trusted for accessibility access (see AXIsProcessTrusted).

But I don't think that applies to mouse events.
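
(A sketch, not from the original answer: as noted in the comments, the global monitor only fires with a running NSApplication event loop, so a minimal one-shot bootstrap might look like the following. The assumption that a bare, unbundled binary with no NIB is sufficient is mine and untested.)

```objc
// Minimal NSApplication wrapper around the global event monitor:
// prints the click location once, then exits.
#import <AppKit/AppKit.h>

int main(int argc, const char *argv[]) {
    @autoreleasepool {
        [NSApplication sharedApplication]; // creates NSApp
        NSEventMask mask = NSEventMaskLeftMouseDown;
        [NSEvent addGlobalMonitorForEventsMatchingMask:mask
                                               handler:^(NSEvent *event) {
            NSPoint p = [NSEvent mouseLocation];
            printf("%dx%d\n", (int)p.x, (int)p.y);
            exit(0);
        }];
        [NSApp run]; // blocks until exit(0) above
    }
    return 0;
}
```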

Brad Allred
  • I think you need a running `NSApplication` for `addGlobalMonitorForEventsMatchingMask...` to work. He could package his code up into a Cocoa .app and it should work. Note that this is probably the same reason he's not getting the popup (and yes, Apple really needs to update the docs for this stuff....) – TheNextman Jan 05 '19 at 04:59