
I have been working on a C++ command-line tool to record the screen. After some searching I came up with the following code, and the screen does appear to be recorded when I compile and run it. I am looking for the functions that let me provide a specific file path where the screen recording should be stored, and I would also like to append a timestamp to the filename. If anybody has a better approach or method to this problem, please suggest it here. Any leads are appreciated. Thanks

#include <ApplicationServices/ApplicationServices.h>
#include <iostream>   // std::cout
#include <unistd.h>   // sleep()

int main(int argc, const char * argv[]) {
    // Bounds of the main display.
    CGRect mainMonitor = CGDisplayBounds(CGMainDisplayID());
    CGFloat monitorHeight = CGRectGetHeight(mainMonitor);
    CGFloat monitorWidth = CGRectGetWidth(mainMonitor);

    // Restrict the stream to a 100x100 region of the display.
    const void *keys[1] = { kCGDisplayStreamSourceRect };
    const void *values[1] = { CGRectCreateDictionaryRepresentation(CGRectMake(0, 0, 100, 100)) };

    CFDictionaryRef properties = CFDictionaryCreate(NULL, keys, values, 1,
                                                    &kCFTypeDictionaryKeyCallBacks,
                                                    &kCFTypeDictionaryValueCallBacks);

    // '420f' is the four-character pixel-format code. The block is the frame
    // callback; it is currently empty, so incoming frames are discarded.
    CGDisplayStreamRef stream = CGDisplayStreamCreate(CGMainDisplayID(),
                                                      (size_t)monitorWidth,
                                                      (size_t)monitorHeight,
                                                      '420f',
                                                      properties,
                                                      ^(CGDisplayStreamFrameStatus status,
                                                        uint64_t displayTime,
                                                        IOSurfaceRef frameSurface,
                                                        CGDisplayStreamUpdateRef updateRef) {});

    // Single still capture of the display (currently unused).
    CGDirectDisplayID displayID = CGMainDisplayID();
    CGImageRef image_create = CGDisplayCreateImage(displayID);

    // The run-loop source must be added to a run loop for the frame callback to fire.
    CFRunLoopSourceRef runLoop = CGDisplayStreamGetRunLoopSource(stream);
    // CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoop, kCFRunLoopDefaultMode);

    CGError err = CGDisplayStreamStart(stream);
    if (err == kCGErrorSuccess) {
        std::cout << "WORKING" << std::endl;
        sleep(5);
    } else {
        std::cout << "Error: " << err << std::endl;
    }

    return 0;
}

1 Answer


You should do that in the callback you provide to CGDisplayStreamCreate. You can access the pixels via IOSurfaceGetBaseAddress (see the other IOSurface functions). If you don't want to do the pixel twiddling yourself, you could create a CVPixelBuffer with CVPixelBufferCreateWithBytes from the IOSurface, then create a CIImage with [CIImage imageWithCVImageBuffer:] and save that to a file as seen here.
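To make this concrete, below is a minimal sketch of a frame-saving helper, assuming the stream is created with the 'BGRA' pixel format (rather than the '420f' used in the question) so CoreGraphics can wrap the bytes directly. The helper name saveFrame and the /tmp output path are illustrative, not part of the original code.

#include <ApplicationServices/ApplicationServices.h>
#include <ImageIO/ImageIO.h>
#include <IOSurface/IOSurface.h>
#include <string>

// Sketch: write one captured frame to a timestamp-based JPEG file.
// Call this from the CGDisplayStreamCreate block when
// status == kCGDisplayStreamFrameStatusFrameComplete.
static void saveFrame(IOSurfaceRef frameSurface, uint64_t displayTime) {
    IOSurfaceLock(frameSurface, kIOSurfaceLockReadOnly, NULL);

    size_t width  = IOSurfaceGetWidth(frameSurface);
    size_t height = IOSurfaceGetHeight(frameSurface);
    size_t stride = IOSurfaceGetBytesPerRow(frameSurface);
    void  *pixels = IOSurfaceGetBaseAddress(frameSurface);

    // Wrap the raw BGRA bytes in a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, stride,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedFirst |
                                             kCGBitmapByteOrder32Little);
    CGImageRef image = CGBitmapContextCreateImage(ctx);

    // Timestamp-based filename; displayTime is in Mach absolute-time units,
    // so a wall-clock timestamp could be substituted here instead.
    std::string path = "/tmp/frame_" + std::to_string(displayTime) + ".jpg";
    CFURLRef url = CFURLCreateFromFileSystemRepresentation(
        NULL, (const UInt8 *)path.c_str(), (CFIndex)path.size(), false);

    CGImageDestinationRef dest =
        CGImageDestinationCreateWithURL(url, CFSTR("public.jpeg"), 1, NULL);
    CGImageDestinationAddImage(dest, image, NULL);
    CGImageDestinationFinalize(dest);

    CFRelease(dest);
    CFRelease(url);
    CGImageRelease(image);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    IOSurfaceUnlock(frameSurface, kIOSurfaceLockReadOnly, NULL);
}

Inside the question's empty block you would then call saveFrame(frameSurface, displayTime) for each completed frame; for a continuous video recording you would instead feed the frames to an encoder, as discussed in the comments below.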

Rudolfs Bundulis
  • Could you please elaborate on how I can provide the location in the callback in CGDisplayStreamCreate. Also, in `CGImageDestinationRef dest = CGImageDestinationCreateWithData(data, CFSTR("public.jpeg"), 1, NULL);` a static string is required as the destination, which means I can't store it under a timestamp-based filename. Thanks – Nikhil Nilawar Apr 13 '18 at 07:00
  • @NikhilNilawar I don't quite understand the issue. The callback has a parameter `displayTime` which holds the timestamp, so you can just create a formatted string with `[NSString stringWithFormat:]` containing the timestamp and everything else you need, and pass that string to `CGImageDestinationCreateWithData` (see the sketch after these comments). – Rudolfs Bundulis Apr 13 '18 at 07:10
  • I now understand how to create a timestamp-based filename and location for storing a screen capture using CGImageDestinationCreateWithData. My original question is how to do the same for a stream, not just a single screen capture. – Nikhil Nilawar Apr 13 '18 at 08:52
  • @NikhilNilawar but you do not have a stream. If you want a video stream, you have to encode the raw frames. – Rudolfs Bundulis Apr 13 '18 at 09:12
  • I am a beginner in software development. I thought CGDisplayStreamStart(stream) starts the stream. Can you suggest an alternative approach to record the screen and save it to a desired location? Any links to the same would be very useful. Thanks. – Nikhil Nilawar Apr 13 '18 at 09:53
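For the timestamp-based filename discussed in the comments, a small C++ sketch could look like the following; the directory, prefix, and the helper name timestampedPath are illustrative assumptions.

#include <string>
#include <ctime>

// Sketch: build a wall-clock, timestamp-based path such as
// /Users/me/Captures/capture_20180413_070000.jpg (directory and prefix are examples).
static std::string timestampedPath(const std::string &directory,
                                   const std::string &prefix) {
    char buf[32];
    std::time_t now = std::time(nullptr);
    std::strftime(buf, sizeof(buf), "%Y%m%d_%H%M%S", std::localtime(&now));
    return directory + "/" + prefix + "_" + buf + ".jpg";
}

The resulting string can be turned into a CFURLRef with CFURLCreateFromFileSystemRepresentation and passed to CGImageDestinationCreateWithURL (writing to a URL rather than to a CFData), as in the frame-saving sketch above.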